When comparing Snowplow numbers against GA, we recommend starting by looking at page views by page URL. There is very little business logic involved in recording a page view, so these numbers should typically agree very closely, with Snowplow reporting higher numbers because we don’t remove bots. (Note that GA removes many more bots than are identified by the user agent parsing libraries available with Snowplow. The size of the discrepancy therefore reflects how much bot traffic your site attracts - we have seen it vary between 3% and 15%, e.g. for jobs boards and other sites that attract crawlers.)
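As a rough sketch of that first step - assuming your Snowplow events land in the standard atomic.events table in Redshift, and with an illustrative date range - page views by URL path can be pulled with something like:

```sql
-- Page views by URL path (date range is illustrative).
-- Compare against GA's pageviews-by-page report for the same period.
SELECT
  page_urlpath,
  COUNT(*) AS page_views
FROM atomic.events
WHERE event = 'page_view'
  AND collector_tstamp >= '2016-01-01'
  AND collector_tstamp <  '2016-02-01'
GROUP BY page_urlpath
ORDER BY page_views DESC;
```

Note that this makes no attempt to exclude bots, so expect it to come in above GA for the reasons described above.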
Often carrying out the above step surfaces differences in tracking implementation between GA and Snowplow (e.g. pages that are tracked by one but missed by the other).
If those two numbers agree, then explore the difference in unique visitors by page. Both Google and Snowplow primarily base this number on a first party cookie, so again our expectation is that they should be pretty close, with Snowplow reporting higher numbers because of bots.
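Again only as a sketch, under the same assumptions as above: uniques by page in Snowplow are just a distinct count of the first party cookie ID (domain_userid) per URL path:

```sql
-- Unique visitors (distinct first party cookie IDs) by URL path.
SELECT
  page_urlpath,
  COUNT(DISTINCT domain_userid) AS unique_visitors
FROM atomic.events
WHERE event = 'page_view'
  AND collector_tstamp >= '2016-01-01'
  AND collector_tstamp <  '2016-02-01'
GROUP BY page_urlpath
ORDER BY unique_visitors DESC;
```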
In general we recommend avoiding comparing session numbers. The GA sessionization logic is very specific (and advertiser-friendly), whereas the sessionization that the Snowplow JS tracker supports out of the box follows the simpler Adobe-style model: a session ends after 30 minutes of inactivity. If you must compare session numbers, we have a guide (incl. SQL):
http://discourse.snowplow.io/t/reconciling-snowplow-and-google-analytics-session-numbers/80
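For reference, and again only as a sketch on the same assumptions: out-of-the-box Snowplow sessions can be counted by combining the cookie ID with the session index that the JS tracker increments after 30 minutes of inactivity - but see the guide above before putting this next to a GA sessions number.

```sql
-- Count of Snowplow sessions: one per (cookie ID, session index) pair.
SELECT
  COUNT(DISTINCT domain_userid || '-' || domain_sessionidx::varchar) AS sessions
FROM atomic.events
WHERE collector_tstamp >= '2016-01-01'
  AND collector_tstamp <  '2016-02-01';
```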
So to summarise: my guess is that an implementation difference accounts for the very large discrepancy you’ve seen, unless you’re a site that attracts a lot of bots (in which case, how do you filter these out when comparing?). I take it that your Snowplow numbers are higher, which makes me wonder if your GA coverage isn’t 100%? This should become clear once you start slicing the numbers by URL. It would also be worth understanding whether you’ve done anything on the GA side to customise how uniques are identified (e.g. passing in your own user identifiers).