Bigquery-loader-cli --Exception in thread "main" java.util.NoSuchElementException

I am following the documentation and successfully created a table, but I get an error when trying to upload data into the same table.

Below is the error I am getting:

WARNING: Application name is not set. Call Builder#setApplicationName.
Exception in thread "main" java.util.NoSuchElementException: None.get
	at scala.None$.get(Option.scala:313)
	at scala.None$.get(Option.scala:311)
	at com.snowplowanalytics.snowplow.bigquery.loader.BigqueryLoaderCli$.main(BigqueryLoaderCli.scala:73)
	at com.snowplowanalytics.snowplow.bigquery.loader.BigqueryLoaderCli.main(BigqueryLoaderCli.scala)

I have tried multiple commands to make it run, and all of them give the same error as above:

  1. java -jar bigquery-loader-cli-0.1.0 snowplow-project-id atomic events ./run=2016-08-16-09-22-28-./part-00000
  2. java -jar bigquery-loader-cli-0.1.0 snowplow-project-id atomic events ./run=2016-08-16-09-22-28-./

Is there anything I missed here? Can anyone help me solve this error?

That BigQuery loader is not actively supported by Snowplow, but based on that error, you’re missing the project ID.

I have followed this format

java -jar bigquery-loader-cli-0.1.0 <projectId> <datasetId> <tableId> <dataLocation>

Here is what I have passed:

java -jar bigquery-loader-cli-0.1.0 snowplow-project-id atomic events ./run=2016-08-16-09-22-28-./part-00000

So the project ID is already there. Is there anything else I missed?

Well, I removed the old dataset and created another one. This time I got a different error:

Dataset created
Table created
Exception in thread "main" java.lang.Error: There seems to have been a parsing error
at com.snowplowanalytics.snowplow.bigquery.loader.TsvParser$.getValues(TsvParser.scala:63)
at com.snowplowanalytics.snowplow.bigquery.loader.TsvParser$$anonfun$addFieldsToData$1.apply(TsvParser.scala:51)
at com.snowplowanalytics.snowplow.bigquery.loader.TsvParser$$anonfun$addFieldsToData$1.apply(TsvParser.scala:50)
at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
at scala.collection.immutable.List.foreach(List.scala:318)
at scala.collection.TraversableLike$
at com.snowplowanalytics.snowplow.bigquery.loader.TsvParser$.addFieldsToData(TsvParser.scala:50)
at com.snowplowanalytics.snowplow.bigquery.loader.BigqueryLoaderCli$.com$snowplowanalytics$snowplow$bigquery$loader$BigqueryLoaderCli$$sendBatch$1(BigqueryLoaderCli.scala:135)
at com.snowplowanalytics.snowplow.bigquery.loader.BigqueryLoaderCli$$anonfun$uploadData$1.apply(BigqueryLoaderCli.scala:152)
at com.snowplowanalytics.snowplow.bigquery.loader.BigqueryLoaderCli$$anonfun$uploadData$1.apply(BigqueryLoaderCli.scala:152)
at scala.collection.IndexedSeqOptimized$class.foreach(IndexedSeqOptimized.scala:33)
at scala.collection.mutable.ArrayOps$ofRef.foreach(ArrayOps.scala:105)
at com.snowplowanalytics.snowplow.bigquery.loader.BigqueryLoaderCli$.uploadData(BigqueryLoaderCli.scala:152)
at com.snowplowanalytics.snowplow.bigquery.loader.BigqueryLoaderCli$.main(BigqueryLoaderCli.scala:75)
at com.snowplowanalytics.snowplow.bigquery.loader.BigqueryLoaderCli.main(BigqueryLoaderCli.scala)

When I googled it, it seems the error is thrown from this code in TsvParser.scala:

// TODO: switch from throwing error to using scalaz Validation, maybe.
def getValues(line: String): List[String] = {
  val values = line.split("\t", -1).toList
  if (values.length != 108) {
    throw new Error("There seems to have been a parsing error")
  }
  values
}
@alex can you help me understand which values have the parsing error?

Hi @mongodb - my suspicion is that the BigQuery Loader CLI works against an older version of the Snowplow enriched event format (one with a different number of tabs).

Unfortunately this project was an R&D spike which we are not actively supporting (though we learned a lot from it for future BigQuery support). I’ll update the GitHub tagline for the repo to make this clear.

For what it’s worth, here are the changes I had to make to support the latest enriched event format (as Alex mentions):

But I quickly gave up on it because of other issues (e.g. unstructured events, enrichment contexts, etc.) that would mean continuously updating the format.