Snowplow ETL EMR issue

Hi,

Snowplow ETL EMR issue: the ETL EMR Runner job failed.

Please find below the error log from the EMR shredding step; this error is causing the shredding step to fail. The root cause in the trace appears to be `com.esotericsoftware.kryo.KryoException: Encountered unregistered class ID: 14257`, raised while Kryo deserializes intermediate map output.
```
{"type":"MAP_ATTEMPT_FAILED","event":{"org.apache.hadoop.mapreduce.jobhistory.TaskAttemptUnsuccessfulCompletion":{"taskid":"task_1566974514326_0009_m_000121","taskType":"MAP","attemptId":"attempt_1566974514326_0009_m_000121_3","finishTime":1566977103147,"hostname":"ip-10-1-4-102.ec2.internal","port":8041,"rackname":"/default-rack","status":"FAILED","error":"
Error: cascading.flow.FlowException: internal error during mapper execution
    at cascading.flow.hadoop.FlowMapper.run(FlowMapper.java:148)
    at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:455)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:344)
    at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:172)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
    at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:166)
Caused by: cascading.flow.stream.DuctException: failure resolving tuple entry
    at cascading.flow.stream.TrapHandler.handleException(TrapHandler.java:198)
    at cascading.flow.stream.TrapHandler.handleException(TrapHandler.java:167)
    at cascading.flow.stream.ElementStage.handleException(ElementStage.java:145)
    at cascading.flow.stream.SourceStage.map(SourceStage.java:93)
    at cascading.flow.stream.SourceStage.run(SourceStage.java:58)
    at cascading.flow.hadoop.FlowMapper.run(FlowMapper.java:130)
    … 7 more
Caused by: cascading.tuple.TupleException: unable to read from input identifier: hdfs://ip-10-1-4-47.ec2.internal:8020/mnt/var/lib/hadoop/tmp/7985433821_com_twitter_scalding_Mult_F99FA497954A490FAC82C31ADED1402A/part-00057
    at cascading.tuple.TupleEntrySchemeIterator.hasNext(TupleEntrySchemeIterator.java:152)
    at cascading.flow.stream.SourceStage.map(SourceStage.java:76)
    … 9 more
Caused by: com.esotericsoftware.kryo.KryoException: Encountered unregistered class ID: 14257
Serialization trace:
_children (com.fasterxml.jackson.databind.node.ObjectNode)
_children (com.fasterxml.jackson.databind.node.ObjectNode)
a (scalaz.Success)
    at com.esotericsoftware.kryo.util.DefaultClassResolver.readClass(DefaultClassResolver.java:119)
    at com.esotericsoftware.kryo.Kryo.readClass(Kryo.java:610)
    at com.esotericsoftware.kryo.Kryo.readClassAndObject(Kryo.java:721)
    at com.esotericsoftware.kryo.serializers.MapSerializer.read(MapSerializer.java:134)
    at com.esotericsoftware.kryo.serializers.MapSerializer.read(MapSerializer.java:17)
    at com.esotericsoftware.kryo.Kryo.readObject(Kryo.java:651)
    at com.esotericsoftware.kryo.serializers.FieldSerializer$ObjectField.read(FieldSerializer.java:605)
    at com.esotericsoftware.kryo.serializers.FieldSerializer.read(FieldSerializer.java:221)
    at com.esotericsoftware.kryo.Kryo.readClassAndObject(Kryo.java:732)
    at com.esotericsoftware.kryo.serializers.MapSerializer.read(MapSerializer.java:134)
    at com.esotericsoftware.kryo.serializers.MapSerializer.read(MapSerializer.java:17)
    at com.esotericsoftware.kryo.Kryo.readObject(Kryo.java:651)
    at com.esotericsoftware.kryo.serializers.FieldSerializer$ObjectField.read(FieldSerializer.java:605)
    at com.esotericsoftware.kryo.serializers.FieldSerializer.read(FieldSerializer.java:221)
    at com.esotericsoftware.kryo.Kryo.readClassAndObject(Kryo.java:732)
    at com.twitter.chill.Tuple2Serializer.read(TupleSerializers.scala:42)
    at com.twitter.chill.Tuple2Serializer.read(TupleSerializers.scala:33)
    at com.esotericsoftware.kryo.Kryo.readClassAndObject(Kryo.java:732)
    at com.twitter.chill.TraversableSerializer.read(Traversable.scala:43)
    at com.twitter.chill.TraversableSerializer.read(Traversable.scala:21)
    at com.esotericsoftware.kryo.Kryo.readClassAndObject(Kryo.java:732)
    at com.twitter.chill.Tuple3Serializer.read(TupleSerializers.scala:56)
    at com.twitter.chill.Tuple3Serializer.read(TupleSerializers.scala:45)
    at com.esotericsoftware.kryo.Kryo.readObject(Kryo.java:651)
    at com.esotericsoftware.kryo.serializers.FieldSerializer$ObjectField.read(FieldSerializer.java:605)
    at com.esotericsoftware.kryo.serializers.FieldSerializer.read(FieldSerializer.java:221)
    at com.esotericsoftware.kryo.Kryo.readObject(Kryo.java:629)
    at com.twitter.chill.SerDeState.readObject(SerDeState.java:58)
    at com.twitter.chill.KryoPool.fromBytes(KryoPool.java:105)
    at com.twitter.chill.hadoop.KryoDeserializer.deserialize(KryoDeserializer.java:51)
    at cascading.tuple.hadoop.TupleSerialization$SerializationElementReader.read(TupleSerialization.java:628)
    at cascading.tuple.hadoop.io.HadoopTupleInputStream.readType(HadoopTupleInputStream.java:105)
    at cascading.tuple.hadoop.io.HadoopTupleInputStream.getNextElement(HadoopTupleInputStream.java:52)
    at cascading.tuple.io.TupleInputStream.readTuple(TupleInputStream.java:78)
    at cascading.tuple.hadoop.io.TupleDeserializer.deserialize(TupleDeserializer.java:40)
    at cascading.tuple.hadoop.io.TupleDeserializer.deserialize(TupleDeserializer.java:28)
    at org.apache.hadoop.io.SequenceFile$Reader.deserializeValue(SequenceFile.java:2332)
    at org.apache.hadoop.io.SequenceFile$Reader.getCurrentValue(SequenceFile.java:2305)
    at org.apache.hadoop.mapred.SequenceFileRecordReader.getCurrentValue(SequenceFileRecordReader.java:109)
    at org.apache.hadoop.mapred.SequenceFileRecordReader.next(SequenceFileRecordReader.java:84)
    at org.apache.hadoop.mapred.MapTask$TrackedRecordReader.moveToNext(MapTask.java:200)
    at org.apache.hadoop.mapred.MapTask$TrackedRecordReader.next(MapTask.java:186)
    at cascading.tap.hadoop.util.MeasuredRecordReader.next(MeasuredRecordReader.java:61)
    at cascading.scheme.hadoop.SequenceFile.source(SequenceFile.java:93)
    at cascading.tuple.TupleEntrySchemeIterator.getNext(TupleEntrySchemeIterator.java:166)
    at cascading.tuple.TupleEntrySchemeIterator.hasNext(TupleEntrySchemeIterator.java:139)
    … 10 more
","counters":{"org.apache.hadoop.mapreduce.jobhistory.JhCounters":{"name":"COUNTERS","groups":[{"name":"org.apache.hadoop.mapreduce.FileSystemCounter","displayName":"File System Counters","counts":[{"name":"FILE_BYTES_READ","displayName":"FILE: Number of bytes .
```

Can anyone please help me with this? Thanks in advance.