Write protocol should be Apache Arrow #49
base: main
Conversation
@@ -0,0 +1,1171 @@
package tech.ytsaurus.client;
Will this class be moved to ytsaurus-client? Otherwise it should be moved to data-source/src/main/java
It will be moved, yes.
import tech.ytsaurus.core.tables.ColumnValueType
import tech.ytsaurus.spyt.serializers.SchemaConverter.MetadataFields
import tech.ytsaurus.spyt.serializers.YsonRowConverter.{isNull, serializeValue}
import tech.ytsaurus.spyt.serializers.YtLogicalType.Binary.tiType
Having so many separate imports of tiType can be confusing. I think we can import tech.ytsaurus.spyt.serializers.YtLogicalType._ and use tiType prefixed with its type.
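For illustration, a minimal sketch of the suggested style; the Binary object and its tiType member come from the diff above, while the wrapper object and usage are assumed:

```scala
import tech.ytsaurus.spyt.serializers.YtLogicalType._

object TiTypeImportSketch {
  // With the wildcard import, tiType is qualified by the logical type it
  // belongs to instead of being imported separately for each type.
  val binaryTi = Binary.tiType
}
```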
import scala.concurrent.duration.Duration
import scala.concurrent.{ExecutionContext, Future}

class InternalRowSerializer(schema: StructType, writeSchemaConverter: WriteSchemaConverter) extends WireRowSerializer[InternalRow] with LogLazy { |
I think we should preserve this old implementation and add a configuration flag that lets us switch between it and the new write implementation, to preserve backward compatibility. Moreover, I think in the next release the old implementation should be used by default, and the Arrow implementation should be considered experimental and turned on explicitly.
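Roughly how such a switch could look; the arrow_write_enabled setting is referenced in the commits below, but the factory method, the Arrow serializer class name, and the package paths in this sketch are assumptions:

```scala
import org.apache.spark.sql.catalyst.InternalRow
import org.apache.spark.sql.types.StructType
import tech.ytsaurus.client.rows.WireRowSerializer // path assumed
import tech.ytsaurus.spyt.serializers.{ArrowInternalRowSerializer, InternalRowSerializer, WriteSchemaConverter} // ArrowInternalRowSerializer is an assumed name

object WriteProtocolSelector {
  // Hypothetical factory: the old wire serializer stays the default and the
  // Arrow-based one is used only when the flag is enabled explicitly.
  def rowSerializer(schema: StructType,
                    writeSchemaConverter: WriteSchemaConverter,
                    arrowWriteEnabled: Boolean): WireRowSerializer[InternalRow] =
    if (arrowWriteEnabled) {
      new ArrowInternalRowSerializer(schema, writeSchemaConverter) // assumed class name
    } else {
      new InternalRowSerializer(schema, writeSchemaConverter) // existing implementation
    }
}
```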
import tech.ytsaurus.spyt.types.YTsaurusTypes.instance.sparkTypeFor

case object Null extends AtomicYtLogicalType("null", 0x02, ColumnValueType.NULL, TiType.nullType(), NullType)
case object Null extends AtomicYtLogicalType("null", 0x02, ColumnValueType.NULL, TiType.nullType(), NullType) {
override def ytGettersFromList(ytGetter: InternalRowYTGetters): ytGetter.FromList = new ytGetter.FromListToNull { |
Can we have abstract implementations of YtGetters so we don't need to implement them in every case object? Consider adding extra parameters to AtomicYtLogicalType.
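One possible shape of that suggestion, sketched against simplified stand-ins rather than the real classes; the extra constructor parameter and the YtGetters members shown here are assumptions:

```scala
// Simplified stand-ins just to illustrate the idea; the real classes have
// more parameters and members than shown here.
trait InternalRowYTGetters {
  abstract class FromList
  class FromListToNull extends FromList // emits a YT null for every row
}

// Assumed extra constructor parameter: a getter factory handled once in the
// base class, so each atomic case object no longer overrides ytGettersFromList.
abstract class AtomicYtLogicalType(
    name: String,
    wireId: Int,
    getterFactory: InternalRowYTGetters => InternalRowYTGetters#FromList) {

  // The real method returns the path-dependent ytGetter.FromList type;
  // the type projection here just keeps the sketch short.
  def ytGettersFromList(ytGetter: InternalRowYTGetters): InternalRowYTGetters#FromList =
    getterFactory(ytGetter)
}

case object Null extends AtomicYtLogicalType("null", 0x02, g => new g.FromListToNull)
```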
@alextokarew has imported your pull request. If you are a Yandex employee, you can view this diff.
clean up imports
arrow_write_enabled config
Please add some tests so we can ensure that Arrow writing works correctly.
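For example, a round-trip test along these lines could cover the basic case; the config key, the test table path, and running it as a plain ScalaTest spec (rather than through this repository's existing test traits) are all assumptions:

```scala
import org.apache.spark.sql.SparkSession
import org.scalatest.flatspec.AnyFlatSpec
import org.scalatest.matchers.should.Matchers
import tech.ytsaurus.spyt._

class ArrowWriteTest extends AnyFlatSpec with Matchers {
  "Arrow write" should "round-trip a simple dataframe" in {
    // The config key and the local master are assumptions; in practice the
    // repository's existing local-YT test setup should be reused.
    val spark = SparkSession.builder()
      .master("local[2]")
      .config("spark.yt.write.arrowWriteEnabled", "true") // assumed key name
      .getOrCreate()
    import spark.implicits._

    val tmpPath = "//tmp/arrow-write-test" // assumed test table path
    val original = Seq((1L, "a"), (2L, "b")).toDF("id", "value")

    original.write.mode("overwrite").yt(tmpPath)
    val readBack = spark.read.yt(tmpPath)

    readBack.collect().toSet shouldEqual original.collect().toSet
  }
}
```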
))
}.toSeq.asJava
)
} else
I'd add curly braces for the else clause so it would be more explicit where the new WriteSerializationContext belongs.
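For illustration, a trivial sketch of the braced form, with stand-in arguments instead of the real ones:

```scala
// Minimal stand-in just to show the formatting point; the real
// WriteSerializationContext takes different arguments.
class WriteSerializationContext(serializerName: String)

object ElseBracesSketch {
  def context(arrowWriteEnabled: Boolean): WriteSerializationContext =
    // Braces on both branches make it explicit which expression the
    // new WriteSerializationContext belongs to.
    if (arrowWriteEnabled) {
      new WriteSerializationContext("arrow")
    } else {
      new WriteSerializationContext("wire")
    }
}
```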
❌ This pull request has been discarded.
Summary:

YTGetters<Row, List, Dict> is a new interface, which is going to be pushed to the YT-client. It works as a typeclass, while implementations of its inner abstract classes are responsible for adapting rows to the YT types. The typeclass approach did not work out so well, so I will probably rewrite this by pushing the generic parameters down to each abstract class and making them static.

ArrowTableRowsSerializer is an implementation of writing YT data via the Apache Arrow protocol. The data is received through the YT getters. It is also supposed to be moved to the YT-client at some point.

TableWriterBaseImpl is copied from the YT-client to shadow the original class and is patched to extend it with new functionality. It is also supposed to be moved to the YT-client.

The YTLogicalType interfaces are extended with methods providing YTGetters implementations, mapping Spark data to YT. This part is supposed to stay in our repository.

All tests in ComplexTypeV3Test pass, except the ones about nested dates. We will need to wait for a fix in YT-storage for that: https://github.com/ytsaurus/ytsaurus/pull/942/files
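To make the described layout easier to picture, here is a very rough sketch of the current typeclass shape and of the planned rewrite; only the structure follows the summary above, and all member names are assumptions:

```scala
// Current shape (per the summary): generic parameters on the outer typeclass,
// inner abstract classes adapt rows, lists and dicts to YT values.
abstract class YTGetters[Row, List, Dict] {
  abstract class FromStruct { def isNull(row: Row, column: Int): Boolean }
  abstract class FromList   { def size(list: List): Int }
  abstract class FromDict   { def size(dict: Dict): Int }
}

// Planned rewrite: the generic parameters move down to each abstract class,
// which then no longer needs an enclosing instance ("static" in Java terms).
object YTGettersRewritten {
  abstract class FromStruct[Row] { def isNull(row: Row, column: Int): Boolean }
  abstract class FromList[List]  { def size(list: List): Int }
  abstract class FromDict[Dict]  { def size(dict: Dict): Int }
}
```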