
Release 1.2.0

@pkolaczk pkolaczk released this 04 May 20:33

1.2.0

  • Removed conversion method from WriteOption which accepted an object of Duration type
    from Spark Streaming (SPARKC-106)
  • Fixed compilation warnings (SPARKC-76)
  • Fixed ScalaDoc warnings (SPARKC-119)
  • Synchronized TypeTag access in various places (SPARKC-123)
  • Added both the hostname and the host address as partition preferredLocations (SPARKC-126)

1.2.0 RC 3

  • Select aliases are no longer ignored in CassandraRow objects (SPARKC-109)
  • Fixed picking up the username and password from SparkConf (SPARKC-108)
  • Fixed creating CassandraConnectorSource in the executor environment (SPARKC-111)

1.2.0 RC 2

  • Cross cluster table join and write for Spark SQL (SPARKC-73)
  • Enabling / disabling metrics in metrics configuration file and other metrics fixes (SPARKC-91)
  • Provided a way to set custom auth config and connection factory properties (SPARKC-105)
  • Fixed setting a custom connection factory and other properties (SPARKC-102)
  • Fixed Java API (SPARKC-95)

1.2.0 RC 1

  • More Spark SQL predicate pushdown (SPARKC-72)
  • Fixed some Java API problems and refactored its internals (SPARKC-77)
  • Allowed specifying a column-to-property map (aliases) for reading and writing objects
    (SPARKC-9)
  • Added interface for doing primary key joins between arbitrary RDDs and Cassandra (SPARKC-25)
  • Added method for repartitioning an RDD based upon the replication of a Cassandra Table (SPARKC-25)
  • Fixed setting batch.level and batch.buffer.size in SparkConf (SPARKC-84)
    • Renamed output.batch.level to output.batch.grouping.key.
    • Renamed output.batch.buffer.size to output.batch.grouping.buffer.size.
    • Renamed batch grouping key option "all" to "none".
  • Error out on invalid config properties (SPARKC-90)
  • Set Java driver version to 2.1.5 and Cassandra to 2.1.3 (SPARKC-92)
  • Moved Spark streaming related methods from CassandraJavaUtil to CassandraStreamingJavaUtil
    (SPARKC-80)
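With the renames from SPARKC-84, a spark-defaults.conf fragment for batch grouping might look like the following; the values shown are illustrative, not recommendations:

```
# New key names as of 1.2.0 RC 1 (SPARKC-84); formerly
# output.batch.level and output.batch.buffer.size.
# The grouping key value formerly called "all" is now "none".
spark.cassandra.output.batch.grouping.key          partition
spark.cassandra.output.batch.grouping.buffer.size  1000
```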

1.2.0 alpha 3

  • Exposed spanBy and spanByKey in Java API (SPARKC-39)
  • Added automatic generation of a Cassandra table schema from a Scala type and
    saving an RDD to a new Cassandra table via the saveAsCassandraTable method (SPARKC-38)
  • Added support for write throughput limiting (SPARKC-57)
  • Added EmptyCassandraRDD (SPARKC-37)
  • Exposed authConf in CassandraConnector
  • Overrode the count() implementation in CassandraRDD to use the native Cassandra count (SPARKC-52)
  • Removed custom Logging class (SPARKC-54)
  • Added support for passing the limit clause to CQL in order to fetch top n results (SPARKC-31)
  • Added support for pushing down the order by clause to explicitly specify the order of rows
    within a Cassandra partition (SPARKC-32)
  • Fixed problems when rows are mapped to classes with inherited fields (SPARKC-70)
  • Support for compiling with Scala 2.10 and 2.11 (SPARKC-22)
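To illustrate what spanBy does, here is a cluster-free sketch of its grouping semantics in plain Scala. This is illustrative only: the real method operates on a CassandraRDD, not a Seq, and the function below is a stand-in, not the connector's implementation.

```scala
// Unlike groupBy, spanBy folds *consecutive* elements sharing a key into one
// span, so it needs no shuffle when rows already arrive ordered by partition
// key, as Cassandra returns them.
def spanBy[A, K](items: Seq[A])(key: A => K): Vector[(K, Vector[A])] =
  items.foldLeft(Vector.empty[(K, Vector[A])]) {
    // Current item continues the last span: append it to that span's group.
    case (init :+ ((k, group)), item) if key(item) == k =>
      init :+ ((k, group :+ item))
    // Key changed (or first item): start a new span.
    case (acc, item) =>
      acc :+ ((key(item), Vector(item)))
  }

// Rows in the order the database would return them:
val rows  = Seq(("p1", 1), ("p1", 2), ("p2", 3), ("p1", 4))
val spans = spanBy(rows)(_._1)
// "p1" yields two spans because spanBy groups consecutive runs,
// not all occurrences globally.
assert(spans.map(_._1) == Vector("p1", "p2", "p1"))
```

The design point is that a span needs only one pass over already-ordered data, which is why it is cheap for time-series-style Cassandra layouts.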

1.2.0 alpha 2

  • All connection properties can be set on SparkConf / CassandraConnectorConf objects and
    the settings are automatically distributed to Spark Executors (SPARKC-28)
  • Report Connector metrics to Spark metrics system (SPARKC-27)
  • Upgraded to Spark 1.2.1 (SPARKC-30)
  • Added conversion from java.util.Date to java.sql.Timestamp for Spark SQL (#512)
  • Upgraded to Scala 2.11 and added a cross-build for both Scala versions (SPARKC-22)
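Since connection properties can now be set on SparkConf and are shipped to executors automatically (SPARKC-28), a minimal spark-defaults.conf fragment might look like this; the host and credentials are placeholders:

```
spark.cassandra.connection.host  192.168.1.10
spark.cassandra.auth.username    cassandra
spark.cassandra.auth.password    cassandra
```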

1.2.0 alpha 1

  • Added support for TTL and timestamp in the writer (#153)
  • Added support for UDT column types (SPARKC-1)
  • Upgraded Spark to version 1.2.0 (SPARKC-15)
  • For the 1.2.0 release, table names containing a dot are not supported in Spark SQL;
    this will be fixed in the next release
  • Added fast spanBy and spanByKey methods to RDDs, useful for grouping Cassandra
    data by partition key / clustering columns, e.g. for time-series data (SPARKC-2)
  • Refactored the write path so that the writes are now token-aware (S