
Thread Rating: 498 Vote(s) - 3.46 Average
Cannot connect to Cassandra from Spark (Contact points contain multiple data centers)

#1
I am trying to run my first Spark job (a Scala job that accesses Cassandra), and it fails with the following error:

java.io.IOException: Failed to open native connection to Cassandra at {<ip>}:9042
at com.datastax.spark.connector.cql.CassandraConnector$.com$datastax$spark$connector$cql$CassandraConnector$$createSession(CassandraConnector.scala:164)
at com.datastax.spark.connector.cql.CassandraConnector$$anonfun$2.apply(CassandraConnector.scala:150)
at com.datastax.spark.connector.cql.CassandraConnector$$anonfun$2.apply(CassandraConnector.scala:150)
at com.datastax.spark.connector.cql.RefCountedCache.createNewValueAndKeys(RefCountedCache.scala:31)
...
Caused by: java.lang.IllegalArgumentException: Contact points contain multiple data centers:
at com.datastax.spark.connector.cql.LocalNodeFirstLoadBalancingPolicy.init(LocalNodeFirstLoadBalancingPolicy.scala:47)
at com.datastax.driver.core.Cluster$Manager.init(Cluster.java:1099)
at com.datastax.driver.core.Cluster.getMetadata(Cluster.java:271)
at com.datastax.spark.connector.cql.CassandraConnector$.com$datastax$spark$connector$cql$CassandraConnector$$createSession(CassandraConnector.scala:157)

What are we doing wrong here?

I am using:

- Spark 1.5.2
- Apache Cassandra 2.1.10
- spark-cassandra-connector 1.3.1 / 1.5.0-M2 (tried both connectors)
- Scala version 2.10.4

Reply

#2
According to the author, there is work in progress to fix this; see the comments below this answer.

I found this in the documentation; I hope it helps:

override def init(cluster: Cluster, hosts: JCollection[Host]) {
  nodes = hosts.toSet
  // use explicitly set DC if available, otherwise see if all contact points have same DC
  // if so, use that DC; if not, throw an error
  dcToUse = localDC match {
    case Some(local) => local
    case None =>
      val dcList = dcs(nodesInTheSameDC(contactPoints, hosts.toSet))
      if (dcList.size == 1)
        dcList.head
      else
        throw new IllegalArgumentException(
          s"Contact points contain multiple data centers: ${dcList.mkString(", ")}")
  }
  clusterMetadata = cluster.getMetadata
}
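The check above can be illustrated with a small, self-contained sketch (hypothetical names, not the connector's actual API): when no local DC is set explicitly, a data center is only accepted if every contact point resolves to the same one.

```java
import java.util.HashMap;
import java.util.Map;
import java.util.Set;
import java.util.TreeSet;

public class DcCheck {
    // Miniature of the connector's logic: given the data center of each
    // contact point, use that DC only if it is unambiguous.
    static String pickDataCenter(Map<String, String> contactPointDcs) {
        Set<String> dcs = new TreeSet<>(contactPointDcs.values());
        if (dcs.size() == 1) {
            return dcs.iterator().next();
        }
        throw new IllegalArgumentException(
            "Contact points contain multiple data centers: " + String.join(", ", dcs));
    }

    public static void main(String[] args) {
        Map<String, String> sameDc = new HashMap<>();
        sameDc.put("10.0.0.1", "DC1");
        sameDc.put("10.0.0.2", "DC1");
        System.out.println(pickDataCenter(sameDc)); // prints DC1

        Map<String, String> mixed = new HashMap<>();
        mixed.put("10.0.0.1", "DC1");
        mixed.put("10.0.1.1", "DC2");
        try {
            pickDataCenter(mixed);
        } catch (IllegalArgumentException e) {
            // same shape as the error in the question
            System.out.println(e.getMessage());
        }
    }
}
```

So the two ways out are: pass contact points from a single DC, or set the local DC explicitly so the `localDC match` takes the first branch.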
Reply

#3
I was facing the same issue while trying to connect to two Cassandra data centers using Apache Spark 2.x.x.

import org.apache.spark.SparkConf;
import org.apache.spark.sql.SparkSession;

public class SparkCassandraTest {
    // Contact points spanning two data centers -- this triggers the error above
    private static final String CASSANDRA_ENDPOINTS =
            "DC1_node1,DC1_node2,DC1_node3,DC2_node1,DC2_node2,DC2_node3";
    private static final String APP_NAME = "SparkCassandraTest";       // placeholder
    private static final String CASSANDRA_USERNAME = "cassandra";      // placeholder
    private static final String CASSANDRA_PASSWORD = "cassandra";      // placeholder

    public static void main(String[] args) {
        SparkConf sparkConf = new SparkConf().setAppName(APP_NAME);
        sparkConf.set("spark.cassandra.connection.host", CASSANDRA_ENDPOINTS);
        sparkConf.set("spark.cassandra.auth.username", CASSANDRA_USERNAME);
        sparkConf.set("spark.cassandra.auth.password", CASSANDRA_PASSWORD);
        SparkSession sparkSession = SparkSession.builder()
                .config(sparkConf)
                .enableHiveSupport()
                .getOrCreate();

        //.....................
    }
}




> Caused by: java.lang.IllegalArgumentException: requirement failed: Contact points contain multiple data centers: DC2-XXXXX2, DC1-XXXXX1

I resolved this issue by connecting to nodes from only one Cassandra data center, either (**DC1_node1**, **DC1_node2**, **DC1_node3**) or (**DC2_node1**, **DC2_node2**, **DC2_node3**).
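For reference, both workarounds can be expressed as connector settings, e.g. in spark-defaults.conf (a sketch; `spark.cassandra.connection.local_dc` is the documented property in recent spark-cassandra-connector versions and may not be available in older ones):

```
# Option 1: list contact points from a single data center only
spark.cassandra.connection.host      DC1_node1,DC1_node2,DC1_node3

# Option 2: set the local data center explicitly, which bypasses the
# "same DC for all contact points" check shown in answer #2
spark.cassandra.connection.host      DC1_node1,DC1_node2,DC2_node1
spark.cassandra.connection.local_dc  DC1
```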
Reply




© 0Day 2016 - 2023 | All Rights Reserved.