Tag Archive for apache-flink

Fresh installation of Apache Flink crashes with “Response was not a valid JSON” for hello-world examples

After a fresh installation of Apache Flink (that is, fetching the binary distribution with all default settings), running a simple hello-world example from the tutorial page with bin/flink run examples/batch/WordCount.jar crashes with “Response was not a valid JSON”. The same error occurs for simple commands such as flink list. The issue persists across Flink 1.16.1, 1.18.1, 1.19.1, and 1.20.1, and for the other hello-world examples in the binary distribution.

How to make two sinks work in a transaction in Flink

I have a Flink pipeline with two sinks: one publishes messages to Kafka and the other updates a status record in a database. I am aiming for exactly-once semantics using Kafka transactions and XA JDBC transactions. The issue is that when the database is down and the Kafka broker is up, Flink still commits the messages to the broker (I verified that my consumer reads only committed messages), even when the database does not come back up after the configured number of Flink retries. My requirement is that the message is sent and the corresponding record is updated exactly once. How can I achieve that?
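For reference, a minimal sketch of how such a pipeline is typically wired, assuming Flink's KafkaSink with DeliveryGuarantee.EXACTLY_ONCE and the JDBC connector's JdbcSink.exactlyOnceSink backed by an XADataSource; the broker address, topic name, SQL statement, and data-source supplier are placeholder assumptions, and checkpointing is assumed to be enabled. Each sink coordinates its own transaction with the checkpoint, so there is no single atomic commit spanning both sinks.

import javax.sql.XADataSource;

import org.apache.flink.api.common.serialization.SimpleStringSchema;
import org.apache.flink.connector.base.DeliveryGuarantee;
import org.apache.flink.connector.jdbc.JdbcExactlyOnceOptions;
import org.apache.flink.connector.jdbc.JdbcExecutionOptions;
import org.apache.flink.connector.jdbc.JdbcSink;
import org.apache.flink.connector.kafka.sink.KafkaRecordSerializationSchema;
import org.apache.flink.connector.kafka.sink.KafkaSink;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.util.function.SerializableSupplier;

public class TwoSinkSketch {

    // Sketch only: attaches two exactly-once sinks to the same stream.
    // Both sinks pre-commit on checkpoint and commit when the checkpoint completes,
    // but the Kafka transaction and the XA database transaction remain separate.
    static void attachSinks(DataStream<String> stream,
                            SerializableSupplier<XADataSource> xaDataSourceSupplier) {

        KafkaSink<String> kafkaSink = KafkaSink.<String>builder()
                .setBootstrapServers("broker:9092")                 // placeholder address
                .setRecordSerializer(KafkaRecordSerializationSchema.builder()
                        .setTopic("status-events")                  // placeholder topic
                        .setValueSerializationSchema(new SimpleStringSchema())
                        .build())
                .setDeliveryGuarantee(DeliveryGuarantee.EXACTLY_ONCE)
                .setTransactionalIdPrefix("status-pipeline")
                .build();
        stream.sinkTo(kafkaSink);

        // Exactly-once JDBC sink; the supplier must build an XADataSource for the target database.
        stream.addSink(JdbcSink.exactlyOnceSink(
                "INSERT INTO status_log (message) VALUES (?)",      // placeholder SQL
                (stmt, value) -> stmt.setString(1, value),
                JdbcExecutionOptions.builder().build(),
                JdbcExactlyOnceOptions.defaults(),
                xaDataSourceSupplier));
    }
}

This only reproduces the setup the question describes; whether the Kafka publish and the database update can be made atomic across both sinks is exactly the open question above.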

Using Flink with Flink CDC reports an error: org.apache.flink.shaded.guava31.com.google.common.util.concurrent.ThreadFactoryBuilder

When I used Flink CDC to migrate data to Oracle and SQL Server, the Flink version was 1.16.0 and the Flink CDC version was 3.2.0. With flink-shaded-guava 31.1-jre-17.0, the reported error referenced org.apache.flink.shaded.guava30.com.google.common.collect.Lists. But with flink-shaded-guava 30.1.1-jre-15.0, the reported error referenced org.apache.flink.shaded.guava31.com.google.common.util.concurrent.ThreadFactoryBuilder. The specific information is as follows:
