Commit a1dc525

Brian Strauch authored
remove sr api key prompt (#1664)
1 parent 0943d94 commit a1dc525

8 files changed: +6 -40 lines changed


‎_includes/shared/markup/ccloud/ccloud-sr-consume.adoc‎

Lines changed: 0 additions & 7 deletions
This file was deleted.

Lines changed: 0 additions & 8 deletions
@@ -1,9 +1 @@
-You will be prompted for the Confluent Cloud Schema Registry credentials as shown below, which you can find in the `configuration/ccloud.properties` configuration file.
-Look for the configuration parameter `basic.auth.user.info`, whereby the ":" is the delimiter between the key and secret.
-
-```
-Enter your Schema Registry API key:
-Enter your Schema Registry API secret:
-```
-
 When the console producer starts, it will log some messages and hang, waiting for your input. Type in one line at a time and press enter to send it. Each line represents an event. To send all of the events below, paste the following into the prompt and press enter:
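
The prompt removed above referred to credentials that the tutorial stores in `configuration/ccloud.properties`: the `basic.auth.user.info` parameter holds the Schema Registry API key and secret, with ":" as the delimiter. A minimal, hypothetical sketch of looking that value up from a shell (the key and secret shown are placeholders):

```
# Show the Schema Registry credentials from the generated client configuration.
# The value uses ":" as the delimiter between the API key and the API secret.
grep basic.auth.user.info configuration/ccloud.properties
# expected output (hypothetical values):
# basic.auth.user.info=SR_API_KEY:SR_API_SECRET
```
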
Lines changed: 2 additions & 11 deletions
@@ -1,18 +1,9 @@
-Next, let's open up a consumer to read records from the new topic.
+Next, let's open up a consumer to read records from the new topic.
 
-From the same terminal you used to create the topic above, run the following command to start a console consumer with the `ccloud` CLI:
+From the same terminal you used to create the topic above, run the following command to start a console consumer with the Confluent CLI:
 
 +++++
 <pre class="snippet"><code class="shell">{% include_raw tutorials/console-consumer-producer-avro/confluent/code/tutorial-steps/dev/harness-console-consumer-keys.sh %}</code></pre>
 +++++
 
-You will be prompted for the Confluent Cloud Schema Registry credentials as shown below.
-Enter the values you got from when you enabled Schema Registry in the Confluent Cloud Console.
-
-```
-Enter your Schema Registry API key:
-Enter your Schema Registry API secret:
-```
-
 The consumer will start up and block waiting for records, you won't see any output until after the next step.
-
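
The consumer command itself is pulled in from the `harness-console-consumer-keys.sh` script referenced above. A rough, hypothetical sketch of the kind of Confluent CLI consumer such a step runs (the topic name is a placeholder; the exact flags are whatever the included script uses):

```
# Read records from the beginning of the topic with the Confluent CLI.
# The topic name is a hypothetical placeholder; see harness-console-consumer-keys.sh for the real invocation.
confluent kafka topic consume my-topic --from-beginning
```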

‎_includes/tutorials/creating-first-apache-kafka-streams-application/confluent/markup/dev/run-consumer.adoc‎

Lines changed: 1 addition & 1 deletion
@@ -1,4 +1,4 @@
-Now that the Kafka Streams application is running, run a command line consumer using the `ccloud` CLI to view the events (your `ccloud` context should be set to the proper environment, cluster, and API Key (see Step 4 above and https://docs.confluent.io/ccloud-cli/current/command-reference/index.html[Confluent CLI Reference] for additional details).
+Now that the Kafka Streams application is running, run a command line consumer using the Confluent CLI to view the events (your `confluent` context should be set to the proper environment, cluster, and API Key (see Step 4 above and https://docs.confluent.io/confluent-cli/current/command-reference/overview.html[Confluent CLI Reference] for additional details).
 
 Then, in a new terminal window, run the following console consumer to view the events being generated by the data generator and produced to the `random-strings` topic from the `Randomizer` class in your Kafka Streams application. These are the events that have been streamed into the topology (`.stream(inputTopic, Consumed.with(stringSerde, stringSerde)`).
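
Setting the `confluent` context means pointing the CLI at the environment, cluster, and API key used by the tutorial. A hypothetical sketch (IDs and the key are placeholders; flag names can differ between Confluent CLI versions):

```
# Select the environment and cluster, then the API key used to authenticate against that cluster.
# All IDs and the key are hypothetical placeholders.
confluent environment use env-123456
confluent kafka cluster use lkc-abc123
confluent api-key use MY_API_KEY --resource lkc-abc123
```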

‎_includes/tutorials/finding-distinct/confluent/markup/dev/ccloud-run-produce.adoc‎

Lines changed: 1 addition & 9 deletions
@@ -4,14 +4,6 @@ In a new terminal window, run the following command to start a Confluent CLI prod
 <pre class="snippet"><code class="bash">{% include_raw tutorials/finding-distinct/confluent/code/tutorial-steps/dev/ccloud-produce-events.sh %}</code></pre>
 +++++
 
-You will be prompted for the Confluent Cloud Schema Registry credentials as shown below, which you can find in the `configuration/ccloud.properties` configuration file.
-Look for the configuration parameter `basic.auth.user.info`, whereby the ":" is the delimiter between the key and secret.
-
-```
-Enter your Schema Registry API key:
-Enter your Schema Registry API secret:
-```
-
 When the producer starts, it will log some messages and hang, waiting for your input. Each line represents input data for the Kafka Streams application.
 To send all of the events below, paste the following into the prompt and press enter:
 
@@ -21,4 +13,4 @@ To send all of the events below, paste the following into the prompt and press e
 
 Enter `Ctrl-C` to exit.
 
-In the next steps we will run a consumer to observe the distinct click events. You can experiment with various orderings of the records in order to observe what makes a click event distinct. By default the distinct event window store looks for distinct clicks over a 2-minute duration.
+In the next steps we will run a consumer to observe the distinct click events. You can experiment with various orderings of the records in order to observe what makes a click event distinct. By default the distinct event window store looks for distinct clicks over a 2-minute duration.
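
The produce step runs the command included from `ccloud-produce-events.sh`. A hypothetical sketch of the kind of interactive Confluent CLI producer it starts (the topic name is a placeholder):

```
# Start an interactive producer; each line typed or pasted becomes one record.
# The topic name is a hypothetical placeholder; the real command is in ccloud-produce-events.sh.
confluent kafka topic produce clicks
```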

‎_includes/tutorials/joining-stream-table/confluent/markup/dev/ccloud-produce-movies.adoc‎

Lines changed: 1 addition & 1 deletion
@@ -10,4 +10,4 @@ include::_includes/shared/markup/ccloud/ccloud-sr-produce.adoc[]
 <pre class="snippet"><code class="json">{% include_raw tutorials/joining-stream-table/kstreams/code/tutorial-steps/dev/movies.json %}</code></pre>
 +++++
 
-In this case the table data originates from a Kafka topic that was populated by a console producer using `ccloud` CLI but this doesn't always have to be the case. You can use Kafka Connect to stream data from a source system (such as a database) into a Kafka topic, which could then be the foundation for a lookup table. For further reading checkout this tutorial on link:{{ "connect-add-key-to-source/kstreams.html" | relative_url }}[creating a Kafka Streams table from SQLite data using Kafka Connect].
+In this case the table data originates from a Kafka topic that was populated by a console producer using the Confluent CLI but this doesn't always have to be the case. You can use Kafka Connect to stream data from a source system (such as a database) into a Kafka topic, which could then be the foundation for a lookup table. For further reading checkout this tutorial on link:{{ "connect-add-key-to-source/kstreams.html" | relative_url }}[creating a Kafka Streams table from SQLite data using Kafka Connect].

‎_includes/tutorials/kafka-connect-datagen/confluent/markup/dev/check-connector.adoc‎

Lines changed: 1 addition & 1 deletion
@@ -1,6 +1,6 @@
 To check the status of the connector from the command line, you have the same two options as provisioning.
 
-*Option 1.* Using the `ccloud` CLI.
+*Option 1.* Using the Confluent CLI.
 
 +++++
 <pre class="snippet"><code class="shell">{% include_raw tutorials/kafka-connect-datagen/confluent/code/tutorial-steps/dev/check-connector.sh %}</code></pre>
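
The actual status check lives in the included `check-connector.sh`. A rough, hypothetical sketch of checking connector status with the Confluent CLI (subcommand layout varies across CLI versions; see the included script for the command the tutorial uses):

```
# List connectors and their status in the current Kafka cluster.
# Hypothetical sketch; subcommands vary by Confluent CLI version.
confluent connect list
```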

‎_includes/tutorials/session-windows/confluent/markup/dev/ccloud-run-consumer.adoc‎

Lines changed: 0 additions & 2 deletions
@@ -4,8 +4,6 @@ Now that your Kafka Streams application is running, open a new terminal window,
 confluent kafka topic consume output-topic --from-beginning --print-key
 ```
 
-include::_includes/shared/markup/ccloud/ccloud-sr-consume.adoc[]
-
 Your results should look something like this:
 ++++
 <pre class="snippet"><code class="shell">
