Commit 8deb6c6

MINOR: Remove SPAM URL in Streams Documentation (#20321)
The previous URL http://lambda-architecture.net/ seems to now be controlled by spammers.

Co-authored-by: Shashank <[email protected]>
Reviewers: Mickael Maison <[email protected]>
1 parent: ba97558

File tree: 1 file changed (+1, -1)

docs/streams/core-concepts.html

Lines changed: 1 addition & 1 deletion
@@ -279,7 +279,7 @@ <h2 class="anchor-heading"><a id="streams_processing_guarantee" class="anchor-li
     <p>
         In stream processing, one of the most frequently asked question is "does my stream processing system guarantee that each record is processed once and only once, even if some failures are encountered in the middle of processing?"
         Failing to guarantee exactly-once stream processing is a deal-breaker for many applications that cannot tolerate any data-loss or data duplicates, and in that case a batch-oriented framework is usually used in addition
-        to the stream processing pipeline, known as the <a href="http://lambda-architecture.net/">Lambda Architecture</a>.
+        to the stream processing pipeline, known as the <a href="https://en.wikipedia.org/wiki/Lambda_architecture">Lambda Architecture</a>.
         Prior to 0.11.0.0, Kafka only provides at-least-once delivery guarantees and hence any stream processing systems that leverage it as the backend storage could not guarantee end-to-end exactly-once semantics.
         In fact, even for those stream processing systems that claim to support exactly-once processing, as long as they are reading from / writing to Kafka as the source / sink, their applications cannot actually guarantee that
         no duplicates will be generated throughout the pipeline.<br />
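For context on the passage being edited: since 0.11.0.0, Kafka Streams can provide exactly-once processing, which is enabled through the `processing.guarantee` configuration. A minimal sketch follows, using plain `java.util.Properties` with string keys so it is self-contained; the application id and broker address are hypothetical placeholders, and `exactly_once_v2` requires brokers on Kafka 2.5 or newer (older setups use `exactly_once`).

```java
import java.util.Properties;

public class ExactlyOnceConfig {
    public static void main(String[] args) {
        Properties props = new Properties();
        // Kafka Streams config key for the processing guarantee; the default
        // is "at_least_once". "exactly_once_v2" needs brokers on 2.5+.
        props.put("processing.guarantee", "exactly_once_v2");
        props.put("application.id", "my-streams-app");      // hypothetical app id
        props.put("bootstrap.servers", "localhost:9092");   // hypothetical broker
        // These Properties would normally be passed to new KafkaStreams(topology, props).
        System.out.println(props.getProperty("processing.guarantee"));
    }
}
```

In a real application the same keys are usually referenced via the `StreamsConfig` constants rather than string literals.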
