Commit b010437

docs: add changelog entry for version 3.14.0
1 parent dd57d3f commit b010437

5 files changed: 238 additions & 153 deletions

archipy/adapters/kafka/adapters.py

Lines changed: 2 additions & 1 deletion
```diff
@@ -521,11 +521,12 @@ def produce(self, message: str | bytes, key: str | None = None) -> None:
         """
         try:
             processed_message = self._pre_process_message(message)
+            processed_key = self._pre_process_message(key)
             self._adapter.produce(
                 topic=self._topic_name,
                 value=processed_message,
                 callback=self._delivery_callback,
-                key=key,
+                key=processed_key,
             )
         except Exception as e:
             self._handle_producer_exception(e, "produce")
```

docs/changelog.md

Lines changed: 63 additions & 2 deletions
```diff
@@ -2,24 +2,85 @@

 All notable changes to ArchiPy are documented in this changelog, organized by version.

+## [3.14.0] - 2025-10-26
+
+### Added
+
+#### gRPC Application Creation Utilities
+
+- **gRPC App Creation** - Added comprehensive gRPC application creation utilities for both sync and async servers
+  - Added `AppUtils.create_async_grpc_app()` method for async gRPC server creation with interceptor support
+  - Added `AppUtils.create_grpc_app()` method for synchronous gRPC server creation
+  - Implemented automatic setup of exception, tracing, and metric interceptors
+  - Added `GrpcAPIUtils` class with setup methods for trace and metric interceptors for sync gRPC servers
+  - Added `AsyncGrpcAPIUtils` class with setup methods for trace and metric interceptors for async gRPC servers
+  - Integrated Prometheus metric collection with configurable HTTP server port
+  - Enhanced optional import handling for gRPC dependencies with proper graceful degradation
+  - Configured ThreadPoolExecutor with configurable worker count and server options
+  - Support for custom interceptors and compression settings
+
+#### Prometheus Metrics Support
+
+- **Metric Collection** - Added Prometheus metrics integration for gRPC servers
+  - Automatic metric interceptor setup when Prometheus is enabled in configuration
+  - Configurable HTTP server for metrics endpoint exposure
+  - Integrated metric collection for both sync and async gRPC servers
+  - Enhanced observability with automatic Prometheus client initialization
```
```diff
+### Changed
+
+#### Kafka Producer Enhancements
+
+- **Key Parameter Support** - Enhanced Kafka producer with proper key encoding support
+  - Added optional `key` parameter to `KafkaProducerPort.produce()` method signature
+  - Implemented proper UTF-8 encoding for message keys using `_pre_process_message()` helper
+  - Ensures consistent handling of both string and bytes keys in message production
+  - Improved key/value consistency in Kafka message production workflow
+
+#### Cache Decorator Optimization
+
+- **Lazy Import Optimization** - Optimized TTL cache decorator import strategy
+  - Moved `cachetools.TTLCache` import inside the decorator function to prevent global import issues
+  - Improved module initialization performance by avoiding heavy dependencies at import time
+  - Maintained backward compatibility while improving startup time
+  - Enhanced import cleanliness and reduced initialization overhead
```
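The lazy-import change means `cachetools` is only imported when the decorator is actually applied, not when the decorators module loads. A sketch of the pattern, assuming a `ttl_cache(maxsize, ttl)` decorator factory (the signature and cache logic here are illustrative; ArchiPy's decorator may differ):

```python
def ttl_cache(maxsize=128, ttl=300):
    """Decorator factory whose heavy dependency is imported only when the
    decorator is applied, not when this module is imported.

    Illustrative sketch of the lazy-import pattern from the changelog.
    """
    def decorator(func):
        # Deferred import: a module-level `import cachetools` would make
        # the whole package pay the import cost even for users who never
        # use caching.
        from cachetools import TTLCache

        cache = TTLCache(maxsize=maxsize, ttl=ttl)

        def wrapper(*args):
            if args in cache:
                return cache[args]  # fresh entry within its TTL
            result = func(*args)
            cache[args] = result
            return result

        return wrapper

    return decorator
```

Importing the module and even calling `ttl_cache(...)` stays dependency-free; the import only happens at decoration time, which is what preserves startup time.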
```diff
+### Fixed
+
+#### Kafka Producer Key Processing
+
+- **Key Encoding Fix** - Fixed issue where message keys were not being properly processed
+  - Applied `_pre_process_message()` to key parameter in `produce()` method for proper encoding
+  - Corrected key handling to match message value processing behavior
+  - Resolved potential encoding errors when using string keys in Kafka message production
+  - Enhanced BDD test coverage with proper key verification scenarios
+
+#### Import Cleanup
+
+- **Module Organization** - Improved import structure across multiple modules
+  - Fixed unnecessary imports in Keycloak and MinIO adapters
+  - Enhanced import cleanup in decorators module
+  - Improved code organization and reduced import overhead
+
 ## [3.13.10] - 2025-10-20

 ### Changed

 #### Dependency Updates

-- **Comprehensive Dependency Synchronization** - Updated multiple core dependencies to latest versions for improved security and performance
+- **Comprehensive Dependency Synchronization** - Updated multiple core dependencies to latest versions for improved
+  security and performance
   - Updated aiohttp from 3.13.0 to 3.13.1 for enhanced async HTTP client capabilities and bug fixes
   - Updated cryptography from 46.0.2 to 46.0.3 for improved cryptographic security and performance
   - Updated elastic-transport from 9.1.0 to 9.2.0 for enhanced Elasticsearch connectivity and reliability
   - Updated mkdocs-material from 9.6.21 to 9.6.22 for improved documentation rendering and Material theme features
+  - Updated mkdocs-material from 9.6.21 to 9.6.22 for improved documentation rendering and Material theme features
   - Updated protobuf from 6.32.1 to 6.33.0 for enhanced Protocol Buffers support and performance
   - Updated pydantic from 2.12.2 to 2.12.3 for improved data validation and type safety
   - Updated pytokens from 0.1.10 to 0.2.0 for enhanced token processing capabilities
   - Updated ruff from 0.14.0 to 0.14.1 for improved linting capabilities and bug fixes
   - Updated wrapt from 1.17.3 to 2.0.0 for enhanced function wrapping capabilities

-
 ## [3.13.9] - 2025-10-15

 ### Improved
```

features/kafka_adapters.feature

Lines changed: 6 additions & 5 deletions
```diff
@@ -17,18 +17,19 @@ Feature: Kafka Adapter Operations Testing
     Given a Kafka producer for topic "test-topic"
     And a Kafka consumer subscribed to topic "test-topic" with group "test-group"
     When I produce a message "Hello Kafka" to topic "test-topic"
-    Then the consumer should receive message "Hello Kafka" from topic "test-topic"
+    Then the consumer should receive message "Hello Kafka" from topic "test-topic" with group "test-group"

   Scenario: Validate producer health
     Given a Kafka producer for topic "test-topic"
     When I validate the producer health
     Then the producer health check should pass

   Scenario: Produce message with additional parameters
-    Given a Kafka producer for topic "test-topic"
-    And a Kafka consumer subscribed to topic "test-topic" with group "test-group"
-    When I produce one message "Hello Kafka with key" with key "test-key" to topic "test-topic"
-    Then the consumer should receive message "Hello Kafka with key" from topic "test-topic"
+    Given a test topic named "test-topic2"
+    And a Kafka producer for topic "test-topic2"
+    And a Kafka consumer subscribed to topic "test-topic2" with group "test-group2"
+    When I produce one message "Hello Kafka with key" with key "test-key" to topic "test-topic2"
+    Then the consumer should receive message "Hello Kafka with key" from topic "test-topic2" with group "test-group2"

   Scenario: Delete a topic
     Given a topic named "test-topic-deletable" exists
```

features/steps/kafka_adapter_steps.py

Lines changed: 3 additions & 3 deletions
```diff
@@ -226,9 +226,9 @@ def step_topic_list_includes(context, topic_name):
         raise AssertionError(f"Topic '{topic_name}' not in topic list after retries")


-@then('the consumer should receive message "{expected_message}" from topic "{topic_name}"')
-def step_consumer_receive(context, expected_message, topic_name):
-    adapter = get_kafka_consumer_adapter(context, topic_name, "test-group")
+@then('the consumer should receive message "{expected_message}" from topic "{topic_name}" with group "{group_id}"')
+def step_consumer_receive(context, expected_message, topic_name, group_id):
+    adapter = get_kafka_consumer_adapter(context, topic_name, group_id)
     try:
         messages = adapter.batch_consume(messages_number=1, timeout=2)
         assert len(messages) > 0, "No messages received"
```