Posts Tagged ‘Java’

A Tricky Java Question

Here’s a super tricky Java interview question that messes with developer intuition:

❓ Weird Question:

“What will be printed when executing the following code?”

import java.util.*;

public class TrickyJava {
    public static void main(String[] args) {
        List<String> list = Arrays.asList("T-Rex", "Velociraptor", "Dilophosaurus");
        list.replaceAll(s -> s.toUpperCase());
        System.out.println(list);
    }
}

The Trap:

At first glance, everything looks normal:

  • Arrays.asList(...) creates a List.
  • replaceAll(...) is a List method that replaces each element with the result of applying a function.
  • Each string is converted to uppercase.

Most developers will expect this output:

[T-REX, VELOCIRAPTOR, DILOPHOSAURUS]

But surprise! This code sometimes throws an UnsupportedOperationException.

 

✅ Correct Answer:

The output depends on the JVM implementation!

It might work and print:

[T-REX, VELOCIRAPTOR, DILOPHOSAURUS]

Or it might crash with:

Exception in thread "main" java.lang.UnsupportedOperationException
at java.util.AbstractList$Itr.remove(AbstractList.java:572)
at java.util.AbstractList.remove(AbstractList.java:212)
at java.util.AbstractList$ListItr.remove(AbstractList.java:582)
at java.util.List.replaceAll(List.java:500)

Why?

  • Arrays.asList(...) does not return a regular ArrayList, but rather a fixed-size list backed by the original array.
  • The replaceAll(...) method attempts to modify the list in-place, which is not allowed for a fixed-size list.
  • Some JVM implementations optimize this internally, making it work, but it is not guaranteed to succeed.

Key Takeaways

  • Arrays.asList(...) returns a fixed-size list, not a modifiable ArrayList.
  • Modifying it directly (e.g., add(), remove(), replaceAll()) can fail with UnsupportedOperationException (see the sketch below).
  • Behavior depends on the JVM implementation and internal optimizations.
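
To make the fixed-size behavior concrete, here is a minimal sketch (the class and variable names are ours, not part of the original question) showing that positional updates succeed while structural changes fail:

import java.util.Arrays;
import java.util.List;

public class FixedSizeListDemo {
    public static void main(String[] args) {
        List<String> dinos = Arrays.asList("T-Rex", "Velociraptor");

        // Replacing an element at an existing index is allowed: the backing array keeps its size.
        dinos.set(0, "T-REX");

        // Structural changes are rejected, because the list cannot grow or shrink.
        try {
            dinos.add("Dilophosaurus");
        } catch (UnsupportedOperationException e) {
            System.out.println("add() failed: " + e);
        }

        System.out.println(dinos); // [T-REX, Velociraptor]
    }
}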

How to Fix It?

To ensure safe modification, wrap the list in a mutable ArrayList:

List<String> list = new ArrayList<>(Arrays.asList("T-Rex", "Velociraptor", "Dilophosaurus"));
list.replaceAll(s -> s.toUpperCase());
System.out.println(list); // ✅ Always works: the copy is a regular, modifiable ArrayList
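
Alternatively, if in-place mutation is not actually required, a stream pipeline builds a brand-new uppercase list and leaves the original untouched (this variant assumes Java 16+ for Stream.toList()):

List<String> upper = Arrays.asList("T-Rex", "Velociraptor", "Dilophosaurus")
        .stream()
        .map(String::toUpperCase)
        .toList(); // returns a new, unmodifiable list
System.out.println(upper); // [T-REX, VELOCIRAPTOR, DILOPHOSAURUS]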

Understanding Dependency Management and Resolution: A Look at Java, Python, and Node.js

Mastering how dependencies are handled can define your project’s success or failure. Let’s explore the nuances across today’s major development ecosystems.

Introduction

Every modern application relies heavily on external libraries. These libraries accelerate development, improve security, and enable integration with third-party services. However, unmanaged dependencies can lead to catastrophic issues — from version conflicts to severe security vulnerabilities. That’s why understanding dependency management and resolution is absolutely essential, particularly across different programming ecosystems.

What is Dependency Management?

Dependency management involves declaring external components your project needs, installing them properly, ensuring their correct versions, and resolving conflicts when multiple components depend on different versions of the same library. It also includes updating libraries responsibly and securely over time. In short, good dependency management prevents issues like broken builds, “dependency hell”, or serious security holes.

Java: Maven and Gradle

In the Java ecosystem, dependency management is an integrated and structured part of the build lifecycle, using tools like Maven and Gradle.

Maven and Dependency Scopes

Maven uses a declarative pom.xml file to list dependencies. A particularly important notion in Maven is the dependency scope.

Scopes control where and how dependencies are used. Examples include:

  • compile (default): Needed at both compile time and runtime.
  • provided: Needed at compile time, but supplied at runtime by the environment (e.g., the Servlet API in a container).
  • runtime: Needed only at runtime, not at compile time.
  • test: Used exclusively for testing (JUnit, Mockito, etc.).
  • system: Resolved from an explicit local path on the machine (a deprecated practice).

<dependency>
  <groupId>junit</groupId>
  <artifactId>junit</artifactId>
  <version>4.13.2</version>
  <scope>test</scope>
</dependency>
    

This nuanced control lets Java developers avoid bloating production artifacts with unnecessary libraries and fine-tune build behavior, a level of granularity that simpler systems like pip or npm do not offer.

Gradle

Gradle, offering both Groovy and Kotlin DSLs, also supports scopes through configurations like implementation, runtimeOnly, testImplementation, which have similar meanings to Maven scopes but are even more flexible.


dependencies {
    implementation 'org.springframework.boot:spring-boot-starter'
    testImplementation 'org.springframework.boot:spring-boot-starter-test'
}
    

Python: pip and Poetry

Python dependency management is simpler, but also less structured compared to Java. With pip, there is no formal concept of scopes.

pip

Developers typically separate main dependencies and development dependencies manually using different files:

  • requirements.txt – Main project dependencies.
  • requirements-dev.txt – Development and test dependencies (pytest, tox, etc.).

This manual split is prone to human error and lacks the rigorous environment control that Maven or Gradle enforce.

Poetry

Poetry improves the situation by introducing a structured division:


[tool.poetry.dependencies]
requests = "^2.31"

[tool.poetry.dev-dependencies]
pytest = "^7.1"
    

Poetry brings concepts closer to Maven scopes, but they are still less fine-grained (no runtime/compile distinction, for instance).

Node.js: npm and Yarn

JavaScript dependency managers like npm and yarn allow a simple distinction between regular and development dependencies.

npm

Dependencies are declared in package.json under different sections:

  • dependencies – Needed in production.
  • devDependencies – Needed only for development (e.g., testing libraries, linters).

{
  "dependencies": {
    "express": "^4.18.2"
  },
  "devDependencies": {
    "mocha": "^10.2.0"
  }
}
    

While convenient, npm’s dependency management lacks Maven’s level of strictness around dependency resolution, often leading to version mismatches or “node_modules bloat.”

Key Differences Between Ecosystems

When switching between Java, Python, and Node.js environments, developers must be aware of the following fundamental differences:

1. Formality of Scopes

Java’s Maven/Gradle ecosystem defines scopes formally at the dependency level. Python (pip) and JavaScript (npm) ecosystems use looser, file- or section-based categorization.

2. Handling of Transitive Dependencies

Maven and Gradle resolve and include transitive dependencies automatically, with conflict resolution strategies such as Maven's nearest-wins rule. pip historically handled transitive dependencies weakly, leading to issues unless versions are pinned carefully. npm has improved how it flattens nested modules over successive major versions, but conflicts still occur in complex dependency trees.

3. Lockfiles

npm/yarn and Python Poetry use lockfiles (package-lock.json, yarn.lock, poetry.lock) to ensure consistent dependency installations across machines. Maven and Gradle historically did not need lockfiles because they strictly followed declared versions and scopes. However, Gradle introduced lockfile support with dependency locking in newer versions.

4. Dependency Updating Strategy

Java developers often manage dependency versions manually inside pom.xml or use dependencyManagement blocks for centralized control. pip requires updating requirements.txt or regenerating it via pip freeze. npm/yarn allow semver ranges (“^”, “~”), but automatic updates can introduce subtle breakages if they are not monitored carefully.

Best Practices Across All Languages

  • Pin exact versions wherever possible to avoid surprise updates.
  • Use lockfiles and commit them to version control (Git).
  • Separate production and development/test dependencies explicitly.
  • Use dependency scanners (e.g., OWASP Dependency-Check, Snyk, npm audit) regularly to detect vulnerabilities.
  • Prefer stable, maintained libraries with good community support and recent commits.

Conclusion

Dependency management, while often overlooked early in projects, becomes critical as applications scale. Maven and Gradle offer the most fine-grained control via dependency scopes and conflict resolution. The Python and JavaScript ecosystems are evolving rapidly, but still require developers to exercise more manual care. Understanding these differences, and applying best practices accordingly, will ensure smoother builds, faster delivery, and safer production systems.

Interested in deeper dives into dependency vulnerability scanning, SBOM generation, or automatic dependency update pipelines? Subscribe to our blog for more in-depth content!

[KotlinConf2023] Java and Kotlin: A Mutual Evolution

At KotlinConf 2023, John Pampuch, Google’s production languages lead, delivered a history lesson on Java and Kotlin’s intertwined journeys. Battling jet lag with humor, John traced nearly three decades of Java and twelve years of Kotlin, emphasizing their complementary strengths. From Java’s robust ecosystem to Kotlin’s pragmatic innovation, the languages have shaped each other, accelerating progress. John’s talk, rooted in his experience since Java’s 1996 debut, explored design goals, feature cross-pollination, and future implications, urging developers to leverage Kotlin’s developer-friendly features while appreciating Java’s stability.

Design Philosophies: Pragmatism Meets Robustness

John opened by contrasting the languages’ origins. Java, launched in 1995, aimed for simplicity, security, and portability, aligning tightly with the JVM and JDK. Its ecosystem, bolstered by libraries and tooling, set a standard for enterprise development. Kotlin, announced in 2011 by JetBrains, prioritized pragmatism: concise syntax, interoperability with Java, and multiplatform flexibility. Unlike Java’s JVM dependency, Kotlin targets iOS, web, and beyond, enabling faster feature rollouts. John noted Kotlin’s design avoids Java’s rigidity, embracing object-oriented principles with practical tweaks like semicolon-free lines. Yet Java’s self-consistency, seen in its holistic lambda integration, complements Kotlin’s adaptability, creating a synergy where both thrive.

Feature Evolution: From Lambdas to Coroutines

The talk highlighted key milestones. Java’s 2014 release of JDK 8 introduced lambdas, default methods, and type inference, transforming APIs to support functional programming. Kotlin, with 1.0 in 2016, brought smart casts, string templates, and named arguments, prioritizing developer ease. By 2018, Kotlin’s coroutines revolutionized JVM asynchronous programming, offering a simpler mental model than Java’s threads. John praised coroutines as a potential game-changer, though Java’s 2023 virtual threads and structured concurrency aim to close the gap. Kotlin’s multiplatform support, cemented by Google’s 2017 Android endorsement, outpaces Java’s JVM-centric approach, but Java’s predictable six-month release cycle since 2017 ensures steady progress. These advancements reflect a race where each language pushes the other forward.

Mutual Influences: Sealed Classes and Beyond

John emphasized cross-pollination. Java’s 2021 records, inspired by frameworks like Lombok, mirror Kotlin’s data classes, though Kotlin’s named parameters reduce boilerplate further. Sealed classes, introduced in Java 17 and Kotlin 1.5 around 2021, emerged concurrently, suggesting shared inspiration. Kotlin’s string templates, a staple since its early days, influenced Java’s 2024 preview of flexible string templates, which John hopes Kotlin might adopt for localization. Java’s exploration of nullability annotations, potentially aligning with Kotlin’s robust null safety, shows ongoing convergence. John speculated that community demand could push Java toward features like named arguments, though JVM changes remain a hurdle. This mutual learning, fueled by competition with languages like Go and Rust, drives excitement and innovation.

Looking Ahead: Pragmatism and Compatibility

John concluded with a call to action: embrace Kotlin’s compact, readable features while valuing Java’s compile-time speed and ecosystem. Kotlin’s faster feature delivery and multiplatform prowess contrast with Java’s backwards compatibility and predictability. Yet both share a commitment to pragmatic evolution, avoiding breaks in millions of applications. Questions from the audience probed Java’s nullability and virtual threads, with John optimistic about eventual alignment but cautious about timelines. His talk underscored that Java and Kotlin’s competition isn’t zero-sum—it’s a catalyst for better tools, ideas, and developer experiences, ensuring both languages remain vital.

Hashtags: #Java #Kotlin

Navigating the Reactive Frontier: Oleh Dokuka’s Reactive Streams at Devoxx France 2023

On April 13, 2023, Oleh Dokuka commanded the Devoxx France stage with a 44-minute odyssey titled “From imperative to Reactive: the Reactive Streams adventure!” Delivered at Paris’s Palais des Congrès, Oleh, a reactive programming luminary, guided developers through the paradigm shift from imperative to reactive programming. Building on his earlier R2DBC talk, he unveiled the power of Reactive Streams, a specification for non-blocking, asynchronous data processing. His narrative was a thrilling journey, blending technical depth with practical insights, inspiring developers to embrace reactive systems for scalable, resilient applications.

Oleh began with a relatable scenario: a Java application overwhelmed by high-throughput data, such as a real-time analytics dashboard. Traditional imperative code, with its synchronous loops and blocking calls, buckles under pressure, leading to latency spikes and resource exhaustion. “We’ve all seen threads waiting idly for I/O,” Oleh quipped, his humor resonating with the audience. Reactive Streams, he explained, offer a solution by processing data asynchronously, using backpressure to balance producer and consumer speeds. Oleh’s passion for reactive programming set the stage for a deep dive into its principles, tools, and real-world applications.

Embracing Reactive Streams

Oleh’s first theme was the core of Reactive Streams: a specification for asynchronous stream processing with non-blocking backpressure. He introduced its four interfaces—Publisher, Subscriber, Subscription, and Processor—and their role in building reactive pipelines. Oleh likely demonstrated a simple pipeline using Project Reactor, a Reactive Streams implementation:

Flux.range(1, 100)
    .map(i -> processData(i))
    .subscribeOn(Schedulers.boundedElastic())
    .subscribe(System.out::println);

In this demo, a Flux emits numbers, processes them asynchronously, and prints results, all while respecting backpressure. Oleh showed how the Subscription controls data flow, preventing the subscriber from being overwhelmed. He contrasted this with imperative code, where a loop might block on I/O, highlighting reactive’s efficiency for high-throughput tasks like log processing or event streaming. The audience, familiar with synchronous Java, leaned in, captivated by the prospect of responsive systems.
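
To make the backpressure mechanics concrete, here is a short sketch (ours, not from the talk) using Project Reactor’s BaseSubscriber, which lets the consumer pull elements in explicit batches:

import org.reactivestreams.Subscription;
import reactor.core.publisher.BaseSubscriber;
import reactor.core.publisher.Flux;

public class BackpressureDemo {
    public static void main(String[] args) {
        Flux.range(1, 100)
            .subscribe(new BaseSubscriber<Integer>() {
                @Override
                protected void hookOnSubscribe(Subscription subscription) {
                    request(10); // ask the publisher for the first 10 elements only
                }

                @Override
                protected void hookOnNext(Integer value) {
                    System.out.println("Received: " + value);
                    if (value % 10 == 0) {
                        request(10); // pull the next batch once this one has been processed
                    }
                }
            });
    }
}

Instead of the publisher pushing data as fast as it can, the subscriber’s request(n) calls dictate the pace.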

Building Reactive Applications

Oleh’s narrative shifted to practical application, his second theme. He explored integrating Reactive Streams with Spring WebFlux, a reactive web framework. In a demo, Oleh likely built a REST API handling thousands of concurrent requests, using Mono and Flux for non-blocking responses:

@GetMapping("/events")
Flux<Event> getEvents() {
    return eventService.findAll();
}

This API, running on Netty and leveraging virtual threads (echoing José Paumard’s talk), scaled effortlessly under load. Oleh emphasized backpressure strategies, such as onBackpressureBuffer(), to manage fast producers. He also addressed error handling, showing how onErrorResume() ensures resilience in reactive pipelines. For microservices or event-driven architectures, Oleh argued, Reactive Streams enable low-latency, resource-efficient systems, a must for cloud-native deployments.
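
As an illustration of those two strategies combined, the pipeline above could be hardened like this (eventService is the hypothetical service from the snippet, and the buffer size is arbitrary):

Flux<Event> events = eventService.findAll()
        .onBackpressureBuffer(256)             // buffer up to 256 elements when the consumer lags behind
        .onErrorResume(e -> Flux.empty());     // fall back to an empty stream instead of failing the pipeline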

Oleh shared real-world examples, noting how companies like Netflix use Reactor for streaming services. He recommended starting with small reactive components, such as a single endpoint, and monitoring performance with tools like Micrometer. His practical advice—test under load, tune buffer sizes—empowered developers to adopt reactive programming incrementally.

Reactive in the Ecosystem

Oleh’s final theme was Reactive Streams’ role in Java’s ecosystem. Libraries like Reactor, RxJava, and Akka Streams implement the specification, while frameworks like Spring Boot 3 integrate reactive data access via R2DBC (from his earlier talk). Oleh highlighted compatibility with databases like MongoDB and Kafka, ideal for reactive pipelines. He likely demonstrated a reactive Kafka consumer, processing messages with backpressure:

KafkaReceiver.create(receiverOptions)
    .receive()
    .flatMap(record -> processRecord(record))
    .subscribe();

This demo showcased seamless integration, reinforcing reactive’s versatility. Oleh urged developers to explore Reactor’s documentation and experiment with Spring WebFlux, starting with a prototype project. He cautioned about debugging challenges, suggesting tools like BlockHound to detect blocking calls. Looking ahead, Oleh envisioned reactive systems dominating data-intensive applications, from IoT to real-time analytics.
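
For the debugging concern, a typical setup (a minimal sketch, assuming the BlockHound dependency is on the classpath) installs the agent once at startup so that any blocking call on a non-blocking thread fails fast:

import reactor.blockhound.BlockHound;

public class BlockHoundSetup {
    public static void main(String[] args) {
        BlockHound.install(); // instruments the JVM to detect blocking calls on reactive threads
        // ... start the reactive application or test suite as usual
    }
}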

As the session closed, Oleh’s enthusiasm sparked hallway discussions about reactive programming’s potential. Developers left with a clear path: build a reactive endpoint, integrate with Reactor, and measure scalability. Oleh’s adventure through Reactive Streams was a testament to Java’s adaptability, inspiring a new era of responsive, cloud-ready applications.

A Decade of Devoxx FR and Java Evolution: A Detailed Retrospective and Forward-Looking Analysis

Introduction:

The Devoxx FR conference has served as a key barometer of the Java platform’s dynamic evolution over the past ten years. This period has been marked by numerous releases, including major advancements that have significantly reshaped how we architect, develop, and deploy Java applications. This presentation offers a detailed retrospective analysis of significant announcements and the substantial changes within Java, emphasizing the critical importance of embracing these enhancements to optimize our applications for performance, maintainability, and security. Beyond a surface-level examination of syntax and API modifications, this session provides a comprehensive rationale for migrating to newer Java versions, addressing the common concerns and challenges that often accompany such transitions with practical insights and actionable strategies.

1. A Detailed Look Back: Java’s Evolution Over the Past Decade

Jean-Michel “JM” Doudoux begins the session by establishing a parallel timeline of the ten-year history of the Devoxx FR conference and Java’s continuous development. He emphasizes the importance of understanding the reception and adoption rates of different Java versions to contextualize the current state of the Java ecosystem.

Java 8:

JM highlights Java 8 as a watershed release, noting its widespread adoption and the introduction of transformative features that fundamentally changed Java development. Key features include the following (a short combined example follows the list):

  • Lambda Expressions: Revolutionized functional programming in Java, enabling more concise and expressive code.
  • Stream API: Introduced a powerful and efficient way to process collections of data.
  • Method References: Simplified the syntax for referring to methods, further enhancing code readability.
  • New Date/Time API (java.time): Addressed the shortcomings of the old java.util.Date and java.util.Calendar APIs, providing a more robust and intuitive way to handle date and time.
  • Default Methods in Interfaces: Allowed adding new methods to interfaces without breaking backward compatibility.
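
As a brief illustration (ours, not from the talk), several of these features combine naturally in a few lines:

import java.time.LocalDate;
import java.util.Arrays;
import java.util.List;
import java.util.stream.Collectors;

public class Java8Sampler {
    public static void main(String[] args) {
        List<String> names = Arrays.asList("ada", "grace", "linus");

        // Lambda expression, Stream API, and a method reference working together
        List<String> upper = names.stream()
                .filter(n -> n.length() > 3)
                .map(String::toUpperCase)
                .collect(Collectors.toList());

        // The java.time API replaces java.util.Date/Calendar
        LocalDate nextWeek = LocalDate.now().plusWeeks(1);

        System.out.println(upper + " - " + nextWeek);
    }
}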

Java 11:

JM points out that Java 11 saw a slower adoption rate, despite being a Long-Term Support (LTS) release, a status that typically encourages enterprise adoption through extended support guarantees. Notable features include:

  • HTTP Client API: Introduced a new and improved HTTP client, supporting HTTP/2 and WebSocket (see the sketch below).
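
A minimal sketch of the Java 11 HTTP Client (the URL is illustrative):

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class HttpClientDemo {
    public static void main(String[] args) throws Exception {
        HttpClient client = HttpClient.newHttpClient();
        HttpRequest request = HttpRequest.newBuilder(URI.create("https://example.org"))
                .GET()
                .build();

        // Synchronous send; sendAsync(...) returns a CompletableFuture for non-blocking use
        HttpResponse<String> response = client.send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println(response.statusCode());
    }
}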

Java 17:

Characterized as a release that has garnered significant developer enthusiasm, building upon the foundation laid by previous versions and further refining the language.

Java 9:

Acknowledged as a disruptive release, primarily due to the introduction of the Java Platform Module System (JPMS), which brought modularity to Java. Doudoux discusses the profound impact of modularity on the Java ecosystem, affecting code organization, accessibility, and deployment.

Java 10, 12-16:

These releases are characterized as shorter-lived feature releases, with less widespread adoption compared to the LTS versions. However, they introduced valuable features such as the following (illustrated in the sketch after the list):

  • Local Variable Type Inference (var): Simplified variable declaration.
  • Enhanced Switch Expressions: Improved the switch statement, making it more expressive and usable as an expression.
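
A small sketch combining both features (the enum and labels are illustrative):

public class FeatureReleaseSampler {
    enum Day { MONDAY, TUESDAY, WEDNESDAY, THURSDAY, FRIDAY, SATURDAY, SUNDAY }

    public static void main(String[] args) {
        var day = Day.FRIDAY; // local variable type inference (Java 10)

        // switch as an expression with arrow labels (standardized in Java 14)
        var mood = switch (day) {
            case SATURDAY, SUNDAY -> "weekend";
            case FRIDAY -> "almost there";
            default -> "working";
        };

        System.out.println(mood);
    }
}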

2. Navigating Migration: Java 17 and Strategic Considerations

The presentation transitions to a practical discussion on the complexities of migrating to newer Java versions, with a strong emphasis on the benefits and challenges of migrating to Java 17. Doudoux addresses the common obstacles developers encounter when advocating for migration within their organizations, particularly the challenge of securing buy-in from operations teams and management.

Strategies for Persuasion:

The speaker offers valuable strategies to help developers build a compelling case for migration, focusing on:

  • Highlighting Performance Improvements: Emphasizing the performance gains offered by newer Java versions.
  • Improved Security: Stressing the importance of security updates and enhancements.
  • Increased Developer Productivity: Showcasing how new language features can streamline development workflows.
  • Long-Term Maintainability: Arguing that staying on older versions increases technical debt and maintenance costs in the long run.

Migration Considerations:

While a detailed, step-by-step migration guide is beyond the scope of the session, Doudoux outlines the essential high-level considerations and key steps involved in the migration process, such as:

  • Dependency Analysis: Assessing compatibility with updated libraries and frameworks.
  • Testing: Thoroughly testing the application after migration.
  • Gradual Rollouts: Considering phased deployments to minimize risk.

3. The Future of Java: Trends and Directions

The session concludes with a concise yet insightful look at the future trajectory of the Java platform. This segment provides a glimpse into upcoming features, emerging trends, and the ongoing evolution of Java, ensuring the audience is aware of the continuous innovation within the Java ecosystem.

Summary:

This presentation provides a detailed and comprehensive overview of Java’s journey over the past decade, carefully contextualized within the parallel evolution of the Devoxx FR conference. It goes beyond a simple recitation of features, offering in-depth analysis of the impact of key advancements, practical guidance on navigating the complexities of Java migration, and a valuable perspective on the future of the platform.