Posts Tagged ‘Kotlin’
Advanced Encoding in Java, Kotlin, Node.js, and Python
Encoding is essential for handling text, binary data, and secure transmission across applications. Understanding advanced encoding techniques can help prevent data corruption and ensure smooth interoperability across systems. This post explores key encoding challenges and how Java/Kotlin, Node.js, and Python tackle them.
1️⃣ Handling Special Unicode Characters (Emoji, Accents, RTL Text)
Java/Kotlin
Java uses UTF-16 internally, but for external data (JSON, databases, APIs), explicit encoding is required:
String text = "🔧 Café مرحبا";
byte[] utf8Bytes = text.getBytes(StandardCharsets.UTF_8);
String decoded = new String(utf8Bytes, StandardCharsets.UTF_8);
System.out.println(decoded); // 🔧 Café مرحبا
✅ Tip: Always specify StandardCharsets.UTF_8 to avoid platform-dependent defaults.
Node.js
const text = "🔧 Café مرحبا";
const utf8Buffer = Buffer.from(text, 'utf8');
const decoded = utf8Buffer.toString('utf8');
console.log(decoded); // 🔧 Café مرحبا
✅ Tip: Using an incorrect encoding (e.g., latin1) may corrupt characters.
Python
text = "🔧 Café مرحبا"
utf8_bytes = text.encode("utf-8")
decoded = utf8_bytes.decode("utf-8")
print(decoded) # 🔧 Café مرحبا
✅ Tip: Python 3 handles Unicode by default, but explicit encoding is always recommended.
2️⃣ Encoding Binary Data for Transmission (Base64, Hex, Binary Files)
Java/Kotlin
byte[] data = "Hello World".getBytes(StandardCharsets.UTF_8);
String base64Encoded = Base64.getEncoder().encodeToString(data);
byte[] decoded = Base64.getDecoder().decode(base64Encoded);
System.out.println(new String(decoded, StandardCharsets.UTF_8)); // Hello World
Node.js
const data = Buffer.from("Hello World", 'utf8');
const base64Encoded = data.toString('base64');
const decoded = Buffer.from(base64Encoded, 'base64').toString('utf8');
console.log(decoded); // Hello World
Python
import base64
data = "Hello World".encode("utf-8")
base64_encoded = base64.b64encode(data).decode("utf-8")
decoded = base64.b64decode(base64_encoded).decode("utf-8")
print(decoded) # Hello World
✅ Tip: Base64 encoding increases data size (~33% overhead), which can be a concern for large files.
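The ~33% figure follows directly from the encoding scheme: every 3 input bytes become 4 output characters. A quick Kotlin check (using the same java.util.Base64 API as the Java example above):

```kotlin
import java.util.Base64

fun main() {
    // 3,000 bytes is divisible by 3, so there is no padding to account for.
    val payload = ByteArray(3_000)
    val encoded = Base64.getEncoder().encodeToString(payload)
    println(payload.size)   // 3000
    println(encoded.length) // 4000 -> 4/3 of the input, ~33% overhead
}
```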
3️⃣ Charset Mismatches and Cross-Language Encoding Issues
A file encoded in ISO-8859-1 (Latin-1) may cause garbled text when read using UTF-8.
Java/Kotlin Solution:
byte[] bytes = Files.readAllBytes(Paths.get("file.txt"));
String text = new String(bytes, StandardCharsets.ISO_8859_1);
Node.js Solution:
const fs = require('fs');
const text = fs.readFileSync("file.txt", { encoding: "latin1" });
Python Solution:
with open("file.txt", "r", encoding="ISO-8859-1") as f:
    text = f.read()
✅ Tip: Always specify encoding explicitly when working with external files.
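To see why the mismatch matters, here is a minimal Kotlin round-trip showing what happens when Latin-1 bytes are decoded as UTF-8 (no file needed; the bytes stand in for the file's contents):

```kotlin
fun main() {
    val original = "Café"
    // Bytes as they would appear in an ISO-8859-1 encoded file (é = 0xE9).
    val latin1Bytes = original.toByteArray(Charsets.ISO_8859_1)

    // 0xE9 is not a valid UTF-8 sequence here, so decoding as UTF-8
    // garbles the accented character into the replacement character.
    val garbled = String(latin1Bytes, Charsets.UTF_8)

    // Decoding with the correct charset recovers the text intact.
    val recovered = String(latin1Bytes, Charsets.ISO_8859_1)

    println(garbled)   // Caf� (the é is lost)
    println(recovered) // Café
}
```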
4️⃣ URL Encoding and Decoding
Java/Kotlin
String encoded = URLEncoder.encode("Hello World!", StandardCharsets.UTF_8);
String decoded = URLDecoder.decode(encoded, StandardCharsets.UTF_8);
Node.js
const encoded = encodeURIComponent("Hello World!");
const decoded = decodeURIComponent(encoded);
Python
from urllib.parse import quote, unquote
encoded = quote("Hello World!")
decoded = unquote(encoded)
✅ Tip: Use UTF-8 for URL encoding to prevent inconsistencies across different platforms.
Conclusion: Choosing the Right Approach
- Java/Kotlin: Strong type safety, but requires careful Charset management.
- Node.js: Web-friendly, but depends heavily on Buffer conversions.
- Python: Simple and concise, though the bytes/str distinction must be handled explicitly.
📌 Pro Tip: Always be explicit about encoding when handling external data (APIs, files, databases) to avoid corruption.
CTO Perspective: Choosing a Tech Stack for Mainframe Rebuild
Original post
From LinkedIn: https://www.linkedin.com/posts/matthias-patzak_cto-technology-activity-7312449287647375360-ogNg?utm_source=share&utm_medium=member_desktop&rcm=ACoAAAAWqBcBNS5uEX9jPi1JPdGxlnWwMBjXwaw
Summary of the question
As CTO for a mainframe rebuild (core banking/insurance/retail app, 100 teams/1000 people with Cobol expertise), considering Java/Kotlin, TypeScript/Node.js, Go, and Python. Key decision criteria are technical maturity/stability, robust community, and innovation/adoption. The CTO finds these criteria sound and seeks a language recommendation.
TL;DR: my response
- Team, mainframe rebuild: Java/Kotlin are frontrunners due to maturity, ecosystem, and team’s Java-adjacent skills. Go has niche potential. TypeScript/Node.js and Python less ideal for core.
- Focus now: deep PoC comparing Java (Spring Boot) vs. Kotlin on our use cases. Evaluate developer productivity, readability, interoperability, performance.
- Develop comprehensive Java/Kotlin training for our 100 Cobol-experienced teams.
- Strategic adoption plan (Java, Kotlin, or hybrid) based on PoC and team input is next.
- This balances proven stability with modern practices on the JVM for our core.
My detailed opinion
As a CTO with experience in these large-scale transformations, my priority remains a solution that balances technical strength with the pragmatic realities of our team’s current expertise and long-term maintainability.
While Go offers compelling performance characteristics, the specific demands of our core business application – be it in banking, insurance, or retail – often prioritize a mature ecosystem, robust enterprise patterns, and a more gradual transition path for our significant team. Given our 100 teams deeply skilled in Cobol, the learning curve and the availability of readily transferable concepts become key considerations.
Therefore, while acknowledging Go’s strengths in certain cloud-native scenarios, I want to emphasize the strategic advantages of the Java/Kotlin ecosystem for our primary language choice, with a deliberate, deeper evaluation of these two JVM-based options before committing.
Re-emphasizing Java and Exploring Kotlin More Deeply:
- Java’s Enduring Strength: Java’s decades of proven stability in building mission-critical enterprise systems cannot be overstated. The JVM’s resilience, the vast array of mature libraries and frameworks (especially Spring Boot), and the well-established architectural patterns provide a solid and predictable foundation. Moreover, the sheer size of the Java developer community ensures a deep pool of talent and readily available support for our teams as they transition. For a core system in a regulated industry, this level of established maturity significantly mitigates risk.
- Kotlin’s Modern Edge and Interoperability: Kotlin presents a compelling evolution on the JVM. Its modern syntax, null safety features, and concise code can lead to increased developer productivity and reduced boilerplate – benefits I’ve witnessed firsthand in JVM-based projects. Crucially, Kotlin’s seamless interoperability with Java is a major strategic advantage. It allows us to:
- Gradually adopt Kotlin: Teams can start by integrating Kotlin into existing Java codebases, allowing for a phased learning process without a complete overhaul.
- Leverage the entire Java ecosystem: Kotlin developers can effortlessly use any Java library or framework, giving us access to the vast resources of the Java world.
- Attract modern talent: Kotlin’s growing popularity can help us attract developers who are excited about working with a modern, yet stable, language on a proven platform.
Why Hesitate Between Java and Kotlin?
The decision of whether to primarily adopt Java or Kotlin (or a strategic mix) requires careful consideration of our team’s specific needs and the long-term vision:
- Learning Curve: While Kotlin is designed to be approachable for Java developers, there is still a learning curve associated with its new syntax and features. We need to assess how quickly our large Cobol-experienced team can become proficient in Kotlin.
- Team Preference and Buy-in: Understanding our developers’ preferences and ensuring buy-in for the chosen language is crucial for successful adoption.
- Long-Term Ecosystem Evolution: While both Java and Kotlin have strong futures on the JVM, we need to consider the long-term trends and the level of investment in each language within the enterprise space.
- Specific Use Cases: Certain parts of our system might benefit more from Kotlin’s conciseness or specific features, while other more established components might initially remain in Java.
Proposed Next Steps (Revised Focus):
- Targeted Proof of Concept (PoC) – Deep Dive into Java and Kotlin: Instead of a broad PoC including Go, let’s focus our initial efforts on a detailed comparison of Java (using Spring Boot) and Kotlin on representative use cases from our core business application. This PoC should specifically evaluate:
- Developer Productivity: How quickly can teams with a Java-adjacent mindset (after initial training) develop and maintain code in both languages?
- Code Readability and Maintainability: How do the resulting codebases compare in terms of clarity and ease of understanding for a large team?
- Interoperability Scenarios: How seamlessly can Java and Kotlin code coexist and interact within the same project?
- Performance Benchmarking: While the JVM provides a solid base, are there noticeable performance differences for our specific workloads?
- Comprehensive Training and Upskilling Program: We need to develop a detailed training program that caters to our team’s Cobol background and provides clear pathways for learning both Java and Kotlin. This program should include hands-on exercises and mentorship opportunities.
- Strategic Adoption Plan: Based on the PoC results and team feedback, we’ll develop a strategic adoption plan that outlines whether we’ll primarily focus on Java, Kotlin, or a hybrid approach. This plan should consider the long-term maintainability and talent acquisition goals.
While Go remains a valuable technology for specific niches, for the core of our mainframe rebuild, our focus should now be on leveraging the mature and evolving Java/Kotlin ecosystem and strategically determining the optimal path for our large and experienced team. This approach minimizes risk while embracing modern development practices on a proven platform.
[KotlinConf’2023] Coroutines and Loom: A Deep Dive into Goals and Implementations
The advent of OpenJDK’s Project Loom and its virtual threads has sparked considerable discussion within the Java and Kotlin communities, particularly regarding its relationship with Kotlin Coroutines. Roman Elizarov, Project Lead for Kotlin at JetBrains, addressed this topic head-on at KotlinConf’23 in his talk, “Coroutines and Loom behind the scenes”. His goal was not just to answer whether Loom would make coroutines obsolete (the answer being a clear “no”), but to delve into the distinct design goals, implementations, and trade-offs of each, clarifying how they can coexist and even complement each other. Information about Project Loom can often be found via OpenJDK resources or articles like those on Baeldung.
Roman began by noting that Project Loom, introducing virtual threads to the JVM, was nearing stability, targeted for Java 21 (late 2023). He emphasized that understanding the goals behind each technology is crucial, as these goals heavily influence their design and optimal use cases.
Project Loom: Simplifying Server-Side Concurrency
Project Loom’s primary design goal, as Roman Elizarov explained, is to preserve the thread-per-request programming style prevalent in many existing Java server-side applications, while dramatically increasing scalability. Traditionally, assigning one platform thread per incoming request becomes a bottleneck due to the high cost of platform threads. Virtual threads aim to solve this by providing lightweight, JVM-managed threads that can run existing synchronous, blocking Java code with minimal or no changes. This allows legacy applications to scale much better without requiring a rewrite to asynchronous or reactive patterns.
Loom achieves this by “unmounting” a virtual thread from its carrier (platform) thread when it encounters a blocking operation (like I/O) that has been integrated with Loom. The carrier thread is then free to run other virtual threads. When the blocking operation completes, the virtual thread is “remounted” on a carrier thread to continue execution. This mechanism is largely transparent to the application code. However, Roman pointed out a potential pitfall: if blocking operations occur within synchronized blocks or native JNI calls that haven’t been adapted for Loom, the carrier thread can get “pinned,” preventing unmounting and potentially negating some of Loom’s benefits in those specific scenarios.
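The thread-per-request style Loom preserves can be sketched in a few lines of Kotlin calling the Java 21+ virtual thread API (this is an illustration of the mechanism, not code from the talk):

```kotlin
import java.util.concurrent.ConcurrentLinkedQueue

// Requires a JDK 21+ runtime: Thread.ofVirtual() is the Loom API.
fun main() {
    val results = ConcurrentLinkedQueue<Int>()
    // One virtual thread per "request": the blocking sleep unmounts the
    // virtual thread, so a handful of carrier threads serve all 10,000.
    val threads = (1..10_000).map { id ->
        Thread.ofVirtual().start {
            Thread.sleep(10) // blocking call; carrier thread is released
            results.add(id)
        }
    }
    threads.forEach { it.join() }
    println(results.size) // 10000
}
```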
Kotlin Coroutines: Fine-Grained, Structured Concurrency
In contrast, Kotlin Coroutines were designed with different primary goals:
- Enable fine-grained concurrency: Allowing developers to easily launch tens of thousands or even millions of concurrent tasks without performance issues, suitable for highly concurrent applications like UI event handling or complex data processing pipelines.
- Provide structured concurrency: Ensuring that the lifecycle of coroutines is managed within scopes, simplifying cancellation and preventing resource leaks. This is particularly critical for UI applications where tasks need to be cancelled when UI components are destroyed.
Kotlin Coroutines achieve this through suspendable functions (suspend fun) and a compiler-based transformation. When a coroutine suspends, it doesn’t block its underlying thread; instead, its state is saved, and the thread is released to do other work. This is fundamentally different from Loom’s approach, which aims to make blocking calls non-problematic for virtual threads. Coroutines explicitly distinguish between suspending and non-suspending code, a design choice that enables features like structured concurrency but requires a different programming model than traditional blocking code.
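The fine-grained, structured style described above can be sketched as follows (a minimal example, assuming the kotlinx.coroutines library is on the classpath):

```kotlin
import kotlinx.coroutines.*

// coroutineScope gives structured concurrency: the function only returns
// once every child coroutine has completed (or the scope is cancelled).
suspend fun countUpdates(): Int = coroutineScope {
    // Launching 100,000 coroutines is cheap: delay() suspends and
    // releases the underlying thread instead of blocking it.
    val jobs = (1..100_000).map {
        async {
            delay(1)
            1
        }
    }
    jobs.awaitAll().sum()
}

fun main() = runBlocking {
    println(countUpdates()) // 100000
}
```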
Comparing Trade-offs and Performance
Roman Elizarov presented a detailed comparison:
- Programming Model: Loom aims for compatibility with existing blocking code. Coroutines introduce a new model with suspend functions, which is more verbose for simple blocking calls but enables powerful features like structured concurrency and explicit cancellation. Forcing blocking calls into a coroutine world requires wrappers like withContext(Dispatchers.IO), while Loom handles blocking calls transparently on virtual threads.
- Cost of Operations:
- Launching: Launching a coroutine is significantly cheaper than starting even a virtual thread, as coroutines are lighter-weight objects.
- Yielding/Suspending: Suspending a coroutine is generally cheaper than a virtual thread yielding (unmounting/remounting), due to compiler optimizations in Kotlin for state machine management. Roman showed benchmarks indicating lower memory allocation and faster execution for coroutine suspension compared to virtual thread context switching in preview builds of Loom.
- Error Handling & Cancellation: Coroutines have built-in, robust support for structured cancellation. Loom’s virtual threads rely on Java’s traditional thread interruption mechanisms, which are less integrated into the programming model for cooperative cancellation.
- Debugging: Loom’s virtual threads offer a debugging experience very similar to traditional threads, with understandable stack traces. Coroutines, due to their state-machine nature, can sometimes have more complex stack traces, though IDE support has improved this.
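The programming-model difference in the comparison above comes down to who wraps the blocking call. A minimal sketch, assuming kotlinx.coroutines is available (readReport is a hypothetical blocking API used only for illustration):

```kotlin
import kotlinx.coroutines.*

// Hypothetical blocking API standing in for file or network I/O.
fun readReport(): String {
    Thread.sleep(100)
    return "report"
}

suspend fun fetchReport(): String =
    // In the coroutine world, blocking work must be shifted explicitly
    // to Dispatchers.IO so it does not stall the caller's thread.
    // On Loom's virtual threads, the same call needs no wrapper at all.
    withContext(Dispatchers.IO) { readReport() }

fun main() = runBlocking {
    println(fetchReport()) // report
}
```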
Coexistence and Future Synergies
Roman Elizarov concluded that Loom and coroutines are designed for different primary use cases and will coexist effectively.
- Loom excels for existing Java applications using the thread-per-request model that need to scale without major rewrites.
- Coroutines excel for applications requiring fine-grained, highly concurrent operations, structured concurrency, and explicit cancellation management, often seen in UI applications or complex backend services with many interacting components.
He also highlighted a potential future synergy: Kotlin Coroutines could leverage Loom’s virtual threads for their Dispatchers.IO (or a similar dispatcher) when running on newer JVMs. This could allow blocking calls within coroutines (those wrapped in withContext(Dispatchers.IO)) to benefit from Loom’s efficient handling of blocking operations, potentially eliminating the need for a large, separate thread pool for I/O-bound tasks in coroutines. This would combine the benefits of both: coroutines for structured, fine-grained concurrency and Loom for efficient handling of any unavoidable blocking calls.
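That synergy can already be approximated by hand today: kotlinx.coroutines can wrap any ExecutorService as a dispatcher, including one backed by virtual threads. A sketch, assuming JDK 21+ and kotlinx.coroutines (this is a manual illustration, not the official Dispatchers.IO integration Roman described):

```kotlin
import java.util.concurrent.Executors
import kotlinx.coroutines.*

fun main() {
    // A dispatcher whose every task runs on its own Loom virtual thread.
    Executors.newVirtualThreadPerTaskExecutor().asCoroutineDispatcher().use { loom ->
        runBlocking {
            val jobs = (1..1_000).map {
                launch(loom) {
                    Thread.sleep(5) // blocking call, cheap on a virtual thread
                }
            }
            jobs.joinAll()
            println("done")
        }
    }
}
```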
Hashtags: #Kotlin #Coroutines #ProjectLoom #Java #JVM #Concurrency #AsynchronousProgramming #RomanElizarov #JetBrains
[KotlinConf’23] The Future of Kotlin is Bright and Multiplatform
KotlinConf’23 kicked off with an energizing keynote, marking a highly anticipated return to an in-person format in Amsterdam. Hosted by Hadi Hariri from JetBrains, the session brought together key figures from both JetBrains and Google, including Roman Elizarov, Svetlana Isakova, Egor Tolstoy, and Grace Kloba (VP of Engineering for Android Developer Experience at Google), to share exciting updates and future directions for the Kotlin language and its ecosystem. The conference also boasted a global reach with KotlinConf Global events held across 41 countries. For those unable to attend, the key announcements from the keynote are also available in a comprehensive blog post on the official Kotlin blog.
The keynote began by celebrating Kotlin’s impressive growth, with compelling statistics underscoring its widespread adoption, particularly in Android development where it stands as the most popular language, utilized in over 95% of the top 1000 Android applications. A significant emphasis was placed on the forthcoming Kotlin 2.0, which is centered around the revolutionary new K2 compiler. This compiler promises significant performance improvements, enhanced stability, and a robust foundation for the language’s future evolution. The K2 compiler is nearing completion and is slated for release as Kotlin 2.0. Additionally, the IntelliJ IDEA plugin will also adopt the K2 frontend, ensuring alignment with IntelliJ releases and a consistent developer experience.
The Evolution of Kotlin: K2 Compiler and Language Features
The K2 compiler was a central theme of the keynote, signifying a major milestone for Kotlin. This re-architected compiler frontend, which also powers the IDE, is designed to be faster, more stable, and to enable quicker development of new language features and tooling capabilities. Kotlin 2.0, built upon the K2 compiler, is set to bring these profound benefits to all Kotlin developers, improving both compiler performance and IDE responsiveness.
Beyond the immediate horizon of Kotlin 2.0, the speakers provided a glimpse into potential future language features that are currently under consideration. These exciting prospects included:
Prospective Language Enhancements
- Static Extensions: This feature aims to allow static resolution of extension functions, which could potentially improve performance and code clarity.
- Collection Literals: The introduction of a more concise syntax for creating collections, such as using square brackets for lists, with efficient underlying implementations, is on the cards.
- Name-Based Destructuring: Offering a more flexible way to destructure objects based on property names rather than simply their positional order.
- Context Receivers: A powerful capability designed to provide contextual information to functions in a more implicit and structured manner. This feature, however, is being approached with careful consideration to ensure it aligns well with Kotlin’s core principles and doesn’t introduce undue complexity.
- Explicit Fields: This would provide developers with more direct control over the backing fields of properties, offering greater flexibility in certain scenarios.
The JetBrains team underscored a cautious and deliberate approach to language evolution, ensuring that any new features are meticulously designed and maintainable within the Kotlin ecosystem. Compiler plugins were also highlighted as a powerful mechanism for extending Kotlin’s capabilities without altering its core.
Kotlin in the Ecosystem: Google’s Investment and Multiplatform Growth
Grace Kloba from Google took the stage to reiterate Google’s strong and unwavering commitment to Kotlin. She shared insights into Google’s substantial investments in the Kotlin ecosystem, including the development of Kotlin Symbol Processing (KSP) and the continuous emphasis on Kotlin as the default choice for Android development. Google officially championed Kotlin for Android development as early as 2017, a pivotal moment for the language’s widespread adoption. Furthermore, the Kotlin DSL is now the default for Gradle build scripts within Android Studio, significantly enhancing the developer experience with features such as semantic syntax highlighting and advanced code completion. Google also actively contributes to the Kotlin Foundation and encourages community participation through initiatives like the Kotlin Foundation Grants Program, which specifically focuses on supporting multiplatform libraries and frameworks.
Kotlin Multiplatform (KMP) emerged as another major highlight of the keynote, emphasizing its increasing maturity and widespread adoption. The overarching vision for KMP is to empower developers to share code across a diverse range of platforms—Android, iOS, desktop, web, and server-side—while retaining the crucial ability to write platform-specific code when necessary for optimal integration and performance. The keynote celebrated the burgeoning number of multiplatform libraries and tools, including KMM Bridge, which are simplifying KMP development workflows. The future of KMP appears exceptionally promising, with ongoing efforts to further enhance the developer experience and expand its capabilities across even more platforms.
Compose Multiplatform and Emerging Technologies
The keynote also featured significant advancements in Compose Multiplatform, JetBrains’ declarative UI framework for building cross-platform user interfaces. A particularly impactful announcement was the alpha release of Compose Multiplatform for iOS. This groundbreaking development allows developers to write their UI code once in Kotlin and deploy it seamlessly across Android and iOS, and even to desktop and web targets. This opens up entirely new avenues for code sharing and promises accelerated development cycles for mobile applications, breaking down traditional platform barriers.
Finally, the JetBrains team touched upon Kotlin’s expansion into truly emerging technologies, such as WebAssembly (Wasm). JetBrains is actively developing a new compiler backend for Kotlin specifically targeting WebAssembly, coupled with its own garbage collection proposal. This ambitious effort aims to deliver high-performance Kotlin code directly within the browser environment. Experiments involving the execution of Compose applications within the browser using WebAssembly were also mentioned, hinting at a future where Kotlin could offer a unified development experience across an even broader spectrum of platforms. The keynote concluded with an enthusiastic invitation to the community to delve deeper into these subjects during the conference sessions and to continue contributing to Kotlin’s vibrant and ever-expanding ecosystem.
Links:
- Blog Post on KotlinConf’23 Keynote Highlights
- JetBrains Website
- JetBrains on LinkedIn
- Grace Kloba – KotlinConf’23 Speaker Profile
Hashtags: #Keynote #JetBrains #Google #K2Compiler #Kotlin2 #Multiplatform #ComposeMultiplatform #WebAssembly
Kotlin Native Concurrency Explained by Kevin Galligan
Navigating Kotlin/Native’s Concurrency Model
At KotlinConf 2019 in Copenhagen, Kevin Galligan, a partner at Touchlab with over 20 years of software development experience, delivered a 39-minute talk on Kotlin/Native’s concurrency model. Kevin Galligan explored the restrictive yet logical rules governing state and concurrency in Kotlin/Native, addressing their controversy among JVM and mobile developers. He explained the model’s mechanics, its rationale, and best practices for multiplatform development. This post covers four key themes: the core rules of Kotlin/Native concurrency, the role of workers, the impact of freezing state, and the introduction of multi-threaded coroutines.
Core Rules of Kotlin/Native Concurrency
Kevin Galligan began by outlining Kotlin/Native’s two fundamental concurrency rules: mutable state is confined to a single thread, and immutable state can be shared across multiple threads. These rules, known as thread confinement, mirror mobile development practices where UI updates are restricted to the main thread. In Kotlin/Native, the runtime enforces these constraints, preventing mutable state changes from background threads to avoid race conditions. Kevin emphasized that while these rules feel restrictive compared to the JVM’s shared-memory model, they align with modern platforms like Go and Rust, which also limit unrestricted shared state.
The rationale behind this model, as Kevin explained, is to reduce concurrency errors by design. Unlike the JVM, which trusts developers to manage synchronization, Kotlin/Native’s runtime verifies state access at runtime, crashing if rules are violated. This strictness, though initially frustrating, encourages intentional state management. Kevin noted that after a year of working with Kotlin/Native, he found the model simple and effective, provided developers embrace its constraints rather than fight them.
Workers as Concurrency Primitives
A central concept in Kevin’s talk was the Worker, a Kotlin/Native concurrency queue similar to Java’s ExecutorService or Android’s Handler and Looper. Workers manage a job queue processed by a private thread, ensuring thread confinement. Kevin illustrated how a Worker executes tasks via the execute function, which takes a producer function to verify state transfer between threads. The execute function supports safe and unsafe transfer modes, with Kevin strongly advising against the unsafe mode due to its bypassing of state checks.
Using a code example, Kevin demonstrated passing a data class to a Worker. The runtime freezes the data—making it immutable—to comply with concurrency rules, preventing illegal state transfers. He highlighted that while Worker is a core primitive, developers rarely use it directly, as higher-level abstractions like coroutines are preferred. However, understanding Worker is crucial for grasping Kotlin/Native’s concurrency mechanics, especially when debugging state-related errors like IllegalStateTransfer.
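The Worker flow Kevin described looks roughly like this. Note that this is a Kotlin/Native-only sketch (the kotlin.native.concurrent package and the legacy memory model); it will not compile for the JVM, and the Request class is a made-up example type:

```kotlin
import kotlin.native.concurrent.Worker
import kotlin.native.concurrent.TransferMode
import kotlin.native.concurrent.freeze

data class Request(val id: Int)

fun main() {
    val worker = Worker.start()
    // execute() takes a producer lambda whose result is checked before it
    // crosses the thread boundary; TransferMode.SAFE enforces the rules.
    // Freezing the Request makes it immutable and therefore shareable.
    val future = worker.execute(
        TransferMode.SAFE,
        { Request(42).freeze() }
    ) { req ->
        // Runs on the worker's private thread; req is frozen (immutable).
        req.id * 2
    }
    println(future.result) // blocks until the job completes, then prints 84
    worker.requestTermination().result
}
```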
Freezing State and Its Implications
Kevin Galligan delved into the concept of freezing, a runtime mechanism that designates objects as immutable for safe sharing across threads. Freezing is a one-way operation, recursively applying to an object and its references, with no unfreeze option. This ensures thread safety but introduces challenges, as frozen objects cannot be mutated, leading to InvalidMutabilityException errors if attempted.
In a practical example, Kevin showed how capturing mutable state in a background task can inadvertently freeze an entire object graph, causing runtime failures. He introduced tools like ensureNeverFrozen to debug unintended freezing and stressed intentional mutability—keeping mutable state local to one thread and transforming data into frozen copies for sharing. Kevin also discussed Atomic types, which allow limited mutation of frozen state, but cautioned against overusing them due to performance and memory issues. His experience at Touchlab revealed early missteps with global state and Atomics, leading to a shift toward confined state models.
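The freezing behavior and the ensureNeverFrozen guard can be sketched as follows. Again, this is Kotlin/Native-only (legacy memory model) and will not compile for the JVM; ViewModel is an illustrative stand-in for any mutable object you want to keep thread-confined:

```kotlin
import kotlin.native.concurrent.ensureNeverFrozen
import kotlin.native.concurrent.freeze
import kotlin.native.concurrent.isFrozen

class ViewModel {
    var counter = 0 // mutable: must stay confined to a single thread
}

fun main() {
    val model = ViewModel()
    // Fail fast (FreezingException) if this object is ever captured
    // into a graph that gets frozen, instead of failing later with
    // InvalidMutabilityException on some background thread.
    model.ensureNeverFrozen()

    // Data meant for sharing is frozen explicitly; freezing is recursive
    // and one-way.
    val shared = listOf(1, 2, 3).freeze()
    println(shared.isFrozen) // true

    model.counter++ // still fine: model was never frozen
}
```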
Multi-Threaded Coroutines and Future Directions
A significant update in Kevin’s talk was the introduction of multi-threaded coroutines, enabled by a draft pull request in 2019. Previously, Kotlin/Native coroutines were single-threaded, limiting concurrency and stunting library development. The new model allows coroutines to switch threads using dispatchers, with data passed between threads frozen to maintain strict mode. Kevin demonstrated replacing a custom background function with a coroutine-based approach, simplifying concurrency while adhering to state rules.
This development clarified the longevity of strict mode, countering speculation about a relaxed mode that would mimic JVM-style shared memory. Kevin noted that multi-threaded coroutines unblocked library development, citing projects like AtomicFu and SQLDelight. He also highlighted Touchlab’s Droidcon app, which adopted multi-threaded coroutines for production, showcasing their practical viability. Looking forward, Kevin anticipated increased community adoption and library growth in 2020, urging developers to explore the model despite its learning curve.
Conclusion
Kevin Galligan’s KotlinConf 2019 talk demystifies Kotlin/Native’s concurrency model, offering a clear path for developers navigating its strict rules. By embracing thread confinement, leveraging workers, managing frozen state, and adopting multi-threaded coroutines, developers can build robust multiplatform applications. This talk is a must for Kotlin/Native enthusiasts seeking to master concurrency in modern mobile development.
Links
- Watch the full talk on YouTube
- Touchlab
- American Express
- KotlinConf
- JetBrains
- Kotlin Website
- Kotlin/Native Repository
Hashtags: #KevinGalligan #KotlinNative #Concurrency #Touchlab #JetBrains #Multiplatform