It has been around for almost 30 years, and still shows no signs of retiring. Java was there when the web was taking its first steps, and has accompanied it throughout the decades. It has steadily changed, evolving based on the protean needs of internet users and developers, from the early applets to today’s blockchain and Web3. We can only imagine where it will be 30 years from now. 

In this four-part retrospective, written in collaboration with Danilo Ventura, Senior Software Engineer at ProActivity (our sister company, part of Fortitude Group), we attempt to trace the history of Java and the role it has played in the development of the web as we know it. The aim is to identify why Java succeeded over other languages and technologies, especially in the hectic, experimental early days of the internet, when the programming language and the web influenced each other and unlocked the potential of a technology that has changed the world forever.


It all started in the mid-1990s. It was the best of times, it was the worst of times. The World Wide Web was still in its infancy and not readily accessible to the general public. Only tech-savvy enthusiasts connected their computers to the internet to share content and talk with strangers on message boards.

The birth and development of the web had been made possible by the creation and development of a simple protocol called HTTP (Hypertext Transfer Protocol), first introduced by Tim Berners-Lee and his team in 1991 and revised as HTTP 1.0 five years later. Since then the protocol has continuously evolved to become more efficient and secure - 2022 saw the launch of HTTP/3 - but the underlying principles are still valid and constitute the foundation of today’s web applications.

HTTP works as a straightforward request–response protocol: the client submits a request to a server on the internet, which in turn provides a resource such as a document, a piece of content, or some information. This conceptual simplicity has ensured HTTP's resilience throughout the years. We can see a sort of Darwinian principle at play, by which only simple, useful, and evolvable technologies stand the test of time.
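To make the exchange concrete, here is a minimal sketch in Java (using the standard HttpClient available since Java 11) that plays the client's role: it sends a GET request and prints the status code and body returned by the server. The URL is only a placeholder.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class SimpleHttpGet {
    public static void main(String[] args) throws Exception {
        // The client plays the browser's role: it builds and sends a request...
        HttpClient client = HttpClient.newHttpClient();
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("https://example.com/"))   // placeholder URL
                .GET()
                .build();

        // ...and the server answers with a status code and a resource (here, an HTML page).
        HttpResponse<String> response = client.send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println("Status: " + response.statusCode());
        System.out.println(response.body());
    }
}
```

Every interaction on the web, from loading a page to calling an API, is a variation of this same request–response cycle.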

The documents exchanged between computers via the HTTP protocol are written in HTML, i.e. HyperText Markup Language. Since its introduction in 1991, HTML has been used to describe the structure and content of web pages. At first, these were crude text documents with some basic formatting, such as bold and italic. Later on, the evolution of HTML and the addition of accompanying technologies such as CSS enabled richer formatting and content options, such as images, tables, and animations.

In order to be accessed by human readers, HTML web pages need to be decoded by a web browser, namely the other great technology that enabled the birth and development of the internet. Browsers were created as simple programs capable of requesting resources via the HTTP protocol, receiving HTML documents and rendering them as a readable web page.

At the time, the web was primitive, with very few people accessing it for leisure or work. It is reported that in 1995 only 44 million people had internet access globally, half of them located in the United States (source: Our World in Data). Business applications were scarce, but some pioneers were experimenting with home banking and electronic commerce services. In 1995, Wells Fargo allowed customers to access their bank accounts from a computer, while Amazon and AuctionWeb - later known as eBay - took their first steps in the world of online shopping.

The main factors limiting the web’s democratization as a business tool were technological. Needs were changing, with users claiming an active role in their online presence. At the same time, website creators wanted easier ways to offer more speed, more flexibility, and the possibility of interacting with an online document or web page. In this regard, the introduction of Java was about to give a big boost to the evolution of the online landscape.

The first public implementation of Java was released by Sun Microsystems in January 1996. It was designed by frustrated programmers who were tired of fighting the complexity of the solutions available at the time. The aim was to create a simple, robust, object-oriented language that would not generate operating-system-specific code.

That was arguably its most revolutionary feature. Before Java, programmers wrote code in their preferred language and then used an OS-specific compiler to translate the source code into object code, thus creating an OS-specific program. To make the same program compatible with other systems, the code had to be adapted and recompiled with the appropriate compiler.

Java instead allowed programmers to “write once, run anywhere” - that was its motto. Developers could write code on any device and compile it to an intermediate representation, called bytecode, that could run on any operating system and platform equipped with a Java Virtual Machine. It was a game changer for developers, who no longer had to worry about the machine and OS running the program.
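As a minimal illustration of that workflow, the same source file is compiled once into bytecode (`javac`) and then executed unchanged by any JVM (`java`), whatever the underlying operating system:

```java
// HelloWeb.java - compiled once with `javac HelloWeb.java` into HelloWeb.class (bytecode),
// then run unchanged on any platform that has a JVM: `java HelloWeb`
public class HelloWeb {
    public static void main(String[] args) {
        // The JVM, not the operating system, interprets (or JIT-compiles) the bytecode at runtime.
        System.out.println("Hello from the JVM on " + System.getProperty("os.name"));
    }
}
```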

This flexibility guaranteed Java’s success as a tool for creating cross-platform desktop applications with user-friendly interfaces. Supported by the contemporaneous spread of the first mass-market OS for lay people (Windows 95), it helped codify the classic visual grammar of desktop programs, still relevant today. Java also became one of the preferred standards for programs running on household appliances, such as washing machines and TV sets.

The birth of applets can be seen as a milestone in the development and popularization of Java. These were small Java applications that could be launched directly from a webpage. A specific HTML tag indicated the server location of the bytecode, which was downloaded and executed on the fly in the browser window itself.
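For historical flavor, here is a hedged sketch of what a minimal applet looked like; the Applet API has long been deprecated and removed from current JDKs, and the class name and tag below are illustrative only:

```java
// HelloApplet.java - a page would embed it with something like:
// <applet code="HelloApplet.class" width="300" height="100"></applet>
import java.applet.Applet;
import java.awt.Graphics;

public class HelloApplet extends Applet {
    @Override
    public void paint(Graphics g) {
        // The browser's JVM downloaded the bytecode and called paint() to draw inside the page.
        g.drawString("Hello from a Java applet!", 20, 40);
    }
}
```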

Applets allowed a higher degree of interactivity than plain HTML, and were used, for instance, for games and data visualization. The first web browser supporting applets was Sun Microsystems’ own HotJava, released in 1997, with all major competitors following soon thereafter.

A Java Applet used for data visualization (source: Wikimedia)

Java Applets were pioneering attempts at transforming the web into an interactive space. Yet, they had security issues that contributed to their gradual demise in the early 2010s, when all major browsers started to terminate the support for the underlying technology. One of the last great applets was Minecraft, which was first introduced as a Java-based browser game in 2009. Java Applets were officially discontinued in 2017.

We can say that the goal of making HTML web pages more interactive has been fully achieved thanks to JavaScript, another great creation of the mid-nineties. Yet, despite the name, it has little to do with Java, apart from some similarities in syntax and libraries. It was actually introduced in 1995 by Netscape as LiveScript and then rebranded as JavaScript for marketing purposes. Rather than a compiled, general-purpose language like Java, it is a scripting language that runs in the browser and enhances web pages with interactive elements. JavaScript has since become dominant: in 2022 it was used by 98% of all websites (source: w3techs).

At the same time, another Java technology, RMI (Remote Method Invocation), and later RMI-IIOP (RMI over Internet Inter-ORB Protocol), enabled object-oriented distributed computing across Java Virtual Machines. In the early 2000s it was possible to develop web applications in which applets, thanks to RMI services, could retrieve data from a server - all running on JVMs.
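As a hedged sketch of that pattern, an RMI service was defined by a remote interface shared between client and server JVMs; the service name and host below are made up for illustration:

```java
import java.rmi.Naming;
import java.rmi.Remote;
import java.rmi.RemoteException;

// Shared contract between client and server JVMs; "QuoteService" is a made-up example.
interface QuoteService extends Remote {
    String latestQuote(String symbol) throws RemoteException;
}

class QuoteClient {
    public static void main(String[] args) throws Exception {
        // The client (for instance an applet) obtains a stub from the RMI registry and calls
        // the method as if it were local; the call actually executes on the server JVM.
        QuoteService service = (QuoteService) Naming.lookup("rmi://server.example.com/QuoteService");
        System.out.println(service.latestQuote("SUNW"));
    }
}
```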

The next step in the evolutionary path was Java servlets, which paved the way for the typical Web 1.0 application. Servlets allowed the creation of server-side apps that spoke the HTTP protocol: the browser could request a resource from the server, which in turn returned it as an HTML page. It was finally possible to write server-side programs that interacted with web browsers, a real game changer for the time. As servlets’ popularity increased, that of applets started to wane, for it was easier to adopt pure HTML as the user interface and build web pages on the server side.
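A minimal servlet of that era looked roughly like the sketch below, using the classic javax.servlet API: the container maps an HTTP request to the class, and doGet() builds the HTML page that is sent back to the browser.

```java
import java.io.IOException;
import java.io.PrintWriter;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

// Mapped to a URL by the servlet container (e.g. in web.xml); every matching
// HTTP GET is dispatched to doGet(), which builds the HTML page server-side.
public class HelloServlet extends HttpServlet {
    @Override
    protected void doGet(HttpServletRequest request, HttpServletResponse response)
            throws IOException {
        response.setContentType("text/html");
        PrintWriter out = response.getWriter();
        out.println("<html><body><h1>Hello from a servlet, "
                + request.getParameter("name") + "!</h1></body></html>");
    }
}
```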

You can also read the second, third and fourth parts. Follow us on LinkedIn and stay tuned for more technology articles on our blog!

Thanks to Danilo Ventura for the valuable contribution to this article series.


Today, mainframes are still widely used in data-centric industries such as Banking, Finance and Insurance. 92 of the world’s top 100 banks rely on these legacy technologies, and it is believed that they handle 90% of all global credit card transactions (source: Skillsoft).

This is suboptimal, since relying on mainframes generates high operational costs, typically billed by MIPS (millions of instructions per second). A large institution running a 15,200 MIPS mainframe can spend more than $16 million per year - roughly $1,000 per MIPS per year (source: Amazon Web Services).

In addition, mainframes come with technical complexities, such as their reliance on the 60-year-old COBOL programming language. For organizations, this means not only reduced data accessibility and infrastructure scalability, but also the difficulty of finding skilled COBOL programmers at a reasonable cost - more info here.

Moreover, consumers are now used to sophisticated on-demand digital services - we could call it the “Netflix effect”, by which everything must be available immediately and everywhere. Banking services such as trading, home banking, and financial reporting thus need to keep pace and offer reliability and high performance. To do that, large volumes of data must be quickly accessed and processed by web and mobile applications: mainframes may not be the answer.

Mainframe Offloading to the rescue

Mainframe Offloading can solve the conundrum. It entails replicating the mainframe data to a parallel database, possibly open source, that can be accessed in a more agile way, saving expensive MIPS. As a sort of “Digital Twin” of the mainframe, the replicated data store can be used for data analysis, applications, cloud services and more.

This form of database replication provides significant advantages in both flexibility and cost reduction. Whenever an application or a service needs to read customer data, it can access the parallel database without paying for expensive mainframe MIPS. Moreover, offloading alone paves the way for a progressive migration to the cloud, e.g. through bidirectional replication of information between the open source cloud database and the data center.

Offloading data from the mainframe requires middleware tools for migration and integration. Apache Kafka can be leveraged as a reliable solution for event streaming and data storage, thanks to its distributed and replicated log capabilities. It can integrate different data sources into a scalable architecture with loosely coupled components. 
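As a hedged sketch of that integration, a small Java producer can publish offloaded records to a Kafka topic from which downstream databases and services consume; the broker address, topic name, and account payload below are placeholders:

```java
import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class OffloadProducer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "kafka-broker:9092");          // placeholder broker
        props.put("key.serializer", StringSerializer.class.getName());
        props.put("value.serializer", StringSerializer.class.getName());

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            // Each change captured from the mainframe becomes an event on a topic,
            // keyed by account id so downstream consumers preserve per-account ordering.
            producer.send(new ProducerRecord<>("mainframe.accounts", "ACC-42",
                    "{\"accountId\":\"ACC-42\",\"balance\":1250.00}"));
        }
    }
}
```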

Alongside the event streaming platform, CDC (Change Data Capture) tools are needed to push data modifications from the mainframe into the streaming platform. CDC is a software process that automatically identifies and tracks updates in a database, overcoming the limitations of batch data processing in favour of near-real-time transfer. While IBM and Oracle offer proprietary CDC tools, such as InfoSphere Data Replication and Oracle GoldenGate, third-party and open-source solutions are also available, like Qlik Data Integration (formerly known as Attunity) and Debezium.
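On the consuming side, here is a sketch of how an application might read the change events that a CDC tool publishes to Kafka and apply them to the offloaded store. The topic name is assumed, and a real CDC envelope (for example Debezium's, with before/after images) would be parsed with a JSON library rather than printed:

```java
import java.time.Duration;
import java.util.List;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.StringDeserializer;

public class CdcConsumer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "kafka-broker:9092");   // placeholder broker
        props.put("group.id", "offload-sync");
        props.put("key.deserializer", StringDeserializer.class.getName());
        props.put("value.deserializer", StringDeserializer.class.getName());

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(List.of("mainframe.accounts"));  // assumed CDC topic
            while (true) {
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(1));
                for (ConsumerRecord<String, String> record : records) {
                    // In a real pipeline this would upsert the change into the parallel database.
                    System.out.printf("key=%s change=%s%n", record.key(), record.value());
                }
            }
        }
    }
}
```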

From Offloading to Replacement

Rather than an end in itself, Mainframe Offloading can also be seen as the starting point for mainframe replacement proper, with both applications and mission-critical core banking systems running in the cloud. The expensive monolithic architecture would thus give way to modern, future-proof, cloud-native solutions.

Yet, replacing a mainframe is neither an easy nor a quick task. In his blog article “Mainframe Integration, Offloading and Replacement with Apache Kafka”, Kai Waehner hypothesizes a gradual 5-year plan. First, Kafka is used to decouple the mainframe from existing applications. Then, new cloud-based applications and microservices are built and integrated into the infrastructure. Finally, some or even all mainframe applications and mission-critical functionalities are replaced with modern technology.

It must be said that it is often not possible to switch off mainframes altogether. For larger institutions, such as major banks, the costs and inconveniences of a full migration may be just too high. Realistically speaking, the most effective scenario would be a hybrid infrastructure in which certain core banking functionalities remain tied to the mainframe, and others are migrated to a multi-cloud infrastructure.

How Bitrock can help

Given the complexity of the operation, it is fundamental to work with a specialized partner with thorough expertise in offloading and legacy migration. At Bitrock we have worked alongside major organizations to help them modernize their infrastructure, save costs and support their cloud-native transition. By way of example, we carried out a mainframe offloading project for a major consumer credit company, transferring data from a legacy DB2 to a newer Elastic database. Thanks to the Confluent platform and a CDC system, data is now intercepted and pushed in real time from the core system to the front-end database, enabling advanced use cases.

If you want to know more about this success story or how we can help you with your journey from legacy to cloud, please do not hesitate to contact us!


The Covid-19 pandemic has changed healthcare forever. Among other things, it demonstrated the importance of data in the health sector. The global response to the pandemic showed that offering high-quality patient care depends on accessing and sharing large amounts of sensitive information in a secure manner - let us think of the logistic complexities of carrying out clinical trials, mass testing, and vaccinations under strict time constraints.

Today, accessing reliable data allows medical practitioners and institutions to pursue a patient-centered approach, i.e. to ensure adequate, personalized services at any time. We are witnessing a paradigm shift that impacts both patient care quality and financial sustainability. Regardless of each nation's welfare system, residual or institutional, individualized healthcare is indeed one of the most effective ways to reduce costs and redirect money where it matters most - R&D, hiring, technology, facilities.

Blockchain technology can play a critical role in supporting the data-driven evolution of healthcare. Thanks to its immutability and decentralization, the distributed ledger can ensure secure information exchange between healthcare providers and stakeholders. Especially in highly fragmented healthcare systems, it may offer interoperability and disintermediation of trust in the collection and management of data. This in turn enables greater agency for patients, who are empowered to access personal information in a simple, transparent manner.

Blockchain and its applications - Smart Contracts, NFTs - can thus have a disruptive impact in some critical areas of contemporary healthcare, which the paper "Applications of Blockchain Within Healthcare" in the peer-reviewed journal Blockchain in Healthcare Today identifies as:

  • Drug Tracking - necessary to prevent diversion, counterfeit and overprescriptions throughout the supply chain
  • Healthcare Data Interchange - the integration of health data among different stakeholders such as hospitals, insurances, national health systems 
  • Nationwide Interoperability - i.e. ensuring access to health record across different incompatible service providers
  • Medical Device Tracking - to increase the efficiency of inventories and save money spent in repurchasing unnecessary devices 

Considering the centrality of these areas of intervention, it is easy to see why this is a booming business. The global market for blockchain applications in healthcare was valued at $1.5B in 2020, and it is estimated to reach $7.3B by 2028 (source: Verified Market Research).

Let us take a look at 4 exciting use cases for blockchain-powered patient care.

Pharmaceutical Supply Chain Management

Counterfeit drugs are a significant problem, especially in developing countries. While figures are somewhat difficult to come by, a 2019 OECD/EUIPO report estimated that the global market value of counterfeit pharmaceuticals amounted to $4.4B in 2016 - about 0.84% of global imports of medicines.

In this regard, the implementation of blockchain-enabled services to track pharmaceuticals can offer transparency and security through the entire chain of custody. The immutability of the distributed ledger guarantees the authenticity of medical products from manufacturers to the pharmacist and patient.

In addition to increased traceability, blockchain-powered supply chain solutions can also improve efficiency and reduce costs thanks to AI/ML. Advanced event-stream analysis can detect anomalies in real time and ensure the timely delivery of pharmaceuticals.

Electronic Prescriptions

One of the most crucial stages of the pharmaceutical chain of custody is the prescription to the patient. Here, errors and communication mishaps can have devastating effects on a treatment plan. Let us also consider that in the US, 62% of prescriptions for controlled drugs are handwritten (source: Statista), which increases the risk of mistakes and prevents any automated safety feature. 

Blockchain-enabled electronic prescription systems can support healthcare providers in delivering a tailored service that takes into account patients' specific needs and clinical history. By integrating data from health records into a shared, secure database, blockchain can help prescribers check for allergies, interactions, and overprescriptions - also to avoid drug abuse or diversion.

The paper "Use of Blockchain Technology for Electronic Prescriptions" (2021), published in Blockchain in Healthcare Today, recounts an e-prescription pilot programme carried out in Tennessee clinics between 2021 and 2022. Switching to a blockchain-based electronic system automatized instantaneous patient safety checks (interaction, allergies), which resulted in practitioners changing the prescription 28% of the time. It also allowed them to save significant time - a mean of 1 min 48 sec per written prescriptions.

Electronic Health Record

No effective real-time patient safety check can be carried out without a reliable Electronic Health Record (EHR) system. Allowing patients and practitioners to securely access health records is fundamental both for transparency and for clinical reasons. According to a Johns Hopkins study, medical errors, often resulting from uncoordinated or conflicting care, are currently the third leading cause of death in the US.

Yet, that is one of the very countries in which the theoretical effectiveness of EHR systems is hampered by the fragmentation and lack of interoperability of service providers. It is estimated that there are currently at least 500 vendors of EHR products - other sources claim more than a thousand! - with the average hospital running 16 platforms simultaneously.

Blockchain technology can thus be used to connect different data sources and create a secure, decentralized ledger of patient records. In the words of a 2021 study carried out by the US Department of Health and Human Services, "Blockchain-based medical record systems can be linked into existing medical record software and act as an overarching, single view of a patient’s record".

Device Tracking & IoMT

The transparency and security of blockchain can benefit the management of medical devices throughout the chain of custody, from the manufacturer to the hospital and patient. Thorough tracking of medical assets helps identify and verify them against counterfeits and unapproved products, as with drugs. It also offers significant financial advantages, saving hospitals money otherwise spent repurchasing devices.

Blockchain-based solutions for tracking medical devices can integrate with widespread RFID (radio-frequency identification) technology. These cost-effective tags, both active and passive, are currently employed to digitalize the inventory of medical items and improve the effectiveness of resource management. RFID-generated data can thus be transferred to the immutable ledger, storing the history, lifecycle, and salient features of tracked devices in a secure and compliant way.

Blockchain can also provide data integrity for the messages exchanged via sensors and devices employed by patients at home. Remote monitoring is becoming more and more important for telehealth, also thanks to the increased availability of high-speed wireless connectivity (5G) - and this raises concerns about cybersecurity. The blockchain technology can limit unauthorized data access and manipulation, guaranteeing at the same time high transparency and agency for patients. 


These are just a few use cases for blockchain in the healthcare sector. Other potential applications leverage smart contracts for service payments or trustless communication with insurance companies, to name a couple. More use cases will certainly appear in the near future, as the technology continues to develop and spread. What is certain is that blockchain has the potential to transform healthcare for good and help improve all stages of the patient journey.

Bitrock has global expertise and proven experience in developing blockchain-based solutions and applications. If you want to know more about our consultancy services, and learn how we can jumpstart your blockchain project, book a call with us and we'll be happy to talk!

Author: Daniele Croci, Digital Marketing Specialist @ Bitrock


It all started with a horse. In 2006, Bethesda popularized the notion of microtransactions in gaming with the launch of the (in)famous horse armor DLC - i.e. downloadable content - for The Elder Scrolls 4: Oblivion. For the somewhat affordable cost of $2.50, PC and Xbox 360 players could unlock a cosmetic add-on for an in-game asset, namely the horse used by the player's diegetic surrogate. It did not provide any significant gameplay advantage, just a shiny metal armor to brag about to oneself in a single-player game.

It was not the first time players had had the chance to buy in-game items for real-world money. Microtransactions had been around since the democratization of online F2P (free-to-play) gaming, featuring for instance in Nexon's MapleStory (2003) and in proto-metaverse experiences such as Habbo (2001) and Second Life (2003). The last two, in particular, pioneered the offering of purchasable cosmetic items for players who wanted to differentiate themselves in crowded online multiplayer spaces.

And let us not forget Expansion Packs, full-fledged additional experiences that could be bought and added to videogames for more plot, quests, items and hours of entertainment, and that first came on physical media and only later via digital download. Some notable examples include Warcraft 2: Beyond the Dark Portal, a 1996 expansion to the wildly popular Warcraft 2: Tides of Darkness (1995), and The Sims: Livin' Large (2000), released in the same year as the original life simulation game. 

Even though we should not underestimate Expansion Packs' role in transitioning the gaming industry from a Game-as-a-Product to a Game-as-a-Service (GaaS) business model, today they have somewhat waned in favor of parceled-out microtransactions and DLCs. These forms of continuous content have now become dominant, with add-ons - both functional and cosmetic - coming at a lower, more affordable price for players and providing a consistent revenue stream for publishers. Even Bethesda's scandalous horse armor proved successful at the end of the day.

The financial advantages of the GaaS model are even more evident with F2P games, where the ongoing sale of digital goods constitutes the publisher's sole source of revenue. These addictive free-to-play titles have often turned into viral phenomena that generate far more money than many conventional $70 AAA products - we are talking about games like Fortnite, which generated $9 billion in revenue across 2018 and 2019, League of Legends, with $1.75 billion in 2020, or newcomer Genshin Impact, which is estimated to have totalled $3.5 billion in its first year. Seeing these figures, it is easy to understand how the global gaming industry generated a whopping $54 billion in 2020 from in-game purchases alone - and the numbers are only projected to increase (source: Statista).


NFTs to overcome the limitations of DLCs

However, microtransactions and in-game purchases as we know them have a major limitation. When a horse armor is bought in a game, it stays in the game. It is not really an asset owned by the player, but rather a service that is accessed only in the context of the title that originated it. If we are talking about online-only multiplayer games, as many F2P titles are, the purchase practically ceases to exist when the game servers are shut down. Furthermore, digital assets bought or earned via gameplay cannot normally be exchanged on secondary markets for real-world money - while some under-the-table reselling of items does exist in some MMORPGs, like World of Warcraft, it is a risky practice that tends to violate End-User License Agreements and leads to inglorious bans.

This is where blockchain and NFTs come into play. Non-Fungible Tokens allow players to acquire true ownership of the assets they have bought or earned in game, opening the door to collecting, exchanging and reselling. In a word: stronger player engagement, fuelled by a Copernican revolution in the flow of value. Companies are no longer the sole beneficiaries of the gaming economy, as players are empowered to (re)claim the value of their money or time investments.

All this is possible thanks to tokenization, enabled by the blockchain technology. The term refers to the process of converting an asset, digital or physical, into a virtual token that exists and circulates via the blockchain. In this sense, tokens are representations of assets (money, real estate, art - you name it) that store information in a transparent, efficient, and secure way via the blockchain's immutable ledger. This allows all users to trace not only the token’s provenance, but also the history of transactions carried out by the users. 

NFTs are a special kind of token characterized by being - well - non-fungible, meaning that each has an individuality of its own and cannot be interchanged with another. A Bitcoin is the same as every other Bitcoin, just like a dollar is the same as every other dollar. An NFT, by contrast, has unique and permanent metadata that identifies it unequivocally. As a sort of authenticity certificate, this record details the item’s nature and ownership history. Another feature that differentiates NFTs from Bitcoin is indivisibility: it is possible to own a fraction of a Bitcoin, while it is not possible to own a quarter of a tokenized work of art or gaming item.
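As a purely illustrative sketch - not any real token standard such as ERC-721 - the difference can be expressed in code: fungible balances are just interchangeable numbers, while each non-fungible token is a distinct record with its own identity and metadata that can only be transferred whole:

```java
import java.util.HashMap;
import java.util.Map;

// Illustrative only: a toy in-memory registry, not an on-chain implementation.
record GameItemToken(String tokenId, String name, String metadataUri, String owner) {}

class ToyNftRegistry {
    private final Map<String, GameItemToken> tokens = new HashMap<>();

    // Minting creates a token with a unique id; two tokens are never interchangeable.
    void mint(GameItemToken token) {
        if (tokens.containsKey(token.tokenId())) {
            throw new IllegalStateException("Token id already exists: " + token.tokenId());
        }
        tokens.put(token.tokenId(), token);
    }

    // Transfers move the whole token: it cannot be split into fractions like a currency.
    void transfer(String tokenId, String newOwner) {
        GameItemToken t = tokens.get(tokenId);
        if (t == null) throw new IllegalArgumentException("Unknown token: " + tokenId);
        tokens.put(tokenId, new GameItemToken(t.tokenId(), t.name(), t.metadataUri(), newOwner));
    }
}
```

A fungible balance, by contrast, would simply be a number in a ledger, where any unit is as good as any other.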

All these features suggest why tokenized game assets can offer significant benefits for players. Unlocking real ownership of unique items earned and bought redefines a user's relationship with the game, creating a greater sense of engagement that can extend beyond the boundaries of the game itself. Indeed, the interoperable nature of NFTs means that gaming items can also be transferred to and reused in other connected games, provided that the game engines and frameworks support such functionality. In addition, blockchain-enabled games offer the chance to monetize item ownership in a legitimate way via reselling. We are witnessing the rise of the play-to-earn model, where gaming leads to the acquisition of NFTs that can later be sold for legitimate income.

And the benefits are not limited to players. Secondary trading of gaming NFTs may also generate immediate revenue for gaming companies via royalties inscribed within the tokens themselves. This is one of the most exciting features of NFTs in general, with huge applications for the art world. In a nutshell, it is technically possible to mint a token in a way that automatically guarantees the payment of royalties to the original creator whenever the token is traded between third parties. The system still needs perfecting - there are currently some limitations due to interoperability between different platforms (more info here) - but it is nonetheless a great way to potentially ensure a fair distribution of profits between owners and creators.


NFT Games & dApps to Know

To understand the impact of NFTs in the gaming industry, we need to consider at least two different applications: on the one hand, play-to-earn games that are structured upon the blockchain technology and NFTs, and which often are Decentralized Apps, or dApps; on the other hand, conventional games that variously adopt and integrate NFTs as part of the videoludic experience, without depending on them. 

The most popular dApp game is arguably Axie Infinity (2018), developed by Vietnamese studio Sky Mavis. It is a Pokémon-inspired online RPG where players can breed and fight their NFT creatures, called Axies. Available for mobile and PC, the game was initially based on Ethereum. Sky Mavis later launched its own sidechain, Ronin, optimized for NFT gaming thanks to lower gas fees and transaction times (more info here). Axie Infinity can be said to fully leverage the possibilities of the blockchain by also integrating fungible tokens called AXS and SLP, which serve as in-game currency and can be traded like any other cryptocurrency on the market.

Despite the steep entry price - three competitive starting Axies can cost the player $300 or more - Axie Infinity has quickly become a huge phenomenon. It is played by 2.8M users each month, with 10M total players estimated in December 2021. Even more staggering is the overall value of Axie NFT transactions carried out on Ethereum and Ronin, which by March 2022 had reached $4.17 billion!

Due to the features of NFTs, collecting and trading are central in many - if not all - dApp games. CryptoKitties (2017) is another example of a game that focuses on breeding and exchanging NFT pets - there is no other discernible gameplay feature. It is often mentioned as the gateway to blockchain gaming for many players. Immutable's Gods Unchained (2021) is a trading card game à la Magic: The Gathering that offers real ownership of the NFT virtual cards. It leverages Immutable X, the company's own Layer 2 scaling solution for NFTs that reduces gas fees. Gods Unchained also features a homonymous cryptocurrency that enables players to buy card packs and vote on governance proposals that influence the game's development. It's a growing phenomenon: the company reported 80k weekly players in January 2022, with $25 million in Gods Unchained assets traded on Immutable X.

Compared to the huge success of dApps, the relationship of traditional gaming companies and their players with NFTs has been less straightforward. Quartz, Ubisoft’s proprietary platform for NFTs - or Digits, as they rebranded them - has been welcomed with mixed feelings since its launch in December 2021. The platform, now in beta, allows players to buy or earn some cosmetic items for the PC version of Tom Clancy’s Ghost Recon Breakpoint, which in turn can be resold on third-party marketplaces. Quartz is based on Tezos, a proof-of-stake blockchain that, according to the publisher, “needs significantly less energy to operate. As an example, one transaction on Tezos consumes as much energy as 30 seconds of video streaming while a transaction on Bitcoin consumes the equivalent of one year of video streaming”.

Quartz’s lukewarm reception can be attributed to several factors. First of all, the project was inaugurated with only one game, the PC version of a poorly received 2019 shooter - 58 on Metacritic, with overwhelmingly negative user reviews. Secondly, the acquisition of NFTs was limited to Ghost Recon Breakpoint players who had reached a certain level in the game, effectively leaving out collectors and enthusiasts. Third, the cosmetic items were published in sets that all look the same and are merely differentiated by a serial number. Despite all this, all Breakpoint NFTs appear to have sold out as of late March 2022.

Another gaming company that has struggled with NFT implementation is GSC Game World, the Kiev-based developer behind the renowned S.T.A.L.K.E.R. series. On 15 December 2021, they announced that the upcoming S.T.A.L.K.E.R. 2: Heart of Chernobyl would include tokens to be purchased on DMarket, with the most prized one allowing its owner to actually become an NPC in the game. The announcement garnered negative feedback from the community, prompting GSC to backpedal the very next day: “we’ve made a decision to cancel anything NFT-related in S.T.A.L.K.E.R. 2”.

Konami had greater success with the launch of an NFT collection dedicated to the Castlevania series. The tokens were not items to be used in games, but 14 unique pictures and videos commemorating the 35-year-old franchise. Players seem to have appreciated the nostalgic set, which was auctioned off for ~$160k in total. This achievement prompted Konami to plan more NFTs for the future, as mentioned in its Outlook for the Fiscal Year Ending March 31, 2022.

Phenomena like Axie Infinity and Gods Unchained, and Ubisoft's, GSC's and Konami's varied experiences with the blockchain, demonstrate one thing: the gaming world is interested in NFTs when they enhance the experience and provide players with unique, valuable prizes that reflect their passion and dedication. Gaming is a sophisticated medium, and gamers are sophisticated audiences. We have come a long way since the horse armor, and slapping a serial number on mass-produced virtual items may not be enough. Today, integrating NFTs within a videoludic experience must be underpinned by a well-designed strategy that takes into account the specific features of the medium.

Within this strategy, technological choices take on a primary importance. The variety of standards - paired with the lack of well-established business models - may hinder gaming companies’ efforts at creating scalable, flexible and environmentally sustainable blockchain solutions. This is why working with a reliable partner is increasingly important.


As a high-end tech consulting company, Bitrock has global expertise in supporting gaming and non-gaming companies with NFT, Smart Contract, cryptocurrency and blockchain-enabled projects. Thanks to our integrated offering, we can accompany you from project definition to the deployment of the last line of code and the optimization of the user interface.

To know more about how we can help you, contact us now!

Author: Daniele Croci, Digital Marketing Specialist @ Bitrock


These last couple of years have taught an important lesson to all Data & Analytics specialists: agility is the key. Being able to pivot between different design patterns and approaches is increasingly important to thrive through supply chain volatility, accelerated digitalization, and disruption of business operations.

To turn these challenges into opportunities, and stay ahead of the competition, companies must revise antiquated models based on centralized, static data. The centrifugal shift towards distributed architectures and multi-cloud infrastructures, which emerged a few years ago, has now found its cultural equivalent in new, decentralized approaches to Data & Analytics. At the same time, the possibility of analyzing data in motion in a dynamic manner makes it possible to integrate actionable insights into decision-making and business operations.

Let’s take a look at some of the most interesting Data & Analytics trends that have emerged or consolidated recently, and how they can create value for organizations in the near future.

Small & Wide Data

We have come to realize that Big Data is not always the answer. Accumulating information can lead to data sourcing and quality issues, and may require deep learning techniques whose cost and complexity outweigh the results. We have also seen how quickly data can become irrelevant – companies run the risk of hoarding stale, useless information that cannot provide significant value.

Small & Wide Data have emerged as innovative approaches to enable the generation of valuable insights via less voluminous, more varied data. The former approach eschews data-hungry models in favor of tailored analytical techniques relying on limited amounts of data. The latter leverages the integration of heterogeneous sources, both structured and unstructured, instead of a larger single one.

Small & Wide Data can enable access to advanced analytics and AI for smaller players, which cannot rely on enough information for conventional Big Data techniques. But bigger companies can also benefit from these approaches: as Gartner suggests, 70% of organizations will shift their focus from Big to Small and Wide Data by 2025.

Data Mesh

The current shifts towards decentralization and microservices can be said to underpin the very notion of Data Mesh. First introduced by Zhamak Dehghani in her 2019 article “How to Move Beyond a Monolithic Data Lake to a Distributed Data Mesh”, it purports to overcome the limitations of gargantuan Data Lakes and their reliance on hyper-specialized teams and financially-untenable ETL pipelines.

By contrast, Data Mesh can be seen as an organizational and architectural model that allows for distributed, domain-driven data ownership. This ubiquitous data platform empowers cross-functional teams to operate independently, while offering greater flexibility and interaction between distributed datasets.

It is worth noting that the distributed Data Mesh architecture stems from a paradigm shift, rather than a Copernican revolution. It does not ditch altogether data lake advantages and principles – centralization is in fact retained for governance and open standards – but evolves them to increase business agility and reduce time-to-market.

Image by Zhamak Dehghani via martinfowler.com

Continuous Intelligence

Continuous Intelligence leverages Event Stream Processing and Real-time Analytics to integrate actionable insights into decision-making and business processes. This design pattern turns analytics into a prescriptive practice: ingest large volumes of data in motion – from sources like IoT, eCommerce transactions, traffic, weather – and leverage them to augment or even automate human decisions.

CI enables companies to analyze data on the fly, identify trends and root causes, and make real-time decisions that allow strategic differentiation in competitive, saturated markets. It is a transformative model that provides a plethora of opportunities – from detecting fraud in finance and improving customer experience in retail to implementing predictive maintenance in manufacturing and more. CI can also be employed to connect different branches and departments of a company, to share and leverage data in real time, optimize decision-making and thus increase productivity.
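A hedged sketch of the underlying idea with Kafka Streams in Java: read a stream of events in motion and flag anomalies on the fly. Topic names, the JSON layout, and the 10,000 threshold are invented for illustration:

```java
import java.util.Properties;
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.KStream;

public class AnomalyFlagger {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "continuous-intelligence-demo");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "kafka-broker:9092"); // placeholder
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

        StreamsBuilder builder = new StreamsBuilder();
        KStream<String, String> transactions = builder.stream("transactions");

        // Flag suspiciously large amounts in real time and route them to a dedicated topic,
        // where a dashboard or an automated action can pick them up immediately.
        transactions
                .filter((key, value) -> parseAmount(value) > 10_000.0)
                .to("suspicious-transactions");

        new KafkaStreams(builder.build(), props).start();
    }

    private static double parseAmount(String json) {
        // Naive extraction for the sake of the example; a real app would use a JSON library.
        java.util.regex.Matcher m = java.util.regex.Pattern
                .compile("\"amount\"\\s*:\\s*([0-9.]+)").matcher(json);
        return m.find() ? Double.parseDouble(m.group(1)) : 0.0;
    }
}
```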

Thanks to its partnership with Radicalbit, Bitrock can integrate its D&A consulting services with Helicon, a cutting-edge Continuous Intelligence platform. This code-free SaaS solution enhances the management of data streaming pipelines with Machine Learning, dramatically accelerating the development of real time advanced analytics (descriptive, diagnostic, predictive and prescriptive). The platform offers efficient features such as a stream processing pipelines visual editor, with debugging capabilities, data exploration, and real-time ML monitoring, enabling the adoption of the Continuous Intelligence paradigm.

Analytics at the Edge

Speaking of IoT, a recent D&A trend concerns the decentralization of the very location in which the collection and analysis of data takes place. Edge Analytics means distributing information, analytics and technology closer to – or possibly within – the physical assets, i.e. the edge. In other words, it entails the possibility of avoiding in part or altogether the transfer to data centers and cloud environments, increasing the flexibility of the whole data infrastructure.  

It is a growing trend – Gartner foresees that, by 2023, more than 50% of the “primary responsibility of data and analytics leaders will comprise data created, managed and analyzed in edge environments”. The reasons are multiple: for instance, provisioning analytics at the edge can have a positive impact on the speed at which data is processed, with actionable insights generated in real time. Stability is another case in point: avoiding data transfer means less disruption from connectivity issues. Finally, we have to consider compliance – leaving data “where it is” reduces the headaches deriving from different national regulations and governance policies.

For these reasons, Analytics at the Edge can bring significant benefits to a wide array of applications. Automotive risk mitigation is, for instance, a business case in which analyzing data in real time is fundamental to avoid collisions or breakdowns. Healthcare, on the other hand, can simplify the management of personal, sensitive data if this is not moved to cloud services or data centers located under different jurisdictions.

Data Democracies

The notion of Data Democracy concerns the creation of an ethical and methodological framework that removes the technological barriers to informed data management. It revolves around the principle that people, regardless of their technical know-how, should be able to access and trust available information during their daily operations.

The democratization of data impacts any kind of business organization, and bears upon both personnel and technology. Lowering the barrier to data means first of all offering upskilling programs aimed at data literacy development, whatever the function or seniority within the company. It also means rethinking data silos in favor of more flexible and transparent architectural models, such as Data Meshes (see above). Finally, it entails implementing efficient analytics and Business Intelligence tools on a company-wide level. One example is Sense, by our partner Qlik, that enables advanced, ML-powered analytics while helping develop data literacy.

As a real cultural shift, a Data Democracy can offer significant benefits to a company’s internal operations. It empowers non-technical employees to make fast, informed decisions without the support of IT or data experts – think of how this can help Product, Marketing, and Sales teams generate more value and save resources. Moreover, developing a corporate data culture may have a positive impact on an organization’s relationship with its stakeholders and the public at large. Data ethics informs data governance policies that promote privacy, cybersecurity, and responsible management of customer data.


These are only some of the opportunities offered by the latest trends in Data & Analytics. If you want to know how Bitrock can help your company evolve its data strategy, and stay ahead of competition with cutting-edge solutions, send us a message – we’ll be happy to book a call!

Author: Daniele Croci, Digital Marketing Specialist @ Bitrock

Introduction to HC Boundary

Secure your access in a dynamic world

The current IT landscape is characterized by multiple challenges, and quite a few of them are related to the increasing dynamicity of the environments IT professionals work in. One of these challenges is securing access to private services. The dynamic nature of this access manifests itself on multiple levels:

  • Services: they tend to be deployed in multiple instances per environment
  • Environments: the hosts where workloads are deployed can change in a way that is transparent to the final user
  • Users: people change roles, and people come and go from a team
  • Credentials: the more often they are changed, the more secure they are

Tools developed when this kind of dynamism was not foreseeable are starting to show their limitations. For example, accessing a service often means providing network access to a subnet, where careful network and firewall policies need to be set up. The resulting access is granted to a user independently of their current role.

Zero trust in a dynamic environment

A Zero Trust approach is highly desirable in every environment. Being able to assume zero trust and grant access to resources granularly, with role-based rules and without the need to configure delicate resources like networks and firewalls, is paramount in a modern IT architecture.

This is even more true in a dynamic environment, where the rate of change can put security teams and their toolchains under pressure as they try to keep access configurations up to date.

Boundary to the rescue

In the following diagram we can see how HashiCorp’s Boundary is designed to fulfill the requirements of granting secure access in a zero trust environment. Access to a remote resource is granted by defining policies on high-level constructs that encapsulate the dynamic nature of the access.

The main components are:

  • Controller (control plane): the admin user interacts with the controller to configure access to resources, while regular users interact with it to request authentication and authorization.
  • Worker (data plane): the connection between the local agent and the remote host passes through this gateway, which permits it based on what the controller allows.
  • Local Agent: interacts with the controller and the worker to establish the connection.

Boundary concepts

Identity is a core concept in Boundary. Identity is represented by two types of resources, mapping to common security principals:

  • Users, which represent distinct entities that can be tied to authentication accounts
  • Groups, which are collections of Users that allow easier access management

Roles map users and groups to a set of grants, which provide the ability to perform actions within the system.

Boundary's permissions model is based on RBAC and each grant string is a mapping that describes a resource or set of resources and the permissions that should be granted to them.

A scope is a permission boundary modeled as a container. There are three types of scopes in Boundary: 

  • a single global scope, which is the outermost container
  • organizations, which are contained by the global scope
  • projects, which are contained by organizations

Each scope is itself a resource.

Boundary administrators define host catalogs that contain information about hosts. These hosts are then collected into host sets which represent sets of equivalent hosts. Finally, targets tie together host sets with connection information.

Boundary interfaces

Boundary offers multiple interfaces to interact with the tool:

  • a CLI that we DevOps engineers love
  • a user friendly Desktop application
  • a Web UI for the server

Integration is key

So how can this be kept up to date with the current dynamic environments?

The answer lies in the integrations available to add flexibility to the tool. When it comes to user authentication, the integration with an identity provider through the standard OIDC protocol can be leveraged. When it comes to credentials, the integration with HashiCorp Vault surely (pun intended) covers the need for correctly managed secrets and their lifecycle (Vault credential brokering). Finally, when we talk about the list of hosts and services, we can leverage the so-called Dynamic Host Catalogs: a catalog can be kept up to date in push mode through the integration with HashiCorp Terraform, or in pull mode by interacting with HashiCorp Consul.

Want to get your feet wet?

This tool seems to provide a lot of value, so why not integrate it into your environment? We are already planning to add it to our open source Caravan tool.

There’s a high chance for you to get your feet wet playing with Boundary and other cool technologies - don’t be shy, join us on (the) Caravan!


Discover more on Zero Trust in our upcoming Webinar in collaboration with HashiCorp

When: Thursday, 31st March 2022
Where: Virtual Event
More details available soon - Follow us on our Social Media channels to find out more!


This is the fourth and last entry in our article series about Caravan, Bitrock’s Cloud-Native Platform based on the HashiCorp stack. Read the first, second and third part on our blog.

The communication layer between application components running on top of Caravan leverages HashiCorp Consul to expose advanced functionalities. Service discovery, health checks, and service mesh are the key features that Consul enables in Caravan.

Service Discovery & Health Checks

Consul makes it easy to register services in its registry and offers a painless discovery process thanks to different inspection methods, such as API, CLI, or DNS SRV queries.

The service registry would not be complete without the health checking capabilities. It is possible to set up different kinds of health checks, to inspect whether a service is healthy and thus can be shown as available in the registry. When a health check fails, the registry no longer returns the failed instance in the client queries. In this way the consumer services stop making requests to the faulty instance.
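As a hedged example of what registration with a health check looks like, an application (or its deployment tooling) can call the local Consul agent's HTTP API; the service name, port, and check endpoint below are placeholders:

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class RegisterService {
    public static void main(String[] args) throws Exception {
        // Service definition with an HTTP health check polled every 10 seconds.
        // If the check fails, Consul stops returning this instance to clients.
        String definition = """
                {
                  "Name": "orders-api",
                  "ID": "orders-api-1",
                  "Port": 8080,
                  "Check": {
                    "HTTP": "http://localhost:8080/health",
                    "Interval": "10s"
                  }
                }
                """;

        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("http://localhost:8500/v1/agent/service/register"))
                .header("Content-Type", "application/json")
                .PUT(HttpRequest.BodyPublishers.ofString(definition))
                .build();

        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println("Consul replied with status " + response.statusCode());
    }
}
```

In Caravan this registration is handled automatically by the platform components rather than by hand, but the underlying mechanism is the same.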


Consul Connect with Envoy

Consul Connect provides authorization and encryption of the communication between services using mutual TLS. Applications are unaware of Consul Connect thanks to sidecar proxies deployed next to them, which together compose a service mesh. These proxies "see" all service-to-service traffic and can collect data about it.

Consul Connect uses Envoy proxies and can be configured to collect layer 7 metrics and export them to tools such as Prometheus. Connect uses the registered service identity (rather than IP addresses) to enforce access control with intentions. Intentions declare the source and the destination flow where the connection is allowed - by default all connections are denied following the Zero Trust principles.

Within the Service Mesh, incoming and outgoing communication traffic is handled with a dedicated component called Gateway. The Gateway is secure by default, it encrypts all the traffic and requires explicit intentions to allow the requests to pass through. 

Service Mesh in Nomad

Nomad integrates thoroughly with Consul, allowing Consul configurations to be specified inside the Nomad job description. This way, operators can define in a single place all the configuration needed to run a Nomad task and register it in Consul, making it available to other components running in the platform. In detail, the Nomad agent automatically registers the service in Consul, sets up its health check, and requests dynamic short-lived TLS certificates for safe in-mesh communication enabled by the Envoy sidecar proxy, whose lifecycle is managed directly by Nomad without any manual intervention.


Want to know more about Caravan? Visit the dedicated website, check our GitHub repository and explore our documentation.

Authors: Matteo Gazzetta, DevOps Engineer @ Bitrock - Simone Ripamonti, DevOps Engineer @ Bitrock


"There is no alternative," went an old political slogan that led to the creation of the longest-lasting government of the twentieth century. Today, the same mantra applies to the manifold innovation and digitalization needs of the Italian productive fabric, which is (re)entering the global market at what is hopefully the end of the pandemic crisis, already burdened by decades of declining productivity and competitiveness. For those who want to thrive in the new scenario, technological change is a binding imperative.

In this sense, the Piano Nazionale di Ripresa e Resilienza (PNRR, Italy's National Recovery and Resilience Plan) represents a significant opportunity. Drafted in response to the severe economic and social crisis triggered by Covid-19, it allocates 191.5 billion euros to a series of measures aimed at relaunching the fragile Italian economy and stimulating employment. The areas of intervention range from the development of sustainable mobility to the ecological transition and the inclusion of social groups further marginalized by precarious employment.

Digital Transition 4.0 for the Italian System

The first mission of the PNRR focuses on "Digitalization, Innovation, Competitiveness, Culture and Tourism", highlighting the key concepts that serve as a leitmotiv for the entire Recovery Plan. It earmarks 40.32 billion euros for a digital transition program that concerns both the public and the private sector.

The goal is to support the development and competitiveness of a country system that currently ranks 25th (out of 28) in the Digital Economy and Society Index (DESI). As the PNRR itself notes (p. 83), this backwardness goes hand in hand with the decline in productivity that has characterized the Italian economy over the last twenty years, in contrast with a positive trend in the rest of Europe. This contraction is often linked to the limited digital innovation of small and medium-sized enterprises, which account for 92% of companies and employ 82% of workers in Italy (Il Sole 24 Ore).

The mission is divided into three components:

  1. Digitalization, Innovation and Security in the Public Administration (€9.75 billion)
  2. Digitalization, Innovation and Competitiveness of the Production System (€23.89 billion)
  3. Tourism and Culture (€6.68 billion)

Let's look in detail at the second component, which receives one of the largest investments of the PNRR.

Digitalization, Innovation and Competitiveness of the Production System: how it works

In the words of the document, the program for the private sector aims to strengthen "the tax incentive policy already in place (designed to close the 'digital intensity' gap between our production system and the rest of Europe – an investment shortfall estimated at two percentage points of GDP – especially in manufacturing and SMEs), which has had positive effects both on the digitalization of companies and on employment, especially among young people and in new professions" (p. 98).

It includes a series of investments and reforms aimed at strengthening the digitalization, technological innovation and internationalization of the productive and entrepreneurial fabric, with specific attention to the SMEs that suffer most from today's climate of volatility.

Within the PNRR, the "Transizione 4.0" investment plan is an evolution of the well-known Industria 4.0 program of 2017, broadening the range of companies that can potentially benefit from it. Among other things, it provides a tax credit for companies that decide to invest in:

  1. Capital goods, both tangible and intangible
  2. Research, development and innovation
  3. Training activities for digitalization and the development of related skills

The first item concerns investments in tools "directly connected to the digital transformation of production processes" – the so-called Beni 4.0 already listed in annexes A and B of Law 232 of 2016 – and "intangible assets of a different nature, but instrumental to the company's activity" (p. 99).

While the first annex details a series of hardware components, including machinery, tools and monitoring systems, the second focuses on high-tech software solutions that can support companies on a path of scalable and sustainable growth.

Le applicazioni possibili

Integrati all’interno di una visione strategica, le soluzioni hardware e software menzionate nel PNRR possono trovare applicazione in una serie di ambiti, tra cui:

  • The transition to the Cloud Native paradigm, an approach that leverages Cloud Computing technologies to design and implement applications based on the principles of flexibility, adaptability, efficiency and resilience. Thanks to methodological and technological tools such as DevOps, containers and microservices, Cloud Native makes it possible to reduce time to market and support the agile evolution of the entire corporate ecosystem.
  • The exploitation of the company’s information assets through the implementation of real-time Data Analysis, IIoT (Industrial Internet of Things) and Data Streaming systems which, combined with Machine Learning and Artificial Intelligence, can be used for predictive maintenance, with clear benefits in terms of cost optimization. This area also includes Digital Twins, virtual copies of industrial assets or processes that make it possible to test new solutions in vitro and prevent malfunctions.
  • Cybersecurity, which is increasingly central in a context of growing digitalization of processes and services, and of growing interdependence between national and foreign, public and private actors within the digital value chain.

These digital maturity journeys can be relevant both for large organizations and for the SMEs that struggle most to keep pace with technological evolution and international competition. The effort pays off: according to the Osservatorio Innovazione Digitale PMI of Politecnico di Milano, digitalized small and medium-sized companies report on average a 28% increase in net profit and an 18% higher profit margin (La Repubblica).

So why don’t companies digitalize? The problem often lies in the lack of qualified personnel. The shortage of skilled staff affects 42% of Italian SMEs (La Repubblica), and the figure rises to 70% if we consider the European production fabric as a whole (European Commission). Another possible blocking factor is the reluctance to abandon or evolve legacy systems that are already well established within business processes.

These are just some of the reasons why it is essential to work with a qualified partner that can guide the company in planning the technological and digital investments made possible by the PNRR (and beyond).

Bitrock has the certified skills and international experience to offer tailor-made solutions that innovate the technological and digital ecosystem while preserving the client’s legacy investments. Our specialized know-how in DevOps, Software Engineering, UX&Front-End and Data&Analytics is the key to tackling the digital evolution journey, centered on the values of simplification and automation that generate lasting value.

To find out more about how we can support your company, contact us now!


This is the third entry in our article series about Caravan, Bitrock’s Cloud-Native Platform based on the HashiCorp stack. Check out the first and second parts.

Caravan relies heavily on the features offered by HashiCorp Vault. Vault underpins Caravan’s high degree of dynamism and automation; we may even say that Caravan would not have been the same without Vault, given its deep integration with all the components in use.

In this article, we show some of the Vault features that Caravan relies on.

PKI Secrets Engine

The PKI secrets engine generates dynamic X.509 certificates. It is possible to import an existing certificate authority or let Vault generate a new one, in which case Vault fully manages its lifecycle. This engine replaces the manual process of generating private keys and CSRs, submitting them to the CA, and waiting for the verification and signing process to complete. By using short TTLs, certificates rarely need to be revoked, so CRLs stay small and the whole system scales easily to large workloads.

In Caravan, we use Vault’s PKI to sign both Consul Connect mTLS certificates and server-side certificates (e.g. for Consul and Nomad) used for TLS communication.
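
As a rough illustration of how a certificate can be requested from this engine, the sketch below uses the official Vault Go client; the mount path (pki), role name (server) and common name are placeholder values, not Caravan’s actual configuration.

```go
package main

import (
	"fmt"
	"log"

	vault "github.com/hashicorp/vault/api"
)

func main() {
	// Build a client from the default configuration
	// (VAULT_ADDR and VAULT_TOKEN environment variables).
	client, err := vault.NewClient(vault.DefaultConfig())
	if err != nil {
		log.Fatal(err)
	}

	// Ask the PKI secrets engine to issue a short-lived certificate.
	// "pki" (mount path) and "server" (role name) are placeholders.
	secret, err := client.Logical().Write("pki/issue/server", map[string]interface{}{
		"common_name": "nomad.service.consul",
		"ttl":         "72h",
	})
	if err != nil {
		log.Fatal(err)
	}

	// The response contains the certificate, the private key and the issuing CA.
	fmt.Println(secret.Data["certificate"])
	fmt.Println(secret.Data["private_key"])
}
```

Thanks to the short TTL, a certificate like this can simply be re-issued when it expires instead of being revoked.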

Consul & Nomad Dynamic Secrets

Dynamic Secrets are a key feature of Vault. Their defining characteristic is that they do not exist until they are read, so there is no risk of someone stealing them or of another client using the same secrets. Vault also has built-in revocation mechanisms, so dynamic secrets are periodically revoked and regenerated, minimizing risk exposure.

Vault integrates dynamic secrets with different components:

  • Cloud providers (e.g. AWS, Azure, GCP, …)
  • Databases (e.g. PostgreSQL, Elasticsearch, MongoDB, …)
  • Consul
  • Nomad
  • and many more…

In Caravan, we use the dynamic secrets engine to generate access tokens for both Consul and Nomad agents. First, we define the required Consul and Nomad roles with the relevant permissions, and then we map them to Vault roles. This way, authenticated Vault entities can request Consul and Nomad tokens with the permissions defined in the associated role. For example, we set up a Nomad Server role and a Nomad Client role with different authorization scopes.
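
As a minimal sketch (the mount paths and role names below are hypothetical and depend on how the secrets engines are mounted), requesting such tokens through the official Vault Go client looks roughly like this:

```go
package main

import (
	"fmt"
	"log"

	vault "github.com/hashicorp/vault/api"
)

func main() {
	client, err := vault.NewClient(vault.DefaultConfig())
	if err != nil {
		log.Fatal(err)
	}

	// Reading the credentials path materializes a brand-new Nomad token;
	// it does not exist before this call. "nomad" and "nomad-client" are placeholders.
	nomadCreds, err := client.Logical().Read("nomad/creds/nomad-client")
	if err != nil {
		log.Fatal(err)
	}
	fmt.Println("Nomad token:", nomadCreds.Data["secret_id"])

	// The same pattern yields a short-lived Consul token.
	consulCreds, err := client.Logical().Read("consul/creds/consul-agent")
	if err != nil {
		log.Fatal(err)
	}
	fmt.Println("Consul token:", consulCreds.Data["token"])
}
```

Vault tracks the lease attached to each of these tokens and revokes them automatically once the lease expires.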


Cloud Auth Backends

Distributing access credentials to Vault clients can be a difficult and sensitive task, especially in dynamic environments with ephemeral instances. Luckily, Vault addresses this problem and greatly simplifies it in cloud scenarios: it implements several auth methods that rely on the cloud provider to authenticate Vault entities.

For example, when running on AWS instances, it is possible to authenticate entities according to their associated AWS IAM role. Vault leverages the AWS APIs to validate the identity of clients, using the primitives offered by the cloud provider. This way, a Vault client running on an AWS instance does not need any pre-distributed Vault credentials: AWS itself validates the identity of the client. The same logic also applies to other cloud providers such as Azure, GCP, and many more.

In Caravan, we rely on cloud auth backends to authenticate both the server-side and client-side components of the platform. This way, we no longer need to distribute credentials to newly spun-up instances, which would be a difficult and tedious task.
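
The sketch below shows, under assumed names (the Vault role nomad-client and the default aws mount path), how a client could log in through the AWS IAM auth method using the helper package shipped with the official Go client; it is an illustration of the mechanism, not Caravan’s actual bootstrap code.

```go
package main

import (
	"context"
	"log"

	vault "github.com/hashicorp/vault/api"
	awsauth "github.com/hashicorp/vault/api/auth/aws"
)

func main() {
	client, err := vault.NewClient(vault.DefaultConfig())
	if err != nil {
		log.Fatal(err)
	}

	// Authenticate with the instance's IAM identity instead of a
	// pre-distributed token. "nomad-client" is a placeholder Vault role.
	awsAuth, err := awsauth.NewAWSAuth(
		awsauth.WithRole("nomad-client"),
		awsauth.WithIAMAuth(),
	)
	if err != nil {
		log.Fatal(err)
	}

	secret, err := client.Auth().Login(context.Background(), awsAuth)
	if err != nil {
		log.Fatal(err)
	}

	// The client now holds a Vault token obtained purely from its AWS identity.
	log.Printf("authenticated, token accessor: %s", secret.Auth.Accessor)
}
```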

Vault Agent

Vault Agent is a client daemon that provides useful functionality to clients that need to integrate and communicate with Vault without changing the application code. In particular, it offers:

  • Easy authentication to Vault in a wide variety of environments (Auto-Auth)
  • Client-side caching of responses containing newly created tokens and of leased secrets generated with those tokens
  • Rendering of user-supplied templates, using the token obtained via Auto-Auth

In particular, Caravan relies on Vault Agent templates to render configuration files for a variety of components. For example, the configuration file of Nomad agents is a template rendered by Vault Agent, since it contains dynamic secrets like the Consul token and the TLS certificates used for communication with the server components.
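
To give an idea of what such a rendering boils down to, the hypothetical sketch below fetches a dynamic Consul token and substitutes it into a fragment of a Nomad agent configuration. In Caravan this work is done declaratively by Vault Agent’s template engine; the Go code and the consul/creds/nomad-agent path are only illustrative.

```go
package main

import (
	"log"
	"os"
	"text/template"

	vault "github.com/hashicorp/vault/api"
)

// Fragment of a Nomad agent configuration with a placeholder for the
// dynamic Consul token.
const nomadConfig = `consul {
  token = "{{ .ConsulToken }}"
}
`

func main() {
	client, err := vault.NewClient(vault.DefaultConfig())
	if err != nil {
		log.Fatal(err)
	}

	// Fetch a fresh Consul token ("consul/creds/nomad-agent" is a placeholder path).
	secret, err := client.Logical().Read("consul/creds/nomad-agent")
	if err != nil || secret == nil {
		log.Fatal("could not read Consul credentials: ", err)
	}

	// Render the configuration fragment with the freshly issued token.
	tmpl := template.Must(template.New("nomad").Parse(nomadConfig))
	if err := tmpl.Execute(os.Stdout, map[string]string{
		"ConsulToken": secret.Data["token"].(string),
	}); err != nil {
		log.Fatal(err)
	}
}
```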


Want to know more about Caravan? Visit the dedicated website, check our GitHub repository and explore our documentation.

Authors: Matteo Gazzetta, DevOps Engineer @ Bitrock - Simone Ripamonti, DevOps Engineer @ Bitrock

Bitrock & CNCF

We are thrilled to announce that Bitrock has recently joined the Cloud Native Computing Foundation (CNCF), the vendor-neutral home of many of the fastest-growing projects on GitHub (including Kubernetes and Prometheus), which fosters collaboration between the industry’s top developers, end users, and vendors.

Kubernetes and the other CNCF projects have quickly gained adoption and diverse community support in recent years, becoming some of the most rapidly moving projects in open-source history.

The CNCF, part of the non-profit Linux Foundation, creates long-term ecosystems and fosters a community around a collection of high-quality projects that orchestrate containers as part of microservice architectures.

Cloud Native

Organizations can use cloud native technologies to build and run scalable applications in modern, dynamic environments including public, private, and hybrid clouds. Containers, service meshes, microservices, immutable infrastructure, and declarative APIs exemplify this approach.

These techniques enable resilient, manageable, and observable loosely-coupled systems. They allow developers to make high-impact changes frequently and predictably with minimal effort when combined with robust automation.

The CNCF aims to accelerate adoption of this paradigm by cultivating and maintaining an ecosystem of open source, vendor-neutral initiatives that democratize cutting-edge patterns and make them available to everybody.

Bitrock’s New Membership

Bitrock has recently joined the CNCF with a Silver Membership, becoming part of a community that includes the world’s major public cloud and enterprise software firms as well as innovative startups.

The choice to join this community was driven by the Bitrock team’s intention to build and influence the cloud native ecosystem alongside industry peers.

By providing governance, thought leadership, and engineering resources to shape and influence the development of the cloud-native ecosystem, Bitrock helps CNCF achieve its mission of making cloud-native computing ubiquitous.

Bitrock also upholds the community's key values as a member - find out more by accessing the CNCF charter:

  • Fast is better than slow 
  • Open 
  • Fair
  • Strong technical identity
  • Clear boundaries 
  • Scalable 
  • Platform agnostic

As a result of this new membership, Bitrock will be able to demonstrate thought leadership in the cloud native space, provide a first-class experience with Kubernetes and other CNCF projects in collaboration with the CNCF and its community, and design and maintain the technologies that are at the forefront of the industry.
