Release of the Final DICE Framework

Mar 22, 2018

After a 36-month R&D collaboration, the DICE consortium is pleased to announce the final release of the open source DICE framework and its two commercial versions, DICE Velocity and DICE BatchPro.

DICE delivers innovative development methods and tools to strengthen the competitiveness of small and medium European ISVs in the market for business-critical data-intensive applications. The barriers that DICE breaks are the shortage of methods to express data-aware quality requirements in model-driven development and the lack of means to consider these requirements consistently throughout DevOps tool-chains during quality analysis, testing, and deployment of the application. Existing methodologies and tools provide these capabilities for traditional enterprise software systems and cloud-based applications, but for increasingly popular technologies such as Hadoop/MapReduce, Spark, Storm, or Cassandra, adopting a holistic quality-driven software engineering approach was difficult before DICE. DICE delivers this capability, providing a quality-driven development environment for data-intensive applications.

DICE offers a DevOps methodology and platform covering multiple aspects of the lifecycle of a Big Data application. A collection of 14 tools has been created and released as open source. The tools guide users in defining new Big Data applications or in extending existing ones. A knowledge repository has been created to help end users explore the different features of the tools and to navigate through supporting tutorials and videos.

In particular, the open source release of the DICE framework is available free of charge and offers development and operations teams:

  • An Eclipse-based IDE implementing the DICE DevOps methodology and guiding the user step-by-step through the use of cheatsheets
  • A new UML profile to design data-intensive applications taking into account quality-of-service requirements and featuring privacy-by-design methods
  • Quality analysis tools to simulate, verify, and optimize the application design and identify possible anti-patterns
  • OASIS TOSCA-compliant deployment and orchestration on cloud VMs and containers
  • Monitoring and anomaly detection tools based on the Elasticsearch-Logstash-Kibana stack
  • Runtime methods for configuration optimization, testing and fault injection
  • Native support for open-source Apache platforms such as Storm, Spark, Hadoop, and Cassandra.

The DICE framework is also available in two commercial versions focused on real-time applications (DICE Velocity) and batch processing system development and delivery (DICE BatchPro).

The DICE tools have been presented to, and are actively downloaded by, a diverse group of stakeholders. Videos that illustrate cross-cutting benefits of the solution for different needs and use case scenarios are available on the DICE YouTube channel, together with tutorials on the DICE blog, as well as regular announcements on the DICE Twitter newsfeed.

Deploying the DICE Simulation Tool in the News Orchestrator DIA

Jan 16, 2018

Scalability, bottleneck detection, and simulation/predictive analysis are some of the core requirements for the News Orchestrator DIA. The DICE Simulation tool promises a performance assessment of a Storm-based DIA that allows the behaviour of the system to be predicted prior to deployment on a production cloud environment. The News Orchestrator engineers often spend considerable time and effort configuring and adapting the topology to the target runtime execution context. Introducing a tool that can perform such a demanding task efficiently would clearly increase developer productivity and also facilitate their testing needs.
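To give a flavour of the kind of pre-deployment prediction involved, a back-of-the-envelope capacity estimate can be made with basic queueing arguments. The sketch below is an illustration only, not the DICE Simulation tool's actual model; the bolt names, arrival rates, and service rates are made up:

```python
import math

def bolt_utilization(arrival_rate, service_rate, parallelism):
    """Utilization of a bolt treated as a pool of parallel executors."""
    return arrival_rate / (service_rate * parallelism)

def min_parallelism(arrival_rate, service_rate, target_util=0.7):
    """Smallest executor count keeping utilization below target_util."""
    return math.ceil(arrival_rate / (service_rate * target_util))

# Hypothetical topology: tuples/s arriving at each bolt and
# per-executor service rates (tuples/s).
topology = {
    "parse_bolt":  {"arrival": 900.0, "service": 250.0},
    "enrich_bolt": {"arrival": 900.0, "service": 120.0},
}

for name, b in topology.items():
    p = min_parallelism(b["arrival"], b["service"])
    u = bolt_utilization(b["arrival"], b["service"], p)
    print(f"{name}: parallelism={p}, utilization={u:.2f}")
```

A simulation tool automates this reasoning over the whole topology, including queueing delays that a static calculation like this one ignores.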

Continue reading »

DevOps: A Quality Assessment Experience

Dec 13, 2017

In a previous article on this blog, we discussed the importance of assessing quality during the development of data-intensive applications (DIA). In particular, we explored the performance and reliability properties of DIAs and presented a Simulation tool (SimTool) that helps with this task. This article extends that contribution, specifically by addressing the quality topic in the DevOps context. The core idea of DevOps is to foster close cooperation between the Dev and Ops teams. The reader may also be interested in taking a look at what the DICE project proposes in this regard.

Continue reading »

Detecting Anomalies during App Development

Oct 31, 2017

During the development of a Data-Intensive Application (DIA) using Big Data frameworks (such as Storm, Spark, etc.), developers have to contend not only with developing their application but also with the underlying platforms. In the initial stages of development, bugs and performance issues are almost unavoidable and most of the time hard to debug using only the monitoring data. The anomaly detection platform is geared towards automatically checking for performance-related contextual anomalies.
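To make "contextual anomaly" concrete, here is a minimal sketch of the underlying idea: flag a metric sample that deviates strongly from its recent history. This is an illustration of the general technique, not the DICE anomaly detection platform's actual algorithm; the latency values are invented:

```python
from collections import deque
from statistics import mean, stdev

def detect_anomalies(series, window=5, threshold=3.0):
    """Flag points more than `threshold` standard deviations away
    from the mean of the preceding `window` observations."""
    history = deque(maxlen=window)
    anomalies = []
    for i, value in enumerate(series):
        if len(history) == window:
            mu, sigma = mean(history), stdev(history)
            if sigma > 0 and abs(value - mu) / sigma > threshold:
                anomalies.append(i)
        history.append(value)
    return anomalies

# Latency samples (ms) with one obvious spike at index 7.
latencies = [102, 98, 101, 100, 99, 103, 101, 450, 100, 102]
print(detect_anomalies(latencies))  # → [7]
```

The "contextual" part is that 450 ms is anomalous only relative to the recent window; the same value could be perfectly normal under a heavier workload.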

Continue reading »

5 Reasons to Use Fault Injection in DevOps

Oct 31, 2017

Bringing development and IT operations together can help address many application deployment challenges. Addressing quality in this setting requires a toolset to manage and measure performance and improve reliability. Not only must the platforms be resilient, but the data-intensive applications that run on them must be robust as well.

A Fault Injection Tool (FIT) is part of that solution: one way to provoke scenarios that expose the impact of otherwise hidden issues. The FIT enables the controlled triggering of cloud platform issues such as resource stress and service or VM outages, the purpose being to observe the subsequent effect on deployed applications.

The FIT is being designed for use in a DevOps workflow, although it is not limited to this usage, to give a tighter correlation between application design and cloud operation. It helps improve resiliency for data-intensive applications by bringing together fault tolerance, stress testing, and benchmarking in a single tool. Here are 5 compelling reasons why a FIT is useful for developers of data-intensive applications.
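The controlled-failure idea can be sketched in a few lines: wrap an operation so that it fails with a chosen probability, then observe how the application behaves. This is a conceptual illustration of fault injection in general, not the FIT's actual mechanism; the function and failure rate are invented:

```python
import random

def with_fault_injection(func, failure_rate, rng, exc=ConnectionError):
    """Wrap `func` so that each call fails with probability
    `failure_rate`, simulating e.g. a transient service outage."""
    def wrapper(*args, **kwargs):
        if rng.random() < failure_rate:
            raise exc("injected fault")
        return func(*args, **kwargs)
    return wrapper

def fetch_record(key):
    """Stand-in for a call to some remote service."""
    return {"key": key, "value": 42}

# Seeded RNG so the fault-injection experiment is repeatable.
rng = random.Random(1)
flaky_fetch = with_fault_injection(fetch_record, failure_rate=0.3, rng=rng)

failures = 0
for _ in range(1000):
    try:
        flaky_fetch("user:1")
    except ConnectionError:
        failures += 1
print(f"injected failures: {failures}/1000")
```

A real FIT works at the platform level (stressing VMs, killing services) rather than wrapping function calls, but the principle of deliberately and repeatably provoking failure is the same.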

Continue reading »

JMT Petri Net Extension for Performance Analysis of Big Data Applications

Sep 10, 2017

JMT (Java Modelling Tools) is an integrated environment for performance evaluation, capacity planning and workload characterization of computer and communication systems [1]. A number of cutting-edge algorithms are available for exact, approximate and asymptotic analysis of queueing networks (QNs), with either product-form or non-product-form solutions. Users can define and solve models through a well-designed graphical interface, or optionally an alphanumeric wizard. Released under GPLv2, JMT benefits a large community of thousands of students, researchers and practitioners, with more than 5,000 downloads per year.
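As a taste of the kind of analysis such tools automate, the classic M/M/1 queue has simple closed-form steady-state metrics. A minimal sketch of those textbook formulas (the arrival and service rates are made up; JMT itself handles far richer networks):

```python
def mm1_metrics(arrival_rate, service_rate):
    """Closed-form steady-state metrics for an M/M/1 queue."""
    if arrival_rate >= service_rate:
        raise ValueError("unstable queue: arrival rate >= service rate")
    rho = arrival_rate / service_rate        # server utilization
    n = rho / (1 - rho)                      # mean jobs in system
    r = 1 / (service_rate - arrival_rate)    # mean response time
    return {"utilization": rho, "mean_jobs": n, "response_time": r}

# Example: 8 jobs/s arriving at a server processing 10 jobs/s.
m = mm1_metrics(arrival_rate=8.0, service_rate=10.0)
print(f"utilization={m['utilization']:.2f}, "
      f"mean jobs={m['mean_jobs']:.2f}, "
      f"response time={m['response_time']:.2f}s")
```

Product-form queueing networks generalise this single-queue result; for non-product-form models, JMT falls back on simulation and approximate algorithms.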

Continue reading »

“Gazing” the Clouds: Cloud Applications Monitoring, and what’s going on in industry…

Sep 4, 2017

The advent of cloud computing triggered a huge change in software release cycles for an increasing number of companies embracing cloud technologies as the 21st century's technological utility. Where once your company made large, upfront investments in physical servers, that strategy is increasingly being replaced by on-demand, pay-per-use cloud access. At the same time, complex manual deployment procedures are increasingly being automated in the context of DevOps and connected technologies. What is the organizational and technical consequence of these phenomena?

Continue reading »

Release 0.3.4 of DICE Deployment Service

Jun 19, 2017

We are happy to announce release 0.3.4 of our DICE Deployment Service and version 0.7.0 of the DICE TOSCA technology library. With these components, we aim to remove one big hurdle on the path to the world of Big Data: setting the components up and wiring them so that all the parts play along nicely. We also want to enable users to easily run their applications in a number of private and public clouds without any worry of being locked into a particular one. This release introduces a unified approach to deploying blueprints to OpenStack, Amazon EC2, or Flexiant Cloud Orchestrator without needing to change anything in the blueprint.
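The provider-independence idea (one blueprint, many target clouds) can be sketched as an abstract blueprint handed to interchangeable platform drivers. This is a conceptual illustration only, not the DICE Deployment Service's actual API: the blueprint shape, node fields, and driver functions are all invented for the example:

```python
# One abstract blueprint; per-provider drivers map it to resources.
blueprint = {
    "name": "storm-cluster",
    "nodes": [
        {"id": "nimbus", "type": "vm", "cpus": 2, "memory_gb": 4},
        {"id": "worker", "type": "vm", "cpus": 4, "memory_gb": 8},
    ],
}

def deploy_openstack(node):
    return f"openstack: boot {node['id']} ({node['cpus']} vCPU)"

def deploy_ec2(node):
    return f"ec2: run-instances {node['id']} ({node['cpus']} vCPU)"

DRIVERS = {"openstack": deploy_openstack, "ec2": deploy_ec2}

def deploy(blueprint, provider):
    """Deploy the same blueprint to any registered provider."""
    driver = DRIVERS[provider]
    return [driver(node) for node in blueprint["nodes"]]

for provider in ("openstack", "ec2"):
    for line in deploy(blueprint, provider):
        print(line)
```

In TOSCA terms, the blueprint plays the role of the topology template, and the technology library supplies the provider-specific node type implementations, so the blueprint itself never changes.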

Continue reading »

Rich Client Platform for the DIA-integrated Development

Mar 7, 2017

DICE focuses on quality assurance for data-intensive applications (DIA) developed through the Model-Driven Engineering (MDE) paradigm. The project aims to deliver methods and tools that help satisfy quality requirements in data-intensive applications through iterative enhancement of their architecture design. One component of the tool chain developed within the project is the DICE IDE, an Integrated Development Environment (IDE) that accelerates the development of data-intensive applications.

The Eclipse-based DICE IDE integrates most of the tools of the DICE framework and is the base of the DICE methodology. As highlighted in the deliverable D1.1 State of the Art Analysis, no MDE IDE yet exists on the software market through which a designer can create models to describe and analyse data-intensive or Big Data applications and their underpinning technology stack. This is the motivation for defining the DICE IDE.

The DICE IDE is based on Eclipse, which is the de-facto standard for the creation of software engineering models based on the MDE approach. DICE customizes the Eclipse IDE with suitable plug-ins that integrate the execution of the different DICE tools, in order to minimize learning curves and simplify adoption. In this blog post we explain how the DICE tools introduced to the reader earlier have been integrated into the IDE. So, how is the DICE IDE built?

Continue reading »

Apache Cassandra: From Design to Deployment

Feb 2, 2017

In spite of its young age, the Big Data ecosystem already contains a plethora of complex and diverse open source frameworks. They are commonly of two kinds: data platform frameworks, which deal with the needed storage scalability, and processing frameworks, which aim to improve query performance [1]. A Big Data application is generally produced by combining the two in a smooth way. Each framework operates with its own computational model. For example, a data platform framework may manage distributed files, tuples, or graphs, and a processing framework may handle batch or real-time jobs. Building a reliable and robust Data-Intensive Application (DIA) consists in finding a suitable combination that meets the requirements. Moreover, without careful design by developers on the one hand, and optimal configuration of the frameworks by operators on the other, the quality of the DIA cannot be guaranteed.

In this blog post we would like to mention three simple principles we have learned while we were building our Big Data application:

  1. Using models to synchronize the work of developers and operators;
  2. Designing databases so that we do not need to update or delete data; and
  3. Letting operators resolve low-level production-specific issues.
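Principle 2 can be illustrated with a toy append-only store: every write inserts a new timestamped row, and reads pick the latest one, so no row is ever updated or deleted in place. This is a conceptual sketch in plain Python, not Cassandra's actual data model (which organises rows by partition and clustering keys); the sensor data is invented:

```python
import bisect

class AppendOnlyStore:
    """Toy append-only store: each write adds a (timestamp, value)
    row; the current value is simply the most recent row."""

    def __init__(self):
        self.rows = {}  # key -> sorted list of (timestamp, value)

    def insert(self, key, timestamp, value):
        # Always insert a new row; never mutate an existing one.
        bisect.insort(self.rows.setdefault(key, []), (timestamp, value))

    def latest(self, key):
        return self.rows[key][-1][1]

store = AppendOnlyStore()
store.insert("sensor:1", 100, 20.5)
store.insert("sensor:1", 200, 21.0)   # a new row, not an UPDATE
print(store.latest("sensor:1"))       # → 21.0
```

Designing tables this way sidesteps the consistency and performance pitfalls of distributed updates and deletes, at the cost of keeping history around and resolving the current value at read time.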

Continue reading »