DITAS

Data-intensive applications Improvement by moving daTA and computation in mixed cloud/fog environmentS

DITAS deals with running computation in the continuum between the Cloud and the Edge, creating tools for running data-intensive workloads in a way that frees developers and administrators from the details of data-source location and of the execution environment. IBM's focus is on providing access to data while taking into account privacy regulations such as the European General Data Protection Regulation (GDPR). The project aims to demonstrate the technology with (1) an e-health use case of sharing health-care data through the cloud between medical facilities and for research purposes, and (2) an Industry 4.0 case study of analyzing sensor data from manufacturing plants both on the Edge and in the Cloud.

ProTego

Data-protection toolkit reducing risks in hospitals and care centers:

ProTego will provide a toolkit for health-care organizations to better assess and reduce cybersecurity risks related to remote device access to Electronic Health Record data, including risk assessment and risk mitigation tools, and methodologies and protocols for prevention and reaction.

The toolkit will provide tools for risk identification and assessment both before and during operation of health-care applications, tools to protect sensitive data and the devices used to handle it, and tools to raise awareness and educate stakeholders in how they can reduce or prevent risks.

CyberKit4SME

CyberKit4SME aims to democratize a kit of cybersecurity tools and methods enabling SMEs/MEs to: increase awareness of cybersecurity risks, vulnerabilities and attacks; monitor and forecast risks; manage risks using organisational, human and technical security measures with greater confidence; and collaborate and share information in a collective security and data protection effort. Tools developed in the project are: semi-automated ISO 27005 threat identification and risk mitigation analysis, using a knowledge base of technical and human/organisational risk factors; encryption and isolation tools to protect data being stored, processed or exchanged; security information and event management, using multiple data sources for threat detection and diagnosis; and blockchain tools for SMEs/MEs to share intelligence and incident reports with supply chain partners and with CERTs.

FogProtect

Protecting sensitive data in the computing continuum: FogProtect is an H2020 project aiming to deliver new and advanced architectures, technologies and methodologies that ensure the protection of sensitive data in the computing continuum, from cloud datacentres through fog nodes to end devices.

Contact

Ronen Kat, Manager Cloud Storage, IBM Research - Haifa

 

SliceNet

SliceNet intends to meet the challenging management- and control-plane requirements of network slicing across multiple administrative domains, facilitating early and smooth adoption of 5G slices by verticals with demanding use cases, and managing the QoE of slice services.

SliceNet will follow a layered architectural approach to allow the creation of a modular, extensible and scalable framework.

SODALITE

SODALITE aims to provide an optimized, highly resilient heterogeneous execution environment enabling operational transparency between Cloud and HPC infrastructures.

UNICORE

UNICORE is creating a common code base and toolkit for deployment of applications to secure and reliable execution environments.

The UNICORE project is developing tools that make building lightweight VMs as easy as compiling an app for an existing OS, thus unleashing the next generation of cloud computing services and technologies. With the UNICORE toolchains for unikernels, software developers will be able to easily build and quickly deploy lightweight virtual machines starting from existing applications.

5GZORRO

5GZORRO will develop solutions for zero-touch service, network and security management in ubiquitous multi-stakeholder environments, making use of smart contracts based on Distributed Ledger Technologies (DLT) to implement the required business agility.

Contact

Kathy Barabash, Manager Cloud Architectures and Networking, IBM Research - Haifa

 

5G-MEDIA

The project aims at innovating media-related applications by investigating how these applications and the underlying 5G networks should be coupled and interwork to the benefit of both: applications obtain the resources they need to deliver a high quality of experience, and the network is not overwhelmed by media traffic.

In 5G-MEDIA, IBM leads an effort to integrate open-source serverless frameworks, such as Apache OpenWhisk, into the 5G-MEDIA platform while adapting and evolving them beyond the state of the art to deal with the challenging latency and throughput requirements of media-centric applications. The resulting impact will be twofold: (i) significantly reduced maintenance effort from the platform operator's perspective; and (ii) application developers can quickly develop value-added code while being relieved of infrastructure management concerns.
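For illustration, the sketch below shows a minimal Apache OpenWhisk action written in Python: OpenWhisk invokes the action's main() with a dictionary of parameters and expects a JSON-serializable dictionary back. The parameter names and the bitrate-selection logic are hypothetical examples and are not taken from the 5G-MEDIA platform.

# Minimal OpenWhisk action (Python). OpenWhisk invokes main() with a dict of
# parameters and expects a JSON-serializable dict as the result.
# The inputs below (throughput, bitrate ladder) are hypothetical examples.

def main(args):
    throughput = float(args.get("throughput_mbps", 5.0))            # measured network throughput
    ladder = sorted(args.get("bitrate_ladder_mbps", [1.0, 2.5, 5.0, 8.0]))

    # Pick the highest bitrate that still leaves ~20% headroom, so the media
    # application keeps a high QoE without overwhelming the network.
    usable = [b for b in ladder if b <= 0.8 * throughput]
    chosen = usable[-1] if usable else ladder[0]

    return {"recommended_bitrate_mbps": chosen}

Such an action could be deployed and tested by hand with the standard OpenWhisk CLI (wsk action create / wsk action invoke); within 5G-MEDIA the functions would instead be managed by the platform itself.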

BigDataStack

BigDataStack intends to deliver a complete high-performance data-centric stack of technologies as a unique combined and cross-optimized offering that addresses the emerging needs of data operations and applications.

BigDataStack introduces a new frontrunner data-driven architecture and system, ensuring that infrastructure management is fully efficient and optimized for data operations and data-intensive applications.

CLASS

The CLASS project aims to improve KPIs of various smart-city traffic domains, such as road safety, ease of parking, and traffic management, by applying big data analytics. Data is gathered from street facilities (e.g., cameras) and smart-car sensors, and is channeled through a 5G-enabled edge into a data center/cloud. The CLASS software stack operates across this entire 'compute continuum' to deliver insights and actions at both real-time and periodic analytics speeds, operating on both data-in-motion and data-at-rest. IBM Haifa leads the analytics-platform work package and contributes significantly to the edge computation.
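As a minimal, purely illustrative sketch (not CLASS project code), the Python snippet below contrasts the two modes: a sliding window over incoming speed readings stands in for real-time analytics on data-in-motion at the edge, while a batch summary over stored readings stands in for periodic analytics on data-at-rest in the cloud. All names and thresholds are hypothetical.

from collections import deque
from statistics import mean

class EdgeWindow:
    """Sliding window over recent speed readings (data-in-motion, real-time)."""
    def __init__(self, size=10, speed_limit_kmh=50):
        self.window = deque(maxlen=size)
        self.speed_limit_kmh = speed_limit_kmh

    def ingest(self, speed_kmh):
        self.window.append(speed_kmh)
        # React immediately at the edge, without waiting for a batch job.
        if mean(self.window) > self.speed_limit_kmh:
            return "ALERT: sustained speeding in this road segment"
        return None

def periodic_report(all_readings):
    """Batch summary over data-at-rest (e.g., run daily in the cloud)."""
    return {"samples": len(all_readings), "avg_speed_kmh": mean(all_readings)}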

CloudButton

Our main goal is to create CloudButton: a Serverless Data Analytics Platform. CloudButton will democratize big data by radically simplifying the overall life cycle and programming model thanks to serverless technologies. To demonstrate the impact of the project, we target two settings with large data volumes: bioinformatics (genomics, metabolomics) and geospatial data (LiDAR, satellite imagery).
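The programming model such a platform aims for is essentially "write a function, map it over your data, and the platform handles the rest". The sketch below is only an approximation of that idea: it uses Python's built-in concurrent.futures as a local stand-in for a serverless executor and a toy genomics task; it does not use the CloudButton APIs themselves.

from concurrent.futures import ProcessPoolExecutor

def gc_content(sequence: str) -> float:
    """Toy genomics task: fraction of G/C bases in a DNA fragment."""
    sequence = sequence.upper()
    return (sequence.count("G") + sequence.count("C")) / max(len(sequence), 1)

if __name__ == "__main__":
    fragments = ["ACGTACGT", "GGGCCCAA", "ATATATAT", "CGCGCGTA"]  # stand-in data chunks
    # Locally this is a process pool; on a serverless platform each fragment
    # could be processed by an independent, automatically provisioned function.
    with ProcessPoolExecutor() as executor:
        print(list(executor.map(gc_content, fragments)))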

PolicyCloud

Making data-driven policy management a reality across Europe: The PolicyCloud project aims to harness the potential of digitisation, big data and cloud technologies to improve the modelling, creation and implementation of policy. Over three years (2020-2023), the project will address a challenge faced by many businesses and public administrations: improving how they make policy decisions by accessing and using data.

Funded under the European Commission’s H2020 programme, the project will deliver a unique, integrated environment of curated datasets and data management, manipulation, and analysis tools addressing the full lifecycle of policy management in four distinct thematic areas, and using the data analysis capabilities of the European Cloud Initiative.

Contact

Ofer Biran, Manager Cloud and Data Technologies, IBM Research - Haifa

 

Past Projects

Secure Data Processing in the Cloud

RestAssured provides an ecosystem for performing secure data processing in the cloud, handling access security through policies and data security by keeping data encrypted at all times. IBM's focus is on exploring and creating a more developer-friendly platform for running computations on data while it is encrypted, leveraging technologies like Intel Software Guard Extensions (SGX).

The project aims to demonstrate its technologies through three diverse use cases: (1) high-performance computing for commercial enterprises, running memory- and compute-intensive applications; (2) Pay-As-You-Drive usage-based insurance, where driving telemetry is used to generate insurance quotes without exposing personal driving data to the insurance provider; and (3) self-directed social care for vulnerable adults and social care providers, matching the two in a way that protects individual privacy policies.
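To make the Pay-As-You-Drive privacy boundary concrete, the sketch below (plain Python with a hypothetical telemetry schema and a toy risk model, not SGX enclave code) shows the intended data flow: raw telemetry is processed inside the protected environment, and only the derived quote is released to the insurance provider.

from statistics import mean

def quote_inside_protected_environment(telemetry):
    """telemetry: list of dicts such as {"speed_kmh": 72, "hard_brakes": 1} (hypothetical schema)."""
    avg_speed = mean(t["speed_kmh"] for t in telemetry)
    hard_brakes = sum(t["hard_brakes"] for t in telemetry)
    # Toy risk model; in the project this computation would run over protected
    # data inside an SGX enclave rather than in plain Python.
    risk = 1.0 + 0.01 * max(avg_speed - 50, 0) + 0.05 * hard_brakes
    # Only the derived premium crosses the trust boundary; raw telemetry does not.
    return {"monthly_premium_eur": round(40 * risk, 2)}

print(quote_inside_protected_environment([{"speed_kmh": 72, "hard_brakes": 1},
                                           {"speed_kmh": 48, "hard_brakes": 0}]))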

MIKELANGELO

The vision of the MIKELANGELO project is to make virtual infrastructures ready to run big data, high-performance computing, and I/O-intensive applications in production.

Cloud and HPC architectures are inefficient: layers introduced by legacy and compatibility requirements add complexity to software execution, set-up, management and security. MIKELANGELO targets the sweet spot between efficiency, stability and security.

We focus on the lowest layers of the virtual infrastructure: virtual I/O in the hypervisor (KVM), in the guest (the OSv unikernel), and between the two (virtual RDMA, vRDMA).

OPERA

The aim of OPERA is to create a cooperative, secure, reliable, customized, and low-power computing platform able to address the challenges imposed by the future convergence of datacentre computing, embedded devices and sensors. To this end, OPERA will develop a new generation of high-density servers. These servers are the basic “bricks” for implementing a scalable low-power datacentre. At the base of these modules are heterogeneous architectures that combine ARM, Intel and POWER8 processors, enabling energy-efficient server-class processing, with FPGA accelerators for optimized functions and computation offloading (i.e., instantiating on the reconfigurable device customized circuits that perform specific operations directly in hardware). These devices form a mix of processing elements and accelerators designed to achieve significantly better energy efficiency at the cost of flexibility. Moreover, the integration of CAPI (Coherent Accelerator Processor Interface) technology, which gives external components such as FPGA-based circuits coherent access to a processor's cache hierarchy, allows these accelerators to appear to the host system as if they were cores integrated on the same chip. The OPERA project also aims at exploiting high-speed optical links to interconnect the accelerators and the external system without performance limitations.

COSIGN

The COSIGN consortium brings together a unique combination of expertise and resources to deliver novel scalable and future-proof intra-data centre network solutions empowered by advanced optical technologies and a software defined control framework, which will overcome existing and predicted bottlenecks of current architectural solutions.

BEACON

The BEACON project set out in 2015 to address the problem of federated cloud infrastructures by defining and implementing a framework for cloud federation. Inter-cloud networking and security were given particular weight, as required by the automated deployment of applications across different clouds and data centres.

IOStack

IOStack is a Software-defined Storage toolkit for Big Data built on top of the OpenStack platform.

IOStack enables efficient execution of virtualized analytics applications over virtualized storage resources thanks to flexible, automated, and low-cost data management models based on software-defined storage (SDS).