The IBM Journal has teamed up with IEEE Publications Editorial Services to provide a single production workflow for Guest Editors and authors through ScholarOne Manuscripts. The Journal Editor will consider for publication all submissions that relate to any of the currently open Calls for Papers. All submissions will go through the following two-step process:
- Abstract: First, you will submit a preliminary abstract (≤200 words) using ScholarOne Manuscripts. The Editor will decide whether to invite you to submit a full paper. Abstracts should include a tentative article title, author names and affiliations, and a single descriptive paragraph.
- Full paper: If your abstract is accepted, you will receive an invitation to submit a full paper (also via ScholarOne Manuscripts), which will be peer reviewed. The Guest Editor(s) will make a final decision to publish based on the recommendations of the peer reviewers.
Note: Invitations to submit a full paper will include detailed instructions on next steps and journal requirements. The IBM manuscript preparation template provides an outline of key information for preparing manuscripts and must be used for all full paper submissions.
This issue of the IBM Journal of Research and Development will focus on the use of AI for security, as well as securing AI. We solicit Abstracts (manuscript proposals) on research that describes innovative work at the intersection of AI and security. Potential focus areas include:
- Machine learning for security
- Use of NLP for threat intelligence
- Reasoning in the security domain
- Planning for remediation and countermeasures
- Adversarial AI
- Poisoning attacks
- Evasion attacks
- Defenses against adversarial attacks
- October 8, 2019: Abstracts due
- October 22, 2019: Invitations for full papers sent to selected authors
- January 20, 2020: Full manuscripts due
Call for papers - CLOSED
"Summit and Sierra Supercomputers"
Publication date: March 2020
Guest editors: Jaime Moreno and Yoonho Park, IBM Research
IBM delivered two large-scale supercomputer systems to the U.S. Department of Energy (DoE) National Laboratories through the CORAL program (Collaboration of Oak Ridge, Argonne, and Livermore). These systems, Summit at Oak Ridge National Laboratory (ORNL) and Sierra at Lawrence Livermore National Laboratory (LLNL), simultaneously became the two leaders of the Top500 list in November 2018, something never before accomplished by a single vendor. The systems are based on a hybrid architecture comprising IBM POWER9 CPUs, NVIDIA V100 GPUs, and Mellanox Extended Data Rate (EDR) InfiniBand network technologies. Each system consists of more than 4,500 compute nodes and includes major innovations in compute, the I/O subsystem, storage, system management, and programmability. Leveraging these capabilities, a set of benchmarks was optimized to demonstrate the strength and scalability of the systems, delivering performance that exceeded the originally projected targets. Moreover, applications ported to these systems are delivering impressive performance, enabling significant advances in the corresponding science and engineering areas. Centers of Excellence were established in advance of the systems' availability to help foster the transition of applications to the new architecture, including the adoption of advanced programming models and software development tools, and to ensure successful exploitation with minimal delay.
This special issue of the IBM Journal of Research and Development invites papers that describe the architecture and technology innovations of the Summit and Sierra systems, the challenges at scale, the exploitation of those innovations by applications, the significant progress in the pursuit of advances in science and engineering enabled by the systems, and the value the systems provide to users. We solicit abstracts (manuscript proposals) in the following and related areas:
- System architecture and implementation
- System management software
- Code optimization and benchmarking
- Programming model innovations
- Experiences from the Centers of Excellence
- June 7, 2019: Abstracts due
- June 14, 2019: Invitations for full papers sent to selected authors
- August 13, 2019: Full papers due
Call for papers - CLOSED
"The Technology, Management, and Analytics of Resilience and Disasters"
Publication date: January 2020, Vol. 64, No. 1
Guest editors: Hendrik Hamann (IBM Research) and Rebecca Curzon (IBM Corporate Citizenship)
The emergence of the Internet of Things (IoT) has created inexorable growth in the live and static data available from myriad sources, allowing humans to connect to the world through direct interaction and model-driven prediction. This immersive capability is beginning to affect how the context, performance, and outcomes of human, machine, and infrastructural behavior are measured during disasters. Such improved analysis enables system behavior to be understood and quantified, resources to be optimally applied, and responses to events to be accurately orchestrated.
The same flow of data gives first responders sharper situational awareness, providing much clearer context for the situations they encounter, and is beginning to yield the analytic tools that allow critical decisions to be made quickly and confidently in complex disaster-driven situations.
Disaster management arguably places the most complex demands on the accuracy and speed of system prediction and response, which in turn require an equally accurate understanding of the spatial and temporal relationships among humans, infrastructure, events, and machines.
Improved response outcomes thus rely on technologies that index spatiotemporal and other data, making it possible to ask complex questions such as "Where, and in what sequence, should the flood response team act to retrieve the most vulnerable residents most efficiently and safely while delivering targeted relief supplies in the field?"
This edition of the IBM Journal of Research and Development highlights information technology approaches that take advantage of the digital transformation to help mitigate the effects of natural disasters. We invite papers demonstrating recent advances in the application of advanced IoT technology, data management, analysis, prediction, and associated large-scale data analytic techniques to real-life examples from disaster management, including migration prediction, earthquakes, disease proliferation, flooding, and fires, illustrating how this technological evolution has markedly changed the field of disaster response and management. Potential focus areas include:
- Integrating census, GIS, real-time, and other data for detailed situational awareness.
- Combining and analyzing large data sets to provide detailed situational information.
- Using predictive models to help with resource staging, population movement, and recovery.
- Estimating the financial and physical risks that natural events pose to human life and infrastructure.
- Novel engineering designs for resilient infrastructures.
- Advanced sensing and data communications.
- Integration and alignment of large data arrays to enable predictive spatiotemporal analysis.
- Measurement and analysis of social impacts of disasters.
- December 14, 2018: Abstracts due
- February 15, 2019: Invitations for full papers sent to selected authors
- May 17, 2019: Full papers due
- July 19, 2019: Final decisions
Call for papers - CLOSED
"Hardware for Artificial Intelligence"
Publication date: November 2019
Guest editors: Wilfried E. Haensch and Arvind Kumar (IBM Research); and Anand Raghunathan (Purdue University)
Since 2010, the capability of artificial intelligence (AI) has doubled every year. At the center of this exponential growth is the success of deep learning, spurred by the availability of large labeled datasets and high-throughput, specialized compute resources. AI is making substantial inroads into business and personal lives and will be ubiquitous within the next decade. To maintain the current trend, advances in both algorithms and compute infrastructure are needed.
This issue of the IBM Journal of Research and Development will feature the compute infrastructure that enables today's progress and will provide an outlook on technologies that will fuel the acceleration and enablement of future workloads. Topics include the role of distributed learning in making optimal use of data center resources on current hardware, pushing compute efficiency with reduced-precision digital computing for accelerated training and inference in data center and edge applications, and the exploitation of heterogeneous integration and 3D packaging for hardware accelerator modules.
We solicit Abstracts (manuscript proposals) on the following potential focus areas:
- Distributed learning on GPU clusters
- Digital accelerators using reduced precision
- Analog technologies for deep learning
- Spiking networks for deep learning
- Hardware adapted algorithms
- Materials for deep learning analog arrays
- Machine intelligence beyond deep learning
- Power and performance benefits for analog deep learning
- Deep learning arrays for irregular data structures
- In-memory processing
- Neuromorphic computation for deep learning: materials and algorithms
- Materials and heterogeneous integration
- Brain-inspired computation
- November 2, 2018: Abstracts due
- November 16, 2018: Invitations for full papers sent to selected authors
- March 1, 2019: Full manuscripts due
Call for papers - CLOSED
Publication date: July 2019
Guest editors: Sameep Mehta (IBM Research - India); and Kush R. Varshney and Francesca Rossi (IBM Research - Yorktown Heights)
Recent times have seen unprecedented interest in artificial intelligence (AI) and machine learning research, which now touches almost all parts of our lives. While the algorithms have proven extremely beneficial, there are growing concerns about conscious and unconscious biases embedded in these systems, the inability of the algorithms to explain their results, leakage or reverse engineering of private information, and the alignment of AI systems with human values. This gap between the usefulness of AI and the perception of AI as unsafe needs to be bridged.
This special issue of the IBM Journal of Research and Development will focus on approaches to handle such challenges and shape the research direction in this area. Potential focus areas include:
- Comprehensive surveys of computational approaches, legal and policy frameworks, social and moral views for handling biases
- Approaches for detection of different forms of bias in numerical and text data
- Design principles to develop fair algorithms
- Platforms, tools, and techniques for enforcing audits, compliance, and ethics
- Algorithms for general and personalized explainability
- Privacy-preserving AI including learning, releasing, and using AI
- Approaches to collect, curate, and release bias-free datasets
- Industry use cases that display biases or use techniques to handle such biases
- Views on next-generation platform support, AI research, and education needed for safe AI for different personas (developer, data seller, users, etc.)
- Ethics-aware algorithms and frameworks for value alignment, both for data-driven and rule-based symbolic AI approaches
- September 2018: Abstracts due
- October 2018: Invitations for full papers sent to selected authors
Call for papers - CLOSED
"Blockchain: From technology to solutions"
A blockchain is a ledger for recording transactions among multiple participants, maintained by the participants themselves with a distributed protocol. The participants validate that transactions are executed correctly and according to the rules agreed among them.
Blockchains are envisaged to play the role of an arbiter and trusted party in many application domains where such trust is necessary but not conferred on any single participant. Their uses range from cryptocurrencies and maintaining financial assets to establishing transparency for supply chains and transportation, to securing identities online.
Recent years have seen explosive growth in research and development of blockchain technology. Driven initially by the uncontrolled, anarchic nature of the public, permissionless blockchains that underpin cryptocurrencies, blockchain technology promises to revolutionize the way people and companies perform trusted interactions online. To support this, so-called consortium blockchains for enterprise applications have been introduced; they facilitate applications that respect traditional organizational forms and legal rules.
This special issue of the IBM Journal of Research and Development will emphasize blockchain technology and solutions. We solicit Abstracts (manuscript proposals) on research that describes innovative work on blockchain systems, theory, applications, and their impact. Potential focus areas include:
- Blockchain platforms
- Blockchain protocols
- Smart contracts
- Cryptographic primitives for blockchains
- Privacy and security
- Architecture for blockchains
- Blockchain solutions
- Application integrations
- Payments and blockchain markets
- Blockchain use cases
- Identity and blockchain
- Digital assets
- Deployment and operation
- May 2018: Abstracts due
- July 2018: Invitations for full papers sent to selected authors
Call for papers - CLOSED
"Advances in computational creativity technology"
Publication date: January 2019
Guest editors: Richard Goodwin, Kush Varshney, and Jinjun Xiong
Computational creativity is the art, science, philosophy, and engineering of building computational systems that demonstrate behaviors that would be deemed creative by unbiased human observers. There has been much recent progress in this field of research, including formal accounts of what it means for software to be creative and the development of many exciting and valuable applications of creative software in the sciences, the arts, cooking, literature, fashion, and elsewhere.
This special issue of the IBM Journal of Research and Development will emphasize both theoretical contributions to computational creativity research and groundbreaking computational creativity systems in various domains. We solicit Abstracts (manuscript proposals) on research that pushes the field forward in both foundational and applied directions. Potential focus areas include:
- Computational paradigms for understanding creativity as well as metrics, frameworks, formalisms and methodologies for the evaluation of creativity in computational systems.
- Development and assessment of computational systems that support and assist people in creative tasks.
- Applications that address creativity in specific domains such as music, language, narrative, poetry, cooking, fashion, games, visual arts, graphic design, product design, architecture, entertainment, education, mathematical invention, scientific discovery, and programming.
- February 8, 2018: Abstracts due
- February 22, 2018: Invitations for full papers sent to selected authors
- May 2018: Full papers due
- July 2018: Final acceptance decision
Call for papers - CLOSED
"Computational technologies for drug discovery"
Publication date: November 2018
Guest editor: Wendy Cornell, Principal Research Staff Member and Manager, Drug Discovery Technologies, and Drug Discovery Strategy Lead
Computer-aided approaches to drug discovery have been in wide use for decades, with ligand similarity methods, QSAR (quantitative structure–activity relationship) approaches, and protein–ligand docking all routinely applied to support lead finding and lead optimization. However, advances in computing power, algorithm development, and information science have expanded the scope and power of computer-aided approaches. For example, molecular dynamics, long used as an exploratory method due to its complexity and computational demands, is now increasingly applied to qualitatively and quantitatively evaluate potential small-molecule synthetic targets on a time scale in step with medicinal chemistry project timelines. Deep learning technologies are being leveraged to take advantage of the vast amounts of protein and small-molecule data generated through basic research as well as through successful and unsuccessful drug discovery campaigns. Finally, although much information is captured in structured public databases such as the PDB, ChEMBL, and PubChem, or in corresponding internal proprietary databases, a significant amount exists only in unstructured form in free text or tables, and in these instances text mining is essential.
This special issue of the IBM Journal of Research and Development will emphasize new methods, workflows, and applications, particularly in the fields of deep learning, molecular dynamics, and text mining, but highlighting other techniques as well. Methods, workflows, and applications targeting small molecule, peptide, and biologic drug discovery are all in-scope. We solicit Abstracts (manuscript proposals) in the following and related areas.
- Applications of deep learning to drug discovery
- Applications of molecular dynamics to drug discovery
- Integration of mechanistic modeling and deep learning for drug discovery
- Quantum computing for drug discovery
- Protein or ligand structure searching and analysis for drug discovery
- De novo drug design
- Patent mining for drug discovery
- Literature mining for drug discovery
- Integrated workflows for drug discovery
- Design thinking for drug discovery solution development
- January 9, 2018: Abstracts due
- January 30, 2018: Invitations for full papers sent to selected authors
- April 2018: Full papers due
- June 2018: Final acceptance decision