How much can runtime optimization reduce compute costs and carbon footprint?
In this InTechnology video, Camille talks with Asaf Ezra, CEO of Granulate. They get into what exactly runtime optimization is, how companies like Granulate are making waves in the industry, and predictions for the evolution of hardware and programming.
Redefining Runtime Optimization
Asaf explains to Camille that improving application performance doesn’t always require changing the application itself; modifying the application’s runtime can be just as effective. Because applications have varied and shifting requirements, runtime optimization needs to be dynamic, managing compute resources efficiently to boost performance while still meeting Service Level Agreements (SLAs). In Asaf’s experience at Granulate, clients typically want better compute performance without adding compute resources. He highlights that this strategy not only optimizes performance but also reduces the carbon footprint, making runtime optimization a crucial aspect of sustainability.
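To make the idea concrete, here is a minimal, hypothetical sketch (not Granulate’s implementation) of a runtime feedback loop that resizes a worker pool to stay within an SLA target without touching the application code. The SLA value, batch size, and worker limits are illustrative assumptions.

```python
# Illustrative sketch of dynamic runtime optimization against an SLA.
# The application logic (handle_request) never changes; only the runtime
# decision about how much compute to give it does.
import random
import time
from concurrent.futures import ThreadPoolExecutor

SLA_BATCH_SECONDS = 1.0          # hypothetical service-level target
MIN_WORKERS, MAX_WORKERS = 2, 16  # illustrative resource bounds
BATCH_SIZE = 100


def handle_request(_: int) -> None:
    """Stand-in for real application work (I/O-bound)."""
    time.sleep(random.uniform(0.01, 0.05))


workers = MIN_WORKERS
for cycle in range(5):
    started = time.perf_counter()
    with ThreadPoolExecutor(max_workers=workers) as pool:
        list(pool.map(handle_request, range(BATCH_SIZE)))
    elapsed = time.perf_counter() - started

    # Runtime-level decision: add capacity only when the SLA is at risk,
    # release it when there is clear headroom, so compute (and the carbon
    # footprint that comes with it) is not over-provisioned.
    if elapsed > SLA_BATCH_SECONDS and workers < MAX_WORKERS:
        workers = min(MAX_WORKERS, workers * 2)
    elif elapsed < SLA_BATCH_SECONDS / 2 and workers > MIN_WORKERS:
        workers = max(MIN_WORKERS, workers - 1)

    print(f"cycle={cycle} workers={workers} batch_time={elapsed:.2f}s")
```

The point of the sketch is the feedback loop itself: performance is tuned at the runtime layer, in response to observed behavior, rather than by rewriting the application.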
Transformative Trends in Runtime Optimization
Asaf discusses major industry advances in application performance optimization, highlighting Photon as a significant disruptor, largely because it is built on C++ rather than Java. Photon delivers substantial performance improvements but is relatively expensive, though Asaf observes that compute costs tend to fall over time. He also mentions Gluten, an open-source project that uses Meta’s Velox engine and C++. Asaf anticipates that these innovations will cut data analytics compute costs by 25-50% by 2024.
Delving into Granulate’s operations, Asaf shares that the company’s solution includes an agent with two modules: a loading mechanism and an optimization module. Since Granulate’s acquisition by Intel, he notes, the team has been working with Sapphire Rapids accelerators to evaluate CPU and system-on-chip performance. For data analytics workloads, the solution balances CPU usage against memory allocation and assesses application performance, and it is adaptable to both cloud and on-premises environments.
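As a rough illustration of that two-module shape, here is a hypothetical sketch of an agent split into a loader that attaches to a running process and an optimizer that turns CPU and memory readings into a coarse recommendation. The class names, thresholds, and the use of the third-party psutil library are all assumptions for illustration, not Granulate’s actual interfaces.

```python
# Hypothetical two-module agent sketch: Loader attaches to a process and
# samples metrics; Optimizer maps those metrics to a tuning suggestion.
from dataclasses import dataclass

import psutil  # third-party dependency: pip install psutil


@dataclass
class Metrics:
    cpu_percent: float
    rss_bytes: int


class Loader:
    """Attaches to a target process and exposes its runtime metrics."""

    def __init__(self, pid: int) -> None:
        self.process = psutil.Process(pid)

    def sample(self) -> Metrics:
        return Metrics(
            cpu_percent=self.process.cpu_percent(interval=1.0),
            rss_bytes=self.process.memory_info().rss,
        )


class Optimizer:
    """Turns sampled metrics into a coarse tuning recommendation."""

    def recommend(self, m: Metrics) -> str:
        if m.cpu_percent > 80:
            return "cpu-bound: consider more parallelism or a vectorized engine"
        if m.rss_bytes > 8 * 1024**3:
            return "memory-bound: consider spill settings or a smaller heap"
        return "within budget: no change needed"


if __name__ == "__main__":
    import os

    loader = Loader(os.getpid())  # self-inspect for the demo
    print(Optimizer().recommend(loader.sample()))
```

The separation mirrors the idea in the conversation: one part of the agent is responsible for getting into the workload, the other for deciding how to balance CPU against memory for that workload.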
Predictions for Hardware and Programming Evolution
In their conversation, Camille and Asaf explore future hardware and programming trends. Asaf anticipates a surge in customized hardware to boost efficiency, paralleling optimizations on the software side, and expects such hardware to become more affordable, citing Intel’s Gaudi as an example. He also foresees generative AI revolutionizing programming by democratizing end-to-end app creation. He advises startups to rethink how they operate in this evolving landscape, arguing that analytical skills will matter more than technical abilities for tech entrepreneurs.
Asaf Ezra, Co-Founder & CEO of Granulate
Asaf Ezra co-founded Granulate, an Intel subsidiary specializing in runtime optimization, in 2018 and serves as its CEO. Before Granulate, Ezra was an Entrepreneur in Residence at YL Ventures and led a research and development team at KayHut. He also spent four years in the Israel Defense Forces in roles including Project Manager, R&D Team Leader, and Software Developer. Ezra holds a Bachelor’s degree in Computer Science and Physics from The Hebrew University of Jerusalem.
Check it out. For more information, previous podcasts, and full versions, visit our homepage.
To read more about cybersecurity topics, visit our blog.
#Granulate #runtimeoptimization
The views and opinions expressed are those of the guests and author and do not necessarily reflect the official policy or position of Intel Corporation.
—–
If you are interested in emerging threats, new technologies, or tips and best practices in cybersecurity, please follow the InTechnology podcast on your favorite podcast platforms: Apple Podcasts and Spotify.
Follow our hosts Tom Garrison @tommgarrison and Camille @morhardt.
Learn more about Intel Cybersecurity and Intel Compute Lifecycle Assurance (CLA).