Runbook Chapter #4: Mastering Generative AI with DareData, TensorOps, and NOS

February 5, 2025
At Runbook Chapter #4, DareData explored the challenges and strategies of scaling LLMs, alongside experts from TensorOps and NOS, highlighting governance, integration, and real-world AI adoption.

Runbook Chapter #4 brought together AI professionals and business leaders for a deep dive into operationalizing and scaling Large Language Models (LLMs). Hosted at IDEA Spaces, this session focused on the practical challenges and solutions that come with deploying generative AI at scale.

Cláudio Lemos, CEO & Co-Founder at TensorOps; Nuno Brás, Partner and Co-Founder at DareData; and Nuno da Rocha Borges, AI Senior Manager at NOS, shared valuable insights on how businesses can effectively integrate LLMs into their workflows while maintaining scalability and governance. The discussion tackled critical topics, including:

Scalability - what it takes to run LLMs efficiently without compromising performance.

Governance and compliance - ensuring AI models are deployed responsibly and securely.

Real-world integration - how companies can seamlessly embed LLMs into their existing processes for maximum impact.

With LLM adoption accelerating, businesses must have the right tools, frameworks, and strategies in place to drive innovation without creating unnecessary complexity. This session reinforced the importance of operationalizing AI beyond experimentation, ensuring that companies can harness its full potential while staying compliant and scalable.

A huge thank you to everyone who joined us, shared their expertise, and contributed to the discussion!

