Responsible by design
Gemma is designed with our AI Principles at the forefront. As part of making Gemma pre-trained models safe and reliable, we used automated techniques to filter out certain personal information and other sensitive data from training sets. Additionally, we used extensive fine-tuning and reinforcement learning from human feedback (RLHF) to align our instruction-tuned models with responsible behaviors. To understand and reduce the risk profile of Gemma models, we conducted robust evaluations including manual red-teaming, automated adversarial testing, and assessments of model capabilities for dangerous activities. These evaluations are outlined in our Model Card.
We're also releasing a new Responsible Generative AI Toolkit together with Gemma to help developers and researchers prioritize building safe and responsible AI applications. The toolkit includes:
- Safety classification: We provide a novel methodology for building robust safety classifiers with minimal examples.
- Debugging: A model debugging tool helps you investigate Gemma's behavior and address potential issues.
- Guidance: You can access best practices for model builders based on Google's experience in developing and deploying large language models.
Optimized across frameworks, tools and hardware
You can fine-tune Gemma models on your own data to adapt them to specific application needs, such as summarization or retrieval-augmented generation (RAG). Gemma supports a wide variety of tools and systems:
- Multi-framework tools: Bring your favorite framework, with reference implementations for inference and fine-tuning across multi-framework Keras 3.0, native PyTorch, JAX, and Hugging Face Transformers (see the minimal inference sketch after this list).
- Cross-device compatibility: Gemma models run across popular device types, including laptop, desktop, IoT, mobile and cloud, enabling broadly accessible AI capabilities.
- Cutting-edge hardware platforms: We've partnered with NVIDIA to optimize Gemma for NVIDIA GPUs, from the data center to the cloud to local RTX AI PCs, offering industry-leading performance and integration with the latest technology.
- Optimized for Google Cloud: Vertex AI provides a broad MLOps toolset with a range of tuning options and one-click deployment using built-in inference optimizations. Advanced customization is available with fully managed Vertex AI tools or with self-managed GKE, including deployment to cost-efficient infrastructure across GPU, TPU, and CPU from either platform.
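As one concrete illustration of the multi-framework support above, the following minimal sketch loads an instruction-tuned Gemma checkpoint with Hugging Face Transformers and runs a short generation. The model id "google/gemma-2b-it", the prompt, and the generation settings are illustrative assumptions rather than part of this announcement; adjust them to the checkpoint and framework you actually use.

```python
# Minimal sketch: Gemma inference via Hugging Face Transformers.
# Assumptions: access to the "google/gemma-2b-it" checkpoint on the Hub,
# plus transformers (and accelerate, for device_map="auto") installed.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "google/gemma-2b-it"  # assumed instruction-tuned 2B checkpoint

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

prompt = "Summarize in one sentence: Gemma is a family of lightweight open models."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

# Generate a short completion; the token budget here is illustrative only.
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

The same workflow applies to fine-tuning on your own data: the quickstart guides cover reference implementations for Keras 3.0, native PyTorch, and JAX as well.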
Free credits for research and development
Gemma is built for the open community of developers and researchers powering AI innovation. You can start working with Gemma today using free access in Kaggle, a free tier for Colab notebooks, and $300 in Google Cloud credits. Researchers can also apply for Google Cloud credits of up to $500,000 collectively to accelerate their projects.
Getting started
You can learn more about Gemma and access quickstart guides at ai.google.dev/gemma.
As the Gemma model family grows, we look forward to introducing new variants for diverse applications. Stay tuned for events and opportunities in the coming weeks to connect, learn and build with Gemma.
We are excited to see what you create!