Google has released a new open-source artificial intelligence (AI) model that can run locally on as little as 2GB of RAM. Dubbed Gemma 3n, it is built on the company's new MatFormer architecture and uses Per-Layer Embeddings (PLE) to keep memory usage low without sacrificing performance. Notably, the model accepts multimodal input but generates only text output.
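For readers who want to try such a model on their own machine, below is a minimal Python sketch using the Hugging Face transformers text-generation pipeline. The model ID and pipeline support shown here are assumptions for illustration, based on how earlier Gemma releases were distributed, and are not details confirmed in this article.

    # Minimal sketch: running a Gemma-class model locally with the
    # Hugging Face transformers library.
    # Assumption: the checkpoint is published under an ID like the one
    # below and works with the standard text-generation pipeline.
    from transformers import pipeline

    generator = pipeline(
        "text-generation",
        model="google/gemma-3n-E2B-it",  # assumed model ID; verify against the official release
    )

    result = generator(
        "Explain Per-Layer Embeddings in one paragraph.",
        max_new_tokens=100,
    )
    print(result[0]["generated_text"])

Since the model only generates text, any image or audio input would be handled on the input side; the output of a call like this is always a plain text string.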

