Databases

ScyllaDB Website Marketing: “ScyllaDB’s close-to-the-metal architecture handles millions of OPS with predictable single-digit millisecond latencies.”

Wide Column vs Document Sources:
- https://lp.scylladb.com/scylladb-mongodb-comparison.pdf
- https://www.scylladb.com/glossary/wide-column-database/

Data Model: “MongoDB applies a JSON based document data model, ScyllaDB a wide-column data model.”

Apache Cassandra Marketing: “Manage massive amounts of data, fast, without losing sleep”

MongoDB Marketing: “You don’t need a separate database to support transactions, rich search, or genAI. The world’s most popular document database is now the world’s most versatile developer data platform.”
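
To make the document vs. wide-column contrast concrete, a toy Python sketch; all collection/table/field names here are invented for illustration, and neither vendor’s actual API is used.

```python
# Document model (MongoDB-style): one self-contained, nested JSON-like
# document per entity; related data is embedded inside the document.
user_document = {
    "_id": "u42",
    "name": "Ada",
    "orders": [
        {"order_id": 1, "item": "keyboard"},
        {"order_id": 2, "item": "mouse"},
    ],
}

# Wide-column model (ScyllaDB/Cassandra-style): rows live under a partition
# key and are identified by a clustering key; each row holds flat columns.
orders_by_user = {
    "u42": {                       # partition key: locates the partition
        1: {"item": "keyboard"},   # clustering key -> columns for that row
        2: {"item": "mouse"},
    },
}

# Both answer "orders for user u42", but the access paths differ:
print(user_document["orders"])         # read embedded list from one document
print(orders_by_user["u42"].values())  # read all rows in one partition
```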

Foundation Models

A type of ML model that is trained on a broad dataset. These models target general, broad use cases, and can be adapted to narrower tasks through fine-tuning. Foundation models, like the ones powering OpenAI’s ChatGPT, can cost hundreds of millions of dollars to train. The term was coined in Aug. 2021 at Stanford.

Foundation models are near universally based on Transformers (see the Transformers section below).
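
Since fine-tuning is the main way a foundation model gets adapted, here is a minimal PyTorch sketch of the idea. The backbone below is a randomly initialized stand-in and all sizes/names are illustrative; real fine-tuning would load actual pretrained weights.

```python
import torch
import torch.nn as nn

# Stand-in for a pretrained backbone (randomly initialized for brevity;
# in practice you would load real pretrained weights).
backbone = nn.Sequential(nn.Linear(128, 256), nn.ReLU(), nn.Linear(256, 256))
head = nn.Linear(256, 10)  # new task-specific head, trained from scratch

# Freeze the general-purpose backbone; adapt only the head to the narrow task.
for p in backbone.parameters():
    p.requires_grad = False

opt = torch.optim.AdamW(head.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

# One illustrative training step on fake data.
x = torch.randn(32, 128)         # batch of 32 fake inputs
y = torch.randint(0, 10, (32,))  # fake labels for 10 classes
opt.zero_grad()
loss = loss_fn(head(backbone(x)), y)
loss.backward()
opt.step()
print(f"loss after one step: {loss.item():.3f}")
```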

Mamba Model

Why? Foundation models are near universally based on Transformers, whose attention scales quadratically with sequence length. Subquadratic alternatives that preceded Mamba include (see the sketch after this list):
- Linear attention
- Gated convolution
- Recurrent models
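
Mamba itself is a selective state-space model. As rough intuition for the recurrent formulation it builds on, a minimal, non-selective linear state-space recurrence in NumPy; shapes and parameter values are invented for illustration, and Mamba’s key addition, making the dynamics input-dependent (the “selection” mechanism), is omitted here.

```python
import numpy as np

def ssm_scan(A, B, C, x):
    """Toy linear state-space recurrence (not Mamba itself):
    h_t = A @ h_{t-1} + B @ x_t ; y_t = C @ h_t."""
    h = np.zeros(A.shape[0])
    ys = []
    for x_t in x:  # recurrent form: O(L) in sequence length L
        h = A @ h + B @ x_t
        ys.append(C @ h)
    return np.stack(ys)

# Toy usage: 1-D input sequence of length 16, 4-dim hidden state.
rng = np.random.default_rng(0)
A = 0.9 * np.eye(4)           # stable state transition
B = rng.normal(size=(4, 1))
C = rng.normal(size=(1, 4))
x = rng.normal(size=(16, 1))
y = ssm_scan(A, B, C, x)
print(y.shape)  # (16, 1)
```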

Staff Engineering Archetypes

Source

Tech Lead
- Focuses on partnering with a small number of Engineering Managers (1-3 generally)
- Guides the approach and execution of technical work, generally in a focused area
- A Tech Lead Manager role may also exist, but it is on the Engineering Manager track (not Staff-plus)

Architect
- Combines technical prowess and organizational leadership knowledge to guide direction, quality, and execution in critical paths

Solver
- Digs very deep into arbitrarily difficult or complex problem areas
- Very strong technical skills are necessary
- Can focus on a specific area for an extended amount of time, or can jump between “hotspots”

Right Hand
- Generally only present in very large organizations
- Acts as an extension of an executive, providing additional processing/bandwidth
- Generally acts with the scope and authority of whom they’re assisting

Transformers

What is it? A deep learning architecture based on multi-head attention. Transformers were introduced to the world through a 2017 paper by eight scientists at Google, “Attention Is All You Need”, which is widely seen as the turning point of modern artificial intelligence.

Why was it created? Earlier architectures, such as recurrent networks and long short-term memory (LSTM) networks, process sequences token by token and therefore took much longer to train. Transformers enabled more efficient, parallelizable training, and by proxy made possible the wave of LLMs (Large Language Models) we have access to today.
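
A minimal NumPy sketch of the scaled dot-product attention at the heart of the architecture, softmax(QK^T / sqrt(d_k))V from “Attention Is All You Need”; the shapes are illustrative, and a full multi-head layer would also include learned projections.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention(Q, K, V):
    """Scaled dot-product attention: softmax(Q K^T / sqrt(d_k)) V.
    Single head; multi-head attention runs several of these in parallel
    on learned projections of the input and concatenates the results."""
    d_k = Q.shape[-1]
    scores = Q @ K.swapaxes(-1, -2) / np.sqrt(d_k)  # pairwise similarities
    return softmax(scores) @ V                      # weighted mix of values

# Toy usage: sequence of 5 tokens, head dimension 8.
rng = np.random.default_rng(0)
Q = rng.normal(size=(5, 8))
K = rng.normal(size=(5, 8))
V = rng.normal(size=(5, 8))
print(attention(Q, K, V).shape)  # (5, 8)
```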