Paradex
6 days ago
## About Paradex

**Paradex** isn’t just another decentralized exchange; it’s the Everything Store for Finance. We’ve combined three powerful financial primitives—Exchange, Asset Management, and Borrow/Lend—all seamlessly composable and accessible through one unified account that uses your entire portfolio as collateral, including spot and derivative assets. Trade 250+ markets on a super exchange with zero fees, better-than-CEX liquidity, and institutional-grade privacy, all built on a blazingly fast blockchain. Backed by top-tier investors and incubated by Paradigm, we’re scaling fast and building a team of mission-driven engineers to redefine what’s possible in decentralized finance.

We have no product managers, no managers, no 1:1s at Paradex. No bullshit. Just code. You’ll work directly with our CEO and a small, elite team of builders in the trenches every day, pushing the limits of performance and UX.

Fair warning: this is not an easy place to work. We will push you hard. But you’ll grow faster than you thought possible, and you’ll do some of the most meaningful, high-impact work of your life. You’ll own what you ship end to end, tackle real technical challenges (not just UX tweaks), and help shape the future of decentralized finance. Join us on the frontline ⚔️

**Quick Facts**

- $200+ Billion Lifetime Volume
- $1.2+ Billion Average Daily Volume
- $200+ Million Total Value Locked
- Paradex is committed to building transparently in public 👉 key platform stats

## Your Mission

As a vital member of our data engineering team, you’ll architect and implement robust data infrastructure that transforms raw data into actionable insights. You’ll build scalable pipelines and data models that power critical decision-making across the organization. Your expertise will be crucial in establishing data best practices and ensuring data quality, accessibility, and reliability throughout our systems.
## What You’ll Do

- **Design & Build:** Create and maintain scalable data pipelines, ETL processes, and data warehousing solutions that handle complex data requirements.
- **Optimize & Scale:** Improve data infrastructure performance, implement data quality measures, and develop automated monitoring systems.
- **Empower & Enable:** Partner with product, engineering, and go-to-market teams to deliver data solutions that drive strategic insights.
- **Rapid Response:** Quickly address time-sensitive data needs and ad-hoc requests from stakeholders, turning around critical analyses and data solutions with urgency while maintaining accuracy.

## What You’ll Bring

- **Expertise:** 7+ years of data engineering experience.
- **Data Architecture:** Deep knowledge of data modeling, warehouse design, and ETL best practices.
- **Technical Mastery:** Proficiency with the modern data stack (Snowflake, Airflow/dbt) and AWS.
- **Systems Thinking:** Experience with distributed systems, data streaming (Kafka/Kinesis), and optimization of large-scale data workflows.
- **Agility:** Proven track record of efficiently tackling urgent data requests and providing quick solutions to business-critical data needs.

## Your Perks & Benefits

- **Competitive Pay:** Top-tier compensation in the industry.
- **Generous PTO:** Unlimited vacation.
- **Full Benefits:** Comprehensive packages tailored by country.
- **Tech Budget:** 3,000 USD for your first-year setup.
- **No manager 1:1s and no bullshit meetings.**

*When applying, mention the word CANDYSHOP to show you read the job post completely.*
Binance
6 days ago
## Data Scientist (KYB)

**Location:** Bangkok, Thailand / Jakarta, Indonesia / Taipei, Taiwan / Ho Chi Minh City, Vietnam / Hong Kong / Asia
**Department:** Engineering – Big Data
**Commitment:** Full-time, Onsite or Remote
**Workplace Type:** Remote

### About Binance

Binance is a leading global blockchain ecosystem behind the world’s largest cryptocurrency exchange by trading volume and registered users. We are trusted by 300+ million people in 100+ countries for our industry-leading security, user fund transparency, trading engine speed, deep liquidity, and an unmatched portfolio of digital-asset products. Binance offerings range from trading and finance to education, research, payments, institutional services, Web3 features, and more. We leverage the power of digital assets and blockchain to build an inclusive financial ecosystem that advances the freedom of money and improves financial access for people around the world.

We’re looking for a Data Scientist to take a leading role in developing our KYB (Know Your Business) AI chatbot, a system that directly improves customer onboarding efficiency and user experience. This position blends cutting-edge AI model development, iterative experimentation, and real-world deployment in a fast-moving fintech environment. You’ll work at the intersection of applied research and engineering — exploring the latest Agentic AI methods, rapidly prototyping ideas, and helping bring an intelligent, conversational onboarding agent into production.

### Responsibilities

- Lead the design, training, and implementation of a KYB AI chatbot to automate business identity verification and improve onboarding conversion rates.
- Rapidly explore and implement ideas from recent AI and conversational research papers, transforming them into functional prototypes.
- Collaborate with engineers and product teams to scale prototypes into robust production systems.
- Continuously evaluate and refine chatbot performance through data-driven feedback loops and user analytics.
- Tackle persistent technical challenges with creativity, persistence, and a results-driven approach.
- Communicate progress, findings, and technical insights clearly to both technical and business stakeholders.

### Qualifications

- Proven experience in Agentic AI or customer-facing chatbot development and implementation.
- Advanced coding proficiency in Python and SQL, with strong software engineering fundamentals.
- Deep understanding of conversational AI workflows, orchestration tools, and backend service integration.
- Excellent written and verbal communication skills.
- Self-driven, delivery-focused mindset with resilience in handling complex technical issues.

### Preferred Qualifications

- Systematic knowledge of AI agent evaluation processes and performance benchmarking.
- Ability to identify root causes of chatbot issues and rapidly implement iterative improvements.
- Familiarity with LLM fine-tuning, prompt optimization, or retrieval-augmented generation (RAG).
- Exposure to KYB/KYC workflows or onboarding automation systems.

### Why Binance

- Shape the future with the world’s leading blockchain ecosystem
- Collaborate with world-class talent in a user-centric global organization with a flat structure
- Tackle unique, fast-paced projects with autonomy in an innovative environment
- Thrive in a results-driven workplace with opportunities for career growth and continuous learning
- Competitive salary and company benefits
- Work-from-home arrangement (may vary depending on the nature of the team’s work)

Binance is committed to being an equal opportunity employer. We believe that having a diverse workforce is fundamental to our success.
*By submitting a job application, you confirm that you have read and agree to our **Candidate Privacy Notice**.*

*We may use artificial intelligence (AI) tools to support parts of the hiring process, such as reviewing applications, analyzing resumes, or assessing responses. These tools assist our recruitment team but do not replace human judgment. Final hiring decisions are ultimately made by humans. If you would like more information about how your data is processed, please contact us.*

**Apply for this job**

When applying, mention the word **CANDYSHOP** to show you read the job post completely.
Symbiotic
9 days ago
## About Symbiotic

Symbiotic is the universal staking protocol, enabling protocols to streamline the decentralization of their stack. Symbiotic supports fully custom or templated staking integrations leveraging any asset, with features such as slashing, redistribution, (liquid) restaking, and native staking. Over 40 teams, including Spark, Hyperlane, and Avail, are building on Symbiotic’s universal staking primitives, secured by over $1bn in TVL. You can find more information about what we're building and how we're doing it here.

## Our People

We're a tight-knit team of experienced individuals at the forefront of crypto infrastructure. Our backgrounds range from security auditing and smart contract development to node operation, and we're supported by Pantera Capital, Paradigm, and CyberFund. We're on the hunt for talented professionals to join our mission and strengthen our awesome team.

This is a fully remote role open to candidates worldwide. We prefer candidates in GMT or US timezones to ensure overlap with the core team.

**We will only consider candidates with Web3/crypto experience.**

## What You'll Do

- Analyze on-chain trading activity at scale to understand user behavior, capital flows, liquidity dynamics, and protocol performance across smart contracts and blockchain networks.
- Model address-level trading activity and portfolio behavior (positions, fees, PnL, capital efficiency, APR/APY, impermanent loss, etc.) using on-chain and market data.
- Extract and structure DEX and CEX data, integrating trade-level, liquidity, and pricing data to evaluate execution quality, market structure, and user strategies.
- Query and combine internal and external data sources (e.g., Dune, internal warehouses, APIs, exchange data) to build reliable datasets for research and decision-making.
- Conduct market-focused research on liquidity design, incentive structures, token emissions, staking models, and competitive positioning across protocols.
- Design and build AI-driven research pipelines that integrate on-chain data, market data, and LLMs to automate trading analysis, performance tracking, and recurring business workflows.
- Translate complex market data into actionable insights, supporting product decisions around incentives, liquidity growth, risk, and capital allocation.
- Collaborate with product, engineering, and business teams to evaluate strategy, optimize capital efficiency, and improve trading-related metrics.

## Who You Are

- You have a strong background in financial markets data and/or trading, with a solid understanding of portfolio management, liquidity, market structure, and risk.
- You have a strong data analysis background and are comfortable working with large, complex trading datasets.
- You understand both DEX and CEX mechanics, including AMMs, order books, liquidity provision, incentives, and fee structures.
- You are a power user of Dune, able to independently explore blockchain datasets, write and optimize complex SQL queries, and derive market insights.
- You’re proficient in SQL and Python, with hands-on experience using pandas (or similar frameworks) for portfolio, PnL, and performance analysis.
- You can move from raw trading data to structured insights on capital efficiency, strategy performance, and protocol dynamics.
- You work in an AI-native way, leveraging LLMs and modern tools to accelerate research, modeling, and workflow automation.
- You’re comfortable operating in a fast-moving, ambiguous environment, taking ownership from hypothesis to final insight.

## Nice to Have

- Experience in software development, such as data engineering, backend systems, smart contracts (Solidity), or machine learning.
- Experience building data pipelines or trading analytics infrastructure.
- Hands-on experience interacting with smart contracts or DeFi protocols directly.
- Experience collaborating closely with product or trading teams on capital strategy and performance optimization.
---

**Attention Job Seeker!**

Job scams are unfortunately common in our industry. A few things to note:

1. All interviews are conducted on Google Meet; never on Discord, Zoom, or any other platform.
2. We will always contact you from an @symbiotic.fi email address.
3. If you’re ever in doubt, reach out directly to nick@symbiotic.fi.

When applying, mention the word **CANDYSHOP** to show you read the job post completely.
Anchorage Digital
10 days ago
# Member of Compliance, Financial Crimes Compliance Data Analytics

**Location:** United States
**Department:** Compliance & Risk – FCC Program
**Commitment:** Full-Time
**Workplace Type:** Remote

---

**At Anchorage Digital, we are building the world’s most advanced digital asset platform for institutions to participate in crypto.**

Anchorage Digital is a crypto platform that enables institutions to participate in digital assets through custody, staking, trading, governance, settlement, and the industry's leading security infrastructure. Home to Anchorage Digital Bank N.A., the first federally chartered crypto bank in the U.S., Anchorage Digital also serves institutions through Anchorage Digital Singapore, Porto by Anchorage Digital, and other offerings. The company is funded by leading institutions including Andreessen Horowitz, GIC, Goldman Sachs, KKR, and Visa, with a Series D valuation of over $3 billion. Founded in 2017 in San Francisco, California, Anchorage Digital has offices in New York, New York; Porto, Portugal; Singapore; and Sioux Falls, South Dakota. Learn more at anchorage.com, on X @Anchorage, and on LinkedIn.

We are seeking a highly motivated and intellectually curious Member of Compliance, Financial Crimes Compliance Data Analytics with a strong data analysis background. As a vital member of the Compliance team, you will support the design, implementation, and optimization of compliance programs across all applicable Anchorage Digital legal entities. You will work closely with various compliance functions, particularly Financial Crimes Compliance, to drive efficiency and effectiveness within the program. Your expertise will be critical in transforming raw data into actionable insights, driving process improvements, and leveraging technology to enhance our overall compliance posture.
This role is ideal for a proactive, hungry technologist who thrives on solving complex problems in a dynamic regulatory environment. You'll gain deep exposure to diverse compliance domains and have the chance to apply your data expertise to strengthen Anchorage Digital's global compliance function through analytics, automation, and the strategic use of AI tools.

It is important that you are well-organized, have a strong analytical background, can effectively manage competing priorities, and can adapt to rapid change in a fast-paced environment. If you thrive under uncertainty and are motivated to excel in a dynamic environment with competing priorities, this role is designed for you. Anchorage Digital values individuals who are proactive, detail-oriented, and innovative.

---

## Technical Skills:

- Expert in Compliance-related data tables and models, understanding the nuances of the data, the underlying codes, and the limitations of the models. Capable of conducting reliable data analysis independently and accurately.
- Work with stakeholders to drive the automation of key compliance processes and workflows (such as Know-Your-Customer, sanctions screening, and suspicious activity identification and reporting) using internal tools, to improve efficiency and reduce manual effort.
- Experience experimenting with and deploying AI solutions. Stay current with the latest AI developments and actively seek new ways to enhance efficiency using internal AI tools or creative methods.

## Complexity and Impact of Work:

- Support the development and enhancement of BSA/AML models, including transaction monitoring, sanctions screening, customer risk rating, blockchain analytics, and other relevant BSA/AML tools.
- Contribute to process improvements and best practices for the FCC Analytics team, including code review, testing frameworks, project management, and documentation.
- Conduct ad hoc analyses and respond to time-sensitive data requests from auditors, regulators, or other Compliance teams with accuracy and speed.

## Organizational Knowledge:

- Develop a deep understanding of Anchorage Digital's business model across custody, staking, stablecoins, and trading, and how data flows through these domains and the compliance models.
- Coordinate with cross-functional teams (Product, Engineering, etc.) to implement and improve tools and processes within the broader Compliance department.
- Understand how the Compliance and FCC teams fit within the broader organizational structure, and align work with team and company priorities.

## Communication and Influence:

- Manage competing priorities across strategic projects and urgent ad-hoc requests; ask clarifying questions to scope ambiguous requests, and push back constructively when requirements are unclear, proposing alternative approaches.
- Present findings and data insights with appropriate context, visual aids, and tailored communication.

## You may be a fit for this role if you have:

- **Experience:** **2–3 years of experience in a data analytics or data science role**, with a strong understanding of blockchain, cryptocurrency, and the financial services industry.
- **Technical Proficiency:** Demonstrated ability to operate autonomously and manage multiple competing priorities. Proficient in data manipulation and analysis tools (e.g., SQL, Looker, advanced Excel/Google Sheets). Advanced SQL skills, including CTEs, window functions, complex joins, and query optimization; ability to write production-quality queries for diverse use cases.
- **Automation & AI Aptitude:** Experience with, or a strong interest in, automation principles, leveraging existing AI tools, and exploring new AI tools (such as AI agents) to enhance productivity.
- **Technologist Mindset:** While not necessarily an engineer, you should have a strong understanding of how systems, data flows, and technologies interact, along with an eagerness to learn and apply new technologies.

## Although not a requirement, bonus points if you have:

- Prior experience in the financial services industry, crypto industry, a start-up, or a fast-paced, evolving environment where you've worn multiple hats and adapted quickly.
- A general understanding of regulatory compliance, financial crimes, and risk management.
- Experience with data transformation tools like dbt or similar; familiarity with version control (Git) and software engineering best practices.
10 days ago
## Staff Systems Engineer, Enterprise Data Analytics

As a **Staff Systems Engineer, Enterprise Data Analytics**, you will lead the design, development, and optimization of mission-critical integrations across our tech ecosystem. You will work across low-code platforms and custom cloud-native services, leveraging your deep software engineering background. This role requires strong problem-solving skills and the ability to partner with key stakeholders to drive strategic collaboration and shape the future of our enterprise data backbone.

### In this role, you’ll:

* Architect, develop, and maintain automated data pipelines using a combination of low-code tools and custom AWS solutions.
* Own the provisioning and management of infrastructure for integrations by applying Infrastructure as Code principles with Terraform.
* Mentor engineers and champion best practices in software engineering, source control (Gitflow), and DevOps workflows within GitHub repositories.
* Collaborate closely with data scientists, product managers, and stakeholders to ensure integrations deliver transformative business value.
* Proactively monitor, troubleshoot, and ensure the reliability and observability of integration solutions across distributed systems.
* Design and implement robust, scalable integrations between Databricks and key enterprise business systems (NetSuite, Workday, Salesforce).
* Design and build integrations and automations using low-code/iPaaS platforms (such as Workato and Fivetran).

### We’re looking for candidates who have:

* A Bachelor’s degree in Computer Science, Engineering, or a related field, or equivalent professional experience.
* 8+ years of hands-on experience in software engineering, with expertise in backend or distributed systems in cloud environments.
* Advanced proficiency in Python or Node.js.
* Expert-level AWS experience: Lambda, Step Functions, S3, CloudWatch, VPCs, IAM.
* Deep hands-on expertise with Terraform and modern DevOps practices, including CI/CD and infrastructure lifecycle automation.
* Mastery of Git and modern source control workflows (Gitflow), using GitHub.
* Strong communication, collaboration, and leadership skills, with a history of mentorship and technical ownership.

### Technologies we use and teach:

* Python, Node.js
* AWS (Lambda, Step Functions, S3, CloudWatch, VPCs, IAM)
* Databricks, Airbyte
* Terraform
* Git, GitHub
* Salesforce, NetSuite, Workday, Marketo, Clari, Adaptive
* Windsurf

### About Chainalysis

Blockchain technology is powering a growing wave of innovation. Businesses and governments around the world are using blockchains to make banking more efficient, connect with their customers, and investigate criminal cases. As adoption of blockchain technology grows, more and more organizations seek access to all this ecosystem has to offer. That’s where Chainalysis comes in. We provide complete knowledge of what’s happening on blockchains through our data, services, and solutions. With Chainalysis, organizations can navigate blockchains safely and with confidence.

### You belong here.

At Chainalysis, we believe that diversity of experience and thought makes us stronger. With both customers and employees around the world, we are committed to ensuring our team reflects the unique communities around us, and we keep learning by continually revisiting and reevaluating our diversity culture. We encourage applicants of any race, ethnicity, gender/gender expression, age, spirituality, ability, experience, and more.

If you need any accommodations to make our interview process more accessible to you due to a disability, don't hesitate to let us know. You can learn more here. We can’t wait to meet you.

*When applying, mention the word **CANDYSHOP** to show you read the job post completely.*
Hyperbolic Labs
10 days ago
## **Who We Are**

Hyperbolic Labs is on a mission to democratize AI by breaking down the barriers to computing power with our Open-Access AI Cloud. By aggregating computing resources across the globe, we offer an innovative GPU marketplace and AI inference service that promise affordability and accessibility for all. As pioneers at the intersection of AI and open-source technology, we believe in an open future where AI innovation is limited only by imagination, not by access to resources.

We're looking for forward-thinking individuals who share our passion for making AI universally accessible, secure, and affordable. Join us in building a platform that empowers innovators everywhere to turn their visionary AI projects into reality. As we prepare for growth after our Series A, our team — led by co-founders with PhDs in AI, Math, and Computer Science — is poised to redefine computing.

## **About the Role**

We're seeking our first data analytics hire to establish and own the data foundation for our rapidly growing GPU marketplace. This is a high-impact role where you'll work directly with leadership to define, build, and deliver the critical metrics that drive business decisions across the company. You'll bridge the gap between our highly technical AI infrastructure and business needs, transforming complex data into actionable insights.

As a data-heavy compute marketplace business, we need someone who can understand the nuances of our marketplace, work independently to produce accurate metrics, and communicate insights clearly to stakeholders. You'll have the opportunity to shape our data analytics function from the ground up, with a clear path to growing into a leadership role as the team expands.
## **Who You Are**

- 4-8 years of experience in data analytics, preferably at a technical startup with a small analytics team where you had significant ownership
- Strong SQL and Python scripting skills for data manipulation and analysis
- Expert at data debugging and ensuring data correctness, with a proactive approach to data health
- Experience with dashboard development and business intelligence tools (Hex, Metabase, PostHog, or similar)
- Exceptional attention to detail, with consistent review and improvement of data systems
- Strong communication and collaboration skills, with the ability to articulate complex data insights to leadership
- Experience working with highly technical teams and understanding technical business models
- Proactive in seeking clarification when needed and comfortable navigating ambiguity
- Ability to accept direct feedback and continuously improve

## **Preferred Qualifications**

- Experience in the GPU compute infrastructure or cloud computing space
- Background in AI infrastructure or related technical domains
- Data engineering skills or familiarity with data pipeline development
- Experience at high-growth, early-stage, technical companies

*Hyperbolic is an equal opportunity employer. We celebrate diversity and are committed to creating an inclusive environment for all employees.*

When applying, mention the word **CANDYSHOP** to show you read the job post completely.
11 days ago
## **Join Tether and Shape the Future of Digital Finance**

At Tether, we’re not just building products, we’re pioneering a global financial revolution. Our cutting-edge solutions empower businesses—from exchanges and wallets to payment processors and ATMs—to seamlessly integrate reserve-backed tokens across blockchains. By harnessing the power of blockchain technology, Tether enables you to store, send, and receive digital tokens instantly, securely, and globally, all at a fraction of the cost. Transparency is the bedrock of everything we do, ensuring trust in every transaction.

## **Innovate with Tether**

* **Tether Finance:** Our innovative product suite features the world’s most trusted stablecoin, **USDT**, relied upon by hundreds of millions worldwide, alongside pioneering digital asset tokenization services. But that’s just the beginning:
* **Tether Power:** Driving sustainable growth, our energy solutions optimize excess power for Bitcoin mining using eco-friendly practices in state-of-the-art, geo-diverse facilities.
* **Tether Data:** Fueling breakthroughs in AI and peer-to-peer technology, we reduce infrastructure costs and enhance global communications with cutting-edge solutions like **KEET**, our flagship app that redefines secure and private data sharing.
* **Tether Education:** Democratizing access to top-tier digital learning, we empower individuals to thrive in the digital and gig economies, driving global growth and opportunity.
* **Tether Evolution:** At the intersection of technology and human potential, we are pushing the boundaries of what is possible, crafting a future where innovation and human capabilities merge in powerful, unprecedented ways.

## **Why Join Us?**

Our team is a global talent powerhouse, working remotely from every corner of the world. If you’re passionate about making a mark in the fintech space, this is your opportunity to collaborate with some of the brightest minds, pushing boundaries and setting new standards.
We’ve grown fast, stayed lean, and secured our place as a leader in the industry. If you have excellent English communication skills and are ready to contribute to the most innovative platform on the planet, Tether is the place for you.

**Are you ready to be part of the future?**

## **About the job**

We are seeking highly motivated MSc or PhD interns to work on video generation and multimodal video foundation models. Interns will focus on one or more components of the foundation model lifecycle and are encouraged to propose creative, research-driven ideas that advance the state of the art. You will contribute to the development and improvement of open-source video foundation models, analyze their limitations, and design scalable solutions. This is a research-focused internship with opportunities to publish at top-tier computer vision and machine learning conferences, and to work with petabyte-scale video datasets and large distributed GPU clusters with thousands of GPUs.

## **Responsibilities**

* Research and improve open-source video and multimodal video generation foundation models
* Focus on one or more areas such as pre-training, supervised fine-tuning, post-training, inference, architecture design, or evaluation
* Benchmark models against the current state of the art, identify bottlenecks, and propose novel improvements
* Work with large-scale video datasets and distributed training systems
* Collaborate with researchers and engineers on projects with clear research and publication potential

## **Minimum Qualifications**

* MSc or PhD candidate in Computer Science, Machine Learning, Computer Vision, or a related technical field
* Research topic or experience in image generation, video generation, or multimodal learning
* Awareness of open-source video foundation models and their current limitations
* Proficiency with PyTorch and modern deep learning workflows
* Strong analytical thinking, creativity, and collaboration skills
* Prior first-author publications in related venues such as CVPR, ICCV, ECCV, NeurIPS, or ICLR

## **Preferred Qualifications**

* Demonstrated related work, such as a research codebase or benchmarks released on GitHub or a similar platform
* Experience with large-scale or distributed training
* Hands-on experience with diffusion-based, transformer-based, or hybrid video generation models

## **Important information for candidates**

Recruitment scams have become increasingly common. To protect yourself, please keep the following in mind when applying for roles:

1. **Apply only through our official channels.** We do not use third-party platforms or agencies for recruitment unless clearly stated. All open roles are listed on our official careers page: https://tether.recruitee.com/
2. **Verify the recruiter’s identity.** All our recruiters have verified LinkedIn profiles. If you’re unsure, you can confirm their identity by checking their profile or contacting us through our website.
3. **Be cautious of unusual communication methods.** We do not conduct interviews over WhatsApp, Telegram, or SMS. All communication is done through official company emails and platforms.
4. **Double-check email addresses.** All communication from us will come from emails ending in **@tether.to** or **@tether.io**.
5. **We will never request payment or financial details.** If someone asks for personal financial information or payment at any point during the hiring process, it is a scam. Please report it immediately.

**When in doubt, feel free to reach out through our official website.**

When applying, mention the word **CANDYSHOP** to show you read the job post completely.
11 days ago
## **Join Tether and Shape the Future of Digital Finance** At Tether, we’re not just building products, we’re pioneering a global financial revolution. Our cutting-edge solutions empower businesses—from exchanges and wallets to payment processors and ATMs—to seamlessly integrate reserve-backed tokens across blockchains. By harnessing the power of blockchain technology, Tether enables you to store, send, and receive digital tokens instantly, securely, and globally, all at a fraction of the cost. Transparency is the bedrock of everything we do, ensuring trust in every transaction. ## **Innovate with Tether** * **Tether Finance:** Our innovative product suite features the world’s most trusted stablecoin, **USDT**, relied upon by hundreds of millions worldwide, alongside pioneering digital asset tokenization services. But that’s just the beginning: * **Tether Power:** Driving sustainable growth, our energy solutions optimize excess power for Bitcoin mining using eco-friendly practices in state-of-the-art, geo-diverse facilities. * **Tether Data:** Fueling breakthroughs in AI and peer-to-peer technology, we reduce infrastructure costs and enhance global communications with cutting-edge solutions like **KEET**, our flagship app that redefines secure and private data sharing. * **Tether Education:** Democratizing access to top-tier digital learning, we empower individuals to thrive in the digital and gig economies, driving global growth and opportunity. * **Tether Evolution:** At the intersection of technology and human potential, we are pushing the boundaries of what is possible, crafting a future where innovation and human capabilities merge in powerful, unprecedented ways. ## **Why Join Us?** Our team is a global talent powerhouse, working remotely from every corner of the world. If you’re passionate about making a mark in the fintech space, this is your opportunity to collaborate with some of the brightest minds, pushing boundaries and setting new standards. 
We’ve grown fast, stayed lean, and secured our place as a leader in the industry. If you have excellent English communication skills and are ready to contribute to the most innovative platform on the planet, Tether is the place for you.

**Are you ready to be part of the future?**

## **About the job**

We are seeking highly motivated MSc or PhD interns to work on video generation and multimodal video foundation models. Interns will focus on one or more components of the foundation model lifecycle and are encouraged to propose creative, research-driven ideas that advance the state of the art. You will contribute to the development and improvement of open-source video foundation models, analyze their limitations, and design scalable solutions. This is a research-focused internship with opportunities to publish at top-tier computer vision and machine learning conferences, and to work with petabyte-scale video datasets and large distributed GPU clusters with thousands of GPUs.

## **Responsibilities**

* Research and improve open-source video and multimodal video generation foundation models
* Focus on one or more areas such as pre-training, supervised fine-tuning, post-training, inference, architecture design, or evaluation
* Benchmark models against the current state of the art, identify bottlenecks, and propose novel improvements
* Work with large-scale video datasets and distributed training systems
* Collaborate with researchers and engineers on projects with clear research and publication potential

## **Minimum Qualifications**

* MSc or PhD candidate in Computer Science, Machine Learning, Computer Vision, or a related technical field
* Research topic or experience in image generation, video generation, or multimodal learning
* Awareness of open-source video foundation models and their current limitations
* Proficiency with PyTorch and modern deep learning workflows
* Strong analytical thinking, creativity, and collaboration skills
* Prior first-author publications in related areas at CVPR, ICCV, ECCV, NeurIPS, or ICLR

## **Preferred Qualifications**

* Demonstrated related work, such as a research codebase or benchmarks released on GitHub or similar platforms
* Experience with large-scale or distributed training
* Hands-on experience with diffusion-based, transformer-based, or hybrid video generation models

## **Important information for candidates**

Recruitment scams have become increasingly common. To protect yourself, please keep the following in mind when applying for roles:

1. **Apply only through our official channels.** We do not use third-party platforms or agencies for recruitment unless clearly stated. All open roles are listed on our official careers page: https://tether.recruitee.com/
2. **Verify the recruiter’s identity.** All our recruiters have verified LinkedIn profiles. If you’re unsure, you can confirm their identity by checking their profile or contacting us through our website.
3. **Be cautious of unusual communication methods.** We do not conduct interviews over WhatsApp, Telegram, or SMS. All communication is done through official company emails and platforms.
4. **Double-check email addresses.** All communication from us will come from emails ending in **@tether.to** or **@tether.io**.
5. **We will never request payment or financial details.** If someone asks for personal financial information or payment at any point during the hiring process, it is a scam. Please report it immediately.

**When in doubt, feel free to reach out through our official website.**

When applying, mention the word **CANDYSHOP** to show you read the job post completely.
15 days ago
## About Andreessen Horowitz (a16z)

Founded in Silicon Valley in 2009 by Marc Andreessen and Ben Horowitz, Andreessen Horowitz (aka a16z) is a venture capital firm that backs bold entrepreneurs building the future through technology. We are stage agnostic: we invest in technology companies from seed to venture to growth stage, across AI, bio + healthcare, consumer, crypto, enterprise, fintech, games, and companies building toward American dynamism. a16z has $90B under management across multiple funds.

We’ve established a team that is defined by respect for the entrepreneur and the company-building process; we know what it’s like to be in the founder’s shoes. We’ve invested in companies like Anduril, Airbnb, Coinbase, Cursor, Databricks, Deel, Figma, GitHub, Roblox, SpaceX, and Stripe. Our team is at the forefront of new technology, helping founders and their companies impact and change the world.

## The Role

The **Fund Strategy Engineering & Data Partner** plays a key role in delivering actionable, data-driven insights across all of a16z. This role will leverage Databricks, SQL, and modern software, data, and ML frameworks to construct enterprise-grade datasets, simulations, and analytical tools. The role involves designing automated data pipelines, cleaning and enriching large structured and unstructured datasets, applying advanced statistical and machine learning methods, and developing programmable analyses that address complex business challenges.

This role requires strong technical expertise in software engineering, scalable data systems, and applied analytics, paired with a deep understanding of fund mechanics and investment strategy. The Partner will collaborate with central teams such as the software group, investor relations, finance, compliance, and tax to transform complex financial datasets into predictive, decision-grade intelligence.
The Fund Strategy team focuses on four core areas:

* **New Fund Formation Modeling and Strategy** – Developing models to forecast fund dynamics
* **Fund Management** – Managing deployment pacing, portfolio construction, portfolio strategy, and life-of-fund responsibilities
* **Capital Management** – Optimizing capital allocation and financial planning
* **Cash and Stock Distributions** – Strategizing on distribution methods to maximize returns

This position demands proficiency in both the technical aspects of software and data engineering and the strategic understanding of fund management, providing a bridge between data-driven insights and executive decision-making. The ideal candidate thrives at the intersection of technical execution and business impact.

This role requires an in-office presence three days a week in our San Francisco, CA or Menlo Park, CA office.

## To join our team, you should be excited to:

* Architect and build automated systems, internal tools, and web applications that collect, classify, and analyze financial data across structured and unstructured sources. Work will be done in partnership with the firm’s software group.
* Own the end-to-end analytics lifecycle—from exploratory data analysis and hypothesis generation to model development, validation, and deployment.
* Guide the technical direction of data and AI initiatives, including architecture, roadmap priorities, and tool selection—applying first-principles thinking to fund analytics.
* Lead development of analytical and predictive systems for investment tracking, portfolio optimization, and fund-level strategy.
* Design and scale modern data pipelines and ML workflows using Fivetran, dbt, Databricks, and Hex—ensuring performant integration of financial data across cloud systems.
* Translate complex datasets into decision-grade insights by applying statistical inference, predictive modeling, and causal analysis for fund pacing, reserve management, and risk assessment.
* Deliver real-time intelligence via dashboards, APIs, and simulations—enabling faster, more consistent decision-making across investment and operating teams.
* Support pro-rata and follow-on investment analysis with scenario modeling and optimization frameworks.
* Conduct public markets analytics using econometrics, natural language processing, and quantitative modeling to inform distribution and exit strategies.
* Build and maintain positive relationships and act as a trusted technical and engineering partner to key groups across the firm, including the investment, legal, capital network, and finance teams.
* Own the design and delivery of scalable, executive-facing dashboards that integrate data from finance, ops, and engineering to surface insights on fund performance and investment outcomes for senior stakeholders.
* Support ad-hoc projects related to the firm’s priorities and initiatives.

## Minimum Qualifications

* Mastery of modern data and software engineering tools and languages, including SQL, Databricks, dbt, Delta Lake, Python, and orchestration or web frameworks.
* 7+ years of experience building scalable, data-driven software systems or internal tools that support financial decision-making, asset management, or portfolio strategy.
* Degree in Computer Science, Engineering, Data Science, Applied Math/Stats, Finance, or a similar field.
* Deep domain fluency in venture economics, fund modeling, tokenomics, pro rata mechanics, and carry waterfalls.
* Proven experience delivering analytical platforms and predictive models that inform executive decision-making.
* Strong collaborative skills and the ability to work cross-functionally across the various operational groups within a16z.
* Experience leading projects with company-wide impact.
* Experience mentoring and influencing senior engineers across organizations.
* Proven track record of planning multi-year roadmaps in which short-term projects ladder up to a long-term vision.
* Experience driving large cross-functional or industry-wide engineering efforts.
* Experience building internal platforms, APIs, or user-facing tools is a plus.
* Candidates must be authorized to work in, and be living in, the United States.
* Low ego, high empathy, and the capacity to collaborate effectively with diverse teams.

## Compensation & Benefits

The anticipated salary range for this role is between **$275,000 and $320,000**. Actual starting pay may vary based on a range of factors, which can include experience, skills, and scope. This role is eligible to participate in the a16z carry program and various discretionary bonus programs, as well as benefit and perquisite plans including health, dental, vision, disability, and life insurance, a 401(k) plan, vacation, and sick leave.

## a16z Culture

* We do only first class business and only in a first class way
* We take a long view of relationships, because we are in the relationship business
* We believe in the future and bet the firm that way
* We are all different, we recognize that, and we win
* We celebrate the good times
* We do it for the team
* We play to win

At a16z we are always looking to hire the absolute best talent and recognize that diversity in our experiences and backgrounds is what makes us stronger. We hire candidates of any race, color, ancestry, religion, sex, national origin, sexual orientation, gender identity, age, marital or family status, disability, Veteran status, and any other status. These differences are what enable us to work towards the future we envision for ourselves, our portfolio companies, and the world.

Our organization participates in E-Verify.
Odiin
15 days ago
## Job Description

You will work with blockchain data, dashboards, and analytics tools to inform strategy and product development.

### Responsibilities

* Collect, process, and analyze on-chain data and transaction history.
* Build dashboards and visualizations to track blockchain activity and KPIs.
* Identify trends, anomalies, and patterns in decentralized networks.
* Collaborate with product, engineering, and research teams to inform decisions.
* Prepare reports and presentations for stakeholders.

### Requirements

* Experience with blockchain data analytics tools (The Graph, Dune Analytics, Nansen).
* Strong SQL, Python, or data analysis skills.
* Knowledge of blockchain concepts, protocols, and tokenomics.
* Ability to interpret and visualize complex datasets.
* Strong problem-solving and communication skills.

When applying, mention the word **CANDYSHOP** to show you read the job post completely.