
Incorporate Posit Support #57724

@Afonso-2403

Description


🚀 Feature

Incorporate support for the Posit number format as another dtype in PyTorch

Motivation

The Posit number format (introduced in Beating Floating Point at its Own Game) has been sparking a lot of interest in the research community lately, in particular for its potential in low-precision training and inference of neural networks.
Researchers have been building their own deep-learning frameworks on top of posits, some based on PyTorch (such as PositNN), others on TensorFlow (such as Deep PeNSieve).
However, it would be much nicer if PyTorch supported posits natively, so that they could be used directly for research without the need to build a framework from scratch.
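To make the format concrete, here is a minimal pure-Python decoder for 8-bit posits (a sketch for illustration only, not a proposed PyTorch implementation; in practice a library like Universal would provide this). It follows the standard posit layout: sign bit, regime run, `es` exponent bits, then fraction bits with an implicit leading 1.

```python
def decode_posit8(bits: int, es: int = 0) -> float:
    """Decode an 8-bit posit with `es` exponent bits into a float."""
    n = 8
    bits &= 0xFF
    if bits == 0:
        return 0.0
    if bits == 0x80:                      # Not-a-Real (NaR)
        return float("nan")

    sign = -1.0 if bits & 0x80 else 1.0
    if sign < 0:                          # negative posits decode via two's complement
        bits = (-bits) & 0xFF

    # Bits after the sign bit, most significant first.
    body = [(bits >> i) & 1 for i in range(n - 2, -1, -1)]

    # Regime: a run of identical bits encodes the power of useed = 2**(2**es).
    first = body[0]
    run = 0
    while run < len(body) and body[run] == first:
        run += 1
    k = (run - 1) if first == 1 else -run
    idx = run + 1                         # skip the terminating (opposite) bit

    # Exponent: up to `es` bits, treated as zero if the posit is too short.
    exp = 0
    for _ in range(es):
        exp = (exp << 1) | (body[idx] if idx < len(body) else 0)
        idx += 1

    # Fraction: whatever bits remain, with an implicit leading 1.
    frac, scale = 1.0, 0.5
    while idx < len(body):
        frac += body[idx] * scale
        scale /= 2.0
        idx += 1

    return sign * frac * 2.0 ** (k * (1 << es) + exp)
```

Because the regime run has variable length, precision tapers away from 1.0: `0b01000000` decodes to 1.0, `0b01001000` to 1.25, while `0b01111111` reaches maxpos = 64 for es = 0. This tapered accuracy is the property that makes posits attractive for low-precision training.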

Ideas For Implementation

Following Issue #52673 and after some discussion with @RaulMurillo, I believe the best way to do this is to support the different posit types as additional dtypes, backed by an external library such as Universal, and to register kernel implementations with the dispatcher for each method.
Issue #755 and Issue #33152 hold a history of the process that complex numbers went through until they became supported, and a similar process could be followed for posits.
Since I am quite inexperienced with PyTorch and its internals, I would love some insight on this suggestion: whether it makes sense to go this way, whether there are obstacles I should be aware of, etc.
@albanD @ezyang (and other people interested), is this a good way forward?

Metadata


Assignees

No one assigned

    Labels

    needs research — We need to decide whether or not this merits inclusion, based on research world
    triaged — This issue has been looked at by a team member, and triaged and prioritized into an appropriate module
