quantlib.definition package

Submodules

quantlib.definition.bar module

class quantlib.definition.bar.Bar(*, time: datetime, index: int, open: float, close: float, high: float, low: float)

Bases: BaseModel

close: float
high: float
high_low_by_linemode(line_mode: LineMode) float

Get the bar’s high or low value by line mode. This method exists to avoid duplicating the line-mode check in the main logic: the line mode is validated here, errors are handled here, and the bar’s high or low is returned accordingly.

Parameters:

line_mode (LineMode) – line mode: RESISTANCE or SUPPORT

Raises:

ValueError – Unexpected: Invalid Line Mode (this theoretically should never happen)

Returns:

bar’s high/low

Return type:

float
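The dispatch described above can be sketched standalone. Assumptions are labeled: the `LineMode` enum below is a hypothetical stand-in for `quantlib.constant.enum.LineMode`, and the function takes the high/low values directly instead of being a `Bar` method:

```python
from enum import Enum, auto

class LineMode(Enum):
    # Hypothetical stand-in for quantlib.constant.enum.LineMode
    RESISTANCE = auto()
    SUPPORT = auto()

def high_low_by_linemode(high: float, low: float, line_mode: LineMode) -> float:
    # Resistance lines are fit against bar highs, support lines against bar lows.
    if line_mode is LineMode.RESISTANCE:
        return high
    if line_mode is LineMode.SUPPORT:
        return low
    # Theoretically unreachable, but validated here so callers don't have to.
    raise ValueError(f"Unexpected: invalid line mode {line_mode!r}")

print(high_low_by_linemode(105.0, 99.5, LineMode.RESISTANCE))  # 105.0
```

Centralizing the check means callers that loop over both modes never branch on the mode themselves.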

index: int
low: float
model_config: ClassVar[ConfigDict] = {}

Configuration for the model, should be a dictionary conforming to [ConfigDict][pydantic.config.ConfigDict].

model_fields: ClassVar[dict[str, FieldInfo]] = {'close': FieldInfo(annotation=float, required=True, metadata=[Ge(ge=0)]), 'high': FieldInfo(annotation=float, required=True, metadata=[Ge(ge=0)]), 'index': FieldInfo(annotation=int, required=True, metadata=[Ge(ge=0)]), 'low': FieldInfo(annotation=float, required=True, metadata=[Ge(ge=0)]), 'open': FieldInfo(annotation=float, required=True, metadata=[Ge(ge=0)]), 'time': FieldInfo(annotation=datetime, required=True)}

Metadata about the fields defined on the model, mapping of field names to [FieldInfo][pydantic.fields.FieldInfo].

This replaces Model.__fields__ from Pydantic V1.

open: float
time: datetime
to_json() dict
class quantlib.definition.bar.BarHistory(*, bars: list[Bar])

Bases: Table

bars: list[Bar]
property close: Series
classmethod from_df(df: DataFrame) BarHistory

Convert a DataFrame to a BarHistory, preprocessing the loaded data with DataPreprocessor when necessary.

Parameters:

df (pd.DataFrame) – DataFrame with columns (time, open, close, high, low); the time column should be in ISO 8601 format

Returns:

a BarHistory instance

Return type:

BarHistory
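A minimal sketch of the conversion step, assuming the real DataPreprocessor validates the required columns and parses ISO 8601 timestamps. The helper name `rows_from_df` is illustrative, not part of the library:

```python
import pandas as pd

def rows_from_df(df: pd.DataFrame) -> list[dict]:
    # Hypothetical preprocessing standing in for DataPreprocessor:
    # check required columns and parse ISO 8601 strings in `time`.
    required = {"time", "open", "close", "high", "low"}
    missing = required - set(df.columns)
    if missing:
        raise ValueError(f"missing columns: {sorted(missing)}")
    df = df.assign(time=pd.to_datetime(df["time"]))
    # Each record could then be validated into a Bar model.
    return df.to_dict(orient="records")

df = pd.DataFrame({
    "time": ["2024-01-01T09:30:00", "2024-01-01T09:31:00"],
    "open": [100.0, 100.5], "close": [100.5, 101.0],
    "high": [100.6, 101.2], "low": [99.9, 100.4],
})
bars = rows_from_df(df)
print(len(bars))  # 2
```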

property high: Series
inferred_time_interval() TimeInterval

Automatically infer the time interval

Returns:

Time Interval Category

Return type:

TimeInterval
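One plausible way to infer the interval (a sketch, not necessarily the library's method): take the most common gap between consecutive bar times, which tolerates occasional gaps such as missing bars or session breaks:

```python
from datetime import datetime, timedelta

def infer_time_interval(times: list[datetime]) -> timedelta:
    # Gaps between consecutive bars; the modal gap is the bar interval.
    gaps = [b - a for a, b in zip(times, times[1:])]
    return max(set(gaps), key=gaps.count)

# Five one-minute bars with one bar missing (09:33).
times = [datetime(2024, 1, 1, 9, m) for m in (30, 31, 32, 34, 35)]
print(infer_time_interval(times))  # 0:01:00
```

The real method presumably maps the inferred gap onto a TimeInterval category rather than returning a raw timedelta.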

property low: Series
model_config: ClassVar[ConfigDict] = {}

Configuration for the model, should be a dictionary conforming to [ConfigDict][pydantic.config.ConfigDict].

model_fields: ClassVar[dict[str, FieldInfo]] = {'bars': FieldInfo(annotation=list[Bar], required=True)}

Metadata about the fields defined on the model, mapping of field names to [FieldInfo][pydantic.fields.FieldInfo].

This replaces Model.__fields__ from Pydantic V1.

model_post_init(_ModelMetaclass__context: Any) None

We need to both initialize private attributes and call the user-defined model_post_init method.

property open: Series
property time: Series

quantlib.definition.config module

class quantlib.definition.config.SimpleAnalyzerConfig(lower_search_range: int, upper_search_range: int, selection_method: quantlib.constant.enum.RSSPSelectionMethod, allow_negative_support: bool, allow_positive_resistance: bool, optimal_slope_mode: quantlib.constant.enum.OptimalSlopeMode)

Bases: object

allow_negative_support: bool
allow_positive_resistance: bool
lower_search_range: int
optimal_slope_mode: OptimalSlopeMode
selection_method: RSSPSelectionMethod
to_json_dict() dict
upper_search_range: int
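A sketch of the to_json_dict pattern for a dataclass with enum-valued fields; the `SelectionMethod` enum and the field subset here are illustrative, not the actual quantlib definitions:

```python
from dataclasses import dataclass, asdict
from enum import Enum

class SelectionMethod(Enum):
    # Hypothetical stand-in for quantlib.constant.enum.RSSPSelectionMethod
    BEST_FIT = "best_fit"

@dataclass
class Config:
    lower_search_range: int
    upper_search_range: int
    selection_method: SelectionMethod

    def to_json_dict(self) -> dict:
        # asdict keeps Enum members as-is, so convert them to their
        # .value to get a JSON-serializable dict.
        d = asdict(self)
        return {k: (v.value if isinstance(v, Enum) else v) for k, v in d.items()}

cfg = Config(10, 60, SelectionMethod.BEST_FIT)
print(cfg.to_json_dict())
# {'lower_search_range': 10, 'upper_search_range': 60, 'selection_method': 'best_fit'}
```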

quantlib.definition.line module

class quantlib.definition.line.Line(bar1: quantlib.definition.bar.Bar, bar2: quantlib.definition.bar.Bar, mode: quantlib.constant.enum.LineMode, search_distance: int)

Bases: object

bar1: Bar
bar2: Bar
distance: int
interpolate(xs: ndarray) ndarray

Linear interpolation: compute the y values on the line for the given x values.

Parameters:

xs (np.ndarray) – x values

Returns:

interpolated y values

Return type:

np.ndarray
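A standalone sketch of the two-point linear interpolation described above; the real method would derive the slope and intercept from bar1 and bar2, whereas here they come from explicit (x, y) pairs:

```python
import numpy as np

def interpolate(x1: float, y1: float, x2: float, y2: float,
                xs: np.ndarray) -> np.ndarray:
    # Line through (x1, y1) and (x2, y2): y = slope * x + intercept.
    slope = (y2 - y1) / (x2 - x1)
    intercept = y1 - slope * x1
    return slope * xs + intercept

# y values at x = 0, 5, 10 on the line through (0, 1) and (10, 2):
ys = interpolate(0.0, 1.0, 10.0, 2.0, np.array([0.0, 5.0, 10.0]))
```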

interpolate2(xs: ndarray) ndarray
mode: LineMode
optimal: bool = False
search_distance: int
slope: float
slope_perc: float
to_dict() dict
wrap: bool = False
y_diff: float
y_intercept: float
class quantlib.definition.line.LinePair(resistance: Line | None, support: Line | None)

Bases: object

A pair of resistance and support lines. Such a pair often needs to be passed around together; tuples and dicts are too flexible for this, and we should not rely on memory to keep track of which index maps to which line type.

quantlib.definition.line.line_intersect(l1: Line, l2: Line) tuple[float, float] | None

Calculate the intersection of two lines: return (x, y) if an intersection exists, None otherwise.

Parameters:
  • l1 (Line) – line 1

  • l2 (Line) – line 2

Returns:

intersection coordinate (None if no intersection/equal slope)

Return type:

tuple[float, float] | None

quantlib.definition.line.line_intersect_numeric(slope1: float, slope2: float, y_int1: float, y_int2: float) tuple[float, float]

Solve for the intersection of two lines:

y1 = a*x + b
y2 = c*x + d

Setting a*x + b = c*x + d gives a*x - c*x = d - b, i.e. x*(a - c) = d - b, so:

x = (d - b) / (a - c)
y = a * (d - b) / (a - c) + b

Parameters:
  • slope1 (float) – slope of line 1

  • slope2 (float) – slope of line 2

  • y_int1 (float) – y-axis intercept of line 1

  • y_int2 (float) – y-axis intercept of line 2

Returns:

intersection coordinate

Return type:

tuple[float, float]
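The derivation above translates directly into code. A standalone sketch, not the library's implementation (note that, unlike line_intersect, this raises ZeroDivisionError for equal slopes):

```python
def line_intersect_numeric(slope1: float, slope2: float,
                           y_int1: float, y_int2: float) -> tuple[float, float]:
    # Solving slope1*x + y_int1 == slope2*x + y_int2 gives
    # x = (y_int2 - y_int1) / (slope1 - slope2).
    x = (y_int2 - y_int1) / (slope1 - slope2)
    y = slope1 * x + y_int1
    return x, y

# Lines y = x and y = -x + 4 cross at (2, 2):
print(line_intersect_numeric(1.0, -1.0, 0.0, 4.0))  # (2.0, 2.0)
```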

quantlib.definition.result module

class quantlib.definition.result.AnalysisResult(winner_line_pair: quantlib.definition.line.LinePair, residual_x: float, time_interval: quantlib.constant.enum.TimeInterval, round1_result_dict: dict[int, quantlib.definition.result.Round1Result], current_bar_idx: int)

Bases: object

current_bar_idx: int
property df: DataFrame
residual_x: float
round1_result_dict: dict[int, Round1Result]
time_interval: TimeInterval
winner_line_pair: LinePair
class quantlib.definition.result.Round1Result(resistance_lines: list[quantlib.definition.line.Line], support_lines: list[quantlib.definition.line.Line], winner_line_pair: quantlib.definition.line.LinePair, residual_x: float | None)

Bases: object

residual_x: float | None
resistance_lines: list[Line]
support_lines: list[Line]
winner_line_pair: LinePair

quantlib.definition.schema module

quantlib.definition.table module

class quantlib.definition.table.Table

Bases: BaseModel

Base class for all tables. A pandas DataFrame is flexible but hard to maintain in a large project when the data is passed around, so we define table classes as wrappers around DataFrames. A table class defines a method (property) for each column of the DataFrame, so we can be confident while coding that the column exists. Every table should be constructible from a DataFrame via the from_df method, which each child class must implement.
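The pattern can be sketched without pydantic; `BarTable` and its single column are illustrative, not the actual quantlib classes:

```python
import pandas as pd

class BarTable:
    # Minimal sketch of the Table pattern: wrap a DataFrame and expose one
    # typed property per column, so column access is checked in code
    # rather than via string keys scattered through the project.
    def __init__(self, df: pd.DataFrame):
        self._df = df

    @classmethod
    def from_df(cls, df: pd.DataFrame) -> "BarTable":
        # Real subclasses would validate/preprocess the DataFrame here.
        return cls(df)

    @property
    def df(self) -> pd.DataFrame:
        return self._df

    @property
    def close(self) -> pd.Series:
        return self._df["close"]

t = BarTable.from_df(pd.DataFrame({"close": [100.5, 101.0]}))
print(list(t.close))  # [100.5, 101.0]
```

A typo in `t.close` fails loudly at attribute lookup, whereas a typo in `df["colse"]` only fails when that line runs.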

property df: DataFrame
abstract classmethod from_df(df: DataFrame)
property index: Series
model_config: ClassVar[ConfigDict] = {}

Configuration for the model, should be a dictionary conforming to [ConfigDict][pydantic.config.ConfigDict].

model_fields: ClassVar[dict[str, FieldInfo]] = {}

Metadata about the fields defined on the model, mapping of field names to [FieldInfo][pydantic.fields.FieldInfo].

This replaces Model.__fields__ from Pydantic V1.

model_post_init(__context: Any) None

This function is meant to behave like a BaseModel method to initialise private attributes.

It takes context as an argument since that’s what pydantic-core passes when calling it.

Parameters:
  • self – The BaseModel instance.

  • __context – The context.

Module contents