altair.XDatum

class altair.XDatum(datum, axis=Undefined, band=Undefined, impute=Undefined, scale=Undefined, stack=Undefined, type=Undefined, **kwds)

XDatum schema wrapper

Mapping(required=[])
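
As a quick, hedged sketch of how this channel is typically used (the sample DataFrame, column names, and constants below are made up for illustration):

    import altair as alt
    import pandas as pd

    # Hypothetical sample data; any tidy DataFrame would do.
    df = pd.DataFrame({"x": [1, 2, 3, 4], "y": [4, 1, 3, 2]})

    points = alt.Chart(df).mark_point().encode(x="x:Q", y="y:Q")

    # XDatum encodes a constant value in the data domain rather than a field;
    # here it places a vertical rule at x = 2.5.
    rule = alt.Chart(df).mark_rule(color="red").encode(x=alt.XDatum(2.5))

    chart = points + rule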

Attributes
axis : anyOf(Axis, None)

An object defining properties of axis’s gridlines, ticks and labels. If null, the axis for the encoding channel will be removed.

Default value: If undefined, default axis properties are applied.

See also: axis documentation.
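
For example (a minimal sketch; the constant and axis title are illustrative), passing axis=None suppresses the axis for this channel:

    import altair as alt

    # Datum channel with its axis removed.
    alt.XDatum(2.5, axis=None)

    # Or with customized axis properties.
    alt.XDatum(2.5, axis=alt.Axis(title="threshold"))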

band : float

For rect-based marks (rect, bar, and image), mark size relative to bandwidth of band scales, bins or time units. If set to 1, the mark size is set to the bandwidth, the bin interval, or the time unit interval. If set to 0.5, the mark size is half of the bandwidth or the time unit interval.

For other marks, relative position on a band of a stacked, binned, time unit or band scale. If set to 0, the marks will be positioned at the beginning of the band. If set to 0.5, the marks will be positioned in the middle of the band.

datum : anyOf(PrimitiveValue, DateTime, ExprRef, RepeatRef)

A constant value in the data domain.
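
A sketch of the accepted datum forms (the specific constants are illustrative):

    import altair as alt

    # A numeric constant in the data domain.
    alt.XDatum(2.5)

    # A date-time constant, e.g. for placing a rule on a temporal scale.
    alt.XDatum(alt.DateTime(year=2008, month=1))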

impute : anyOf(ImputeParams, None)

An object defining the properties of the Impute Operation to be applied. The field value of the other positional channel is taken as the key of the Impute Operation. The field of the color channel, if specified, is used as the groupby of the Impute Operation.

See also: impute documentation.

scale : anyOf(Scale, None)

An object defining properties of the channel’s scale, which is the function that transforms values in the data domain (numbers, dates, strings, etc) to visual values (pixels, colors, sizes) of the encoding channels.

If null, the scale will be disabled and the data value will be directly encoded.

Default value: If undefined, default scale properties are applied.

See also: scale documentation.
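
For instance (a hedged sketch; the domain values and constants are arbitrary):

    import altair as alt

    # Customize the scale used to map the constant to a pixel position.
    alt.XDatum(2.5, scale=alt.Scale(domain=[0, 10]))

    # Disable the scale so the datum is used directly as a visual value.
    alt.XDatum(250, scale=None)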

stack : anyOf(StackOffset, None, boolean)

Type of stacking offset if the field should be stacked. stack is only applicable for x, y, theta, and radius channels with continuous domains. For example, stack of y can be used to customize stacking for a vertical bar chart.

stack can be one of the following values:

- "zero" or true: stacking with baseline offset at zero value of the scale (for creating typical stacked bar and area charts; see https://vega.github.io/vega-lite/docs/stack.html#bar).
- "normalize": stacking with normalized domain (for creating normalized stacked bar and area charts).
- "center": stacking with center baseline (for streamgraphs).
- null or false: no stacking. This will produce layered bar and area charts.

Default value: zero for plots where all of the following conditions are true: (1) the mark is bar, area, or arc; (2) the stacked measure channel (x or y) has a linear scale; (3) at least one non-position channel is mapped to an unaggregated field that is different from x and y. Otherwise, null by default.

See also: stack documentation.

type : Type

The type of measurement ("quantitative", "temporal", "ordinal", or "nominal") for the encoded field or constant value (datum). It can also be a "geojson" type for encoding 'geoshape'.

Vega-Lite automatically infers data types in many cases as discussed below. However, type is required for a field if: (1) the field is not nominal and the field encoding has no specified aggregate (except argmin and argmax), bin, scale type, custom sort order, nor timeUnit; or (2) you wish to use an ordinal scale for a field with bin or timeUnit.

Default value:

1) For a data field, "nominal" is the default data type unless the field encoding has an aggregate, channel, bin, scale type, sort, or timeUnit that satisfies the following criteria:

- "quantitative" is the default type if (1) the encoded field contains bin or aggregate except "argmin" and "argmax", (2) the encoding channel is a latitude or longitude channel, or (3) the specified scale type is a quantitative scale.
- "temporal" is the default type if (1) the encoded field contains timeUnit or (2) the specified scale type is a time or utc scale.
- "ordinal" is the default type if (1) the encoded field contains a custom sort order, (2) the specified scale type is an ordinal/point/band scale, or (3) the encoding channel is order.

2) For a constant value in the data domain (datum):

- "quantitative" if the datum is a number
- "nominal" if the datum is a string
- "temporal" if the datum is a date time object

Note:

- Data type describes the semantics of the data rather than the primitive data types (number, string, etc.). The same primitive data type can have different types of measurement. For example, numeric data can represent quantitative, ordinal, or nominal data.
- Data values for a temporal field can be either a date-time string (e.g., "2015-03-07 12:32:17", "17:01", "2015-03-16", "2015") or a timestamp number (e.g., 1552199579097).
- When using with bin, the type property can be either "quantitative" (for using a linear bin scale) or "ordinal" (for using an ordinal bin scale).
- When using with timeUnit, the type property can be either "temporal" (default, for using a temporal scale) or "ordinal" (for using an ordinal scale).
- When using with aggregate, the type property refers to the post-aggregation data type. For example, we can calculate count distinct of a categorical field "cat" using {"aggregate": "distinct", "field": "cat"}. The "type" of the aggregate output is "quantitative".
- Secondary channels (e.g., x2, y2, xError, yError) do not have type as they must have exactly the same type as their primary channels (e.g., x, y).

See also: type documentation.
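
A brief sketch of supplying an explicit measurement type for a constant (the values are illustrative):

    import altair as alt

    # A quantitative constant; the type could also be inferred from the number.
    alt.XDatum(2.5, type="quantitative")

    # A date-time constant, stated explicitly as temporal.
    alt.XDatum(alt.DateTime(year=2008), type="temporal")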

__init__(datum, axis=Undefined, band=Undefined, impute=Undefined, scale=Undefined, stack=Undefined, type=Undefined, **kwds)

Methods

__init__(datum[, axis, band, impute, scale, ...])

copy([deep, ignore])

Return a copy of the object.

from_dict(dct[, validate, _wrapper_classes])

Construct class from a dictionary representation.

from_json(json_string[, validate])

Instantiate the object from a valid JSON string.

resolve_references([schema])

Resolve references in the context of this object's schema or root schema.

to_dict([validate, ignore, context])

Return a dictionary representation of the object.

to_json([validate, ignore, context, indent, ...])

Emit the JSON representation for this object as a string.

validate(instance[, schema])

Validate the instance against the class schema in the context of the rootschema.

validate_property(name, value[, schema])

Validate a property against property schema in the context of the rootschema.
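
As a rough illustration of the inherited serialization helpers listed above (the exact dictionary output is indicative, not guaranteed):

    import altair as alt

    x = alt.XDatum(2.5, type="quantitative")

    # Serialize to the underlying Vega-Lite encoding fragment,
    # e.g. {'datum': 2.5, 'type': 'quantitative'} (may vary by version).
    spec = x.to_dict()

    # Reconstruct an equivalent object from that dictionary.
    x2 = alt.XDatum.from_dict(spec)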