
Vector Time Series, Cross-Sectional Normalization, and Spherical Wandering
Introduction
In many modern applications—finance, signal processing, neuroscience, and machine learning—we encounter vector time series: sequences of observations where each time point $t$ is associated with a vector $x_t \in \mathbb{R}^d$.
Unlike scalar time series, these objects encode both magnitude and directional information, and the interplay between the two often carries the signal of interest.
What is a Vector Time Series?
A vector time series is a sequence
$$x_1, x_2, \dots, x_T, \qquad x_t \in \mathbb{R}^d.$$
Each observation contains:
- A length (norm) $\|x_t\|$, capturing scale or intensity
- A direction $u_t = x_t / \|x_t\|$, capturing cross-sectional structure
These two components often behave very differently over time.
Cross-Sectional Normalization
A common transformation is cross-sectional normalization, where each vector is rescaled to unit length:
$$u_t = \frac{x_t}{\|x_t\|}.$$
This maps the series onto the unit sphere $S^{d-1}$, removing magnitude and preserving only direction.
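As a minimal sketch (the NumPy array layout—one row per time point—is my choice, not specified in the text), normalization is a one-line operation:

```python
import numpy as np

rng = np.random.default_rng(0)
T, d = 100, 5
X = rng.normal(size=(T, d))   # toy vector time series, one row per time point

norms = np.linalg.norm(X, axis=1, keepdims=True)   # ||x_t|| for each t
U = X / norms                                      # u_t = x_t / ||x_t||

# every row of U now lies on the unit sphere S^{d-1}
print(np.allclose(np.linalg.norm(U, axis=1), 1.0))  # True
```

The `keepdims=True` keeps the norms as a column so the division broadcasts row-wise.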
When Should You Normalize?
Cross-sectional normalization is appropriate when:
1. Scale is Uninformative or Noisy
If variation in $\|x_t\|$ is dominated by noise or irrelevant factors, normalization helps isolate the signal.
2. Only Relative Structure Matters
When the pattern across components is key (e.g., portfolio weights, normalized signals), direction is sufficient.
3. Multiplicative Effects Dominate
If the system has the form
$$x_t = s_t\, v_t, \qquad s_t > 0,\; \|v_t\| = 1,$$
normalization removes the scalar factor $s_t$, isolating $v_t$.
4. Geometric Constraints Are Useful
Some models benefit from bounded inputs or angular structure (e.g., cosine similarity, spherical clustering).
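For the multiplicative case above, a quick numerical check (the synthetic data here is my own construction) confirms that normalization recovers the directional factor regardless of the scalar:

```python
import numpy as np

rng = np.random.default_rng(1)
d = 4
v = rng.normal(size=d)
v /= np.linalg.norm(v)        # fixed unit direction v_t = v

s = rng.lognormal(size=50)    # positive scalar factors s_t
X = s[:, None] * v            # multiplicative model: x_t = s_t * v

U = X / np.linalg.norm(X, axis=1, keepdims=True)
# every normalized vector equals v: the scalar factor is gone
print(np.allclose(U, v))      # True
```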
When Not to Normalize
Avoid normalization when:
- Magnitude carries predictive information
- Scale itself is the signal (e.g., volatility, energy)
- Small vectors would be disproportionately amplified
A useful check: test whether $\|x_t\|$ alone has predictive power.
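One simple version of this check (lag-1 autocorrelation of the norm series is my choice of diagnostic; the function name is hypothetical) compares a series with no temporal structure against one whose magnitude drifts slowly:

```python
import numpy as np

def norm_autocorr(X, lag=1):
    """Lag-`lag` autocorrelation of the norm series ||x_t|| -- one simple
    diagnostic for whether magnitude alone carries temporal structure."""
    n = np.linalg.norm(X, axis=1)
    n = n - n.mean()
    return np.dot(n[:-lag], n[lag:]) / np.dot(n, n)

rng = np.random.default_rng(2)
X_iid = rng.normal(size=(2000, 3))        # no temporal structure
print(round(norm_autocorr(X_iid), 2))     # near 0

# persistent magnitude: slowly varying scale on a noisy direction
scale = np.exp(np.cumsum(rng.normal(scale=0.05, size=2000)))
X_per = scale[:, None] * rng.normal(size=(2000, 3))
print(norm_autocorr(X_per) > 0.2)         # strongly autocorrelated norms
```

If the second kind of structure shows up in your data, normalizing it away discards real signal.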
Geometry After Normalization
After normalization, each observation is a point $u_t \in S^{d-1}$, and the system becomes purely directional:
- Euclidean distance → angular distance
- Covariance → alignment
- Dynamics → motion on the sphere
This is a shift from standard vector analysis to spherical geometry.
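On the sphere, the natural notion of distance is the angle between directions. A small sketch (the function name is mine):

```python
import numpy as np

def angular_distance(u, v):
    """Geodesic (great-circle) distance between unit vectors: the angle
    arccos(<u, v>), with the dot product clipped for numerical safety."""
    return np.arccos(np.clip(np.dot(u, v), -1.0, 1.0))

e1 = np.array([1.0, 0.0, 0.0])
e2 = np.array([0.0, 1.0, 0.0])

print(np.isclose(angular_distance(e1, e1), 0.0))        # identical directions
print(np.isclose(angular_distance(e1, e2), np.pi / 2))  # orthogonal directions
print(np.isclose(angular_distance(e1, -e1), np.pi))     # opposite directions
```

The clipping matters in practice: floating-point dot products of unit vectors can land slightly outside $[-1, 1]$ and make `arccos` return NaN.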
The No-Predictability Regime
Suppose there is no temporal predictability:
$$\mathbb{E}[u_{t+1} \mid u_t, u_{t-1}, \dots] = \mathbb{E}[u_{t+1}].$$
Then the process behaves like independent draws on the sphere.
Uniform Case
If
$$u_t \overset{\text{iid}}{\sim} \mathrm{Unif}(S^{d-1}),$$
then the sequence exhibits no structure, no persistence, and no preferred direction.
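This regime is easy to simulate: normalizing i.i.d. standard Gaussian vectors is the standard construction of uniform draws on the sphere (the function name below is mine):

```python
import numpy as np

def uniform_on_sphere(T, d, rng):
    """Draw T independent points uniformly on S^{d-1}: a standard-normal
    vector normalized to unit length is uniformly distributed on the sphere."""
    X = rng.normal(size=(T, d))
    return X / np.linalg.norm(X, axis=1, keepdims=True)

rng = np.random.default_rng(3)
U = uniform_on_sphere(5000, 3, rng)

# no preferred direction: the sample mean direction shrinks toward 0
print(np.linalg.norm(U.mean(axis=0)) < 0.05)   # True for large T
```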
Spherical Wandering
In the absence of predictability, the normalized series undergoes what can be thought of as:
Spherical wandering
Each vector is simply another direction on the sphere, with no memory of the past.
Key Properties
1. Near Orthogonality
In high dimensions,
$$\langle u_t, u_{t+1} \rangle \approx 0,$$
with typical fluctuations of size $1/\sqrt{d}$: successive vectors are nearly orthogonal.
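A short simulation (sample sizes are my choice) makes the $1/\sqrt{d}$ scaling visible: as the dimension grows, inner products of successive independent unit vectors shrink toward zero.

```python
import numpy as np

rng = np.random.default_rng(4)
for d in (3, 30, 3000):
    X = rng.normal(size=(2000, d))
    U = X / np.linalg.norm(X, axis=1, keepdims=True)
    # inner products of successive (independent) unit vectors
    dots = np.einsum('ij,ij->i', U[:-1], U[1:])
    print(d, round(float(np.abs(dots).mean()), 3))  # shrinks like 1/sqrt(d)
```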
2. Concentration of Measure
Random vectors on the sphere:
- Spread uniformly
- Appear almost independent
- Rarely align strongly
3. Illusion of Structure
Even random spherical data can:
- Look clustered in projections
- Show transient alignments
- Produce misleading patterns
Interpretation
Normalization fundamentally changes the modeling problem:
| Before Normalization | After Normalization |
|---|---|
| Magnitude + Direction | Direction only |
| Euclidean geometry | Spherical geometry |
| Scale-dependent models | Angular models |
If predictability disappears after normalization, then:
- The system has no directional memory
- Any signal lies in magnitudes or higher-order effects
- The observed process is effectively noise on the sphere
Conclusion
Cross-sectional normalization is more than a preprocessing step—it is a geometric transformation.
When predictability remains, it reveals directional structure.
When it vanishes, what remains is something simpler and more fundamental:
A sequence of vectors wandering across the sphere,
exploring directions without memory—
like a stochastic spherical code unfolding in time.