In this paper, I will attempt to establish a framework for representation in virtual worlds that may allow input data from many different scales to be merged with virtual physics. For example, a typical virtual environment must effectively handle user input, sensor data, and virtual world physics, all in real time. Merging all of these data into a single interactive system requires adapting approaches from topological methods such as n-dimensional relativistic representation. A number of hypothetical examples will be provided throughout the paper to clarify the technical challenges that must be overcome to realize this vision. The long-term goal of this work is for truly invariant representations to result from establishing formal, inclusive relationships between these different domains. Using this framework, incomplete information in one or more domains can be compensated for by parallelism and mappings within the virtual world representation. To introduce this approach, I will review recent developments in embodiment, virtual world technology, and neuroscience relevant to the control of virtual worlds. I will then borrow ideas from fields such as brain science, applied mathematics, and cosmology to place this approach in proper perspective. A simple demonstration will then be given using an intuitive example of physical relativism. Finally, future directions for the application of this method will be considered.
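To make the merging and gap-filling idea more concrete, the sketch below shows one minimal way a shared representation might hold observations from several domains and fall back on cross-domain mappings when a domain has no current data. This is an illustrative assumption, not the paper's formal framework: the domain names, the `SharedRepresentation` class, and the scalar-valued mappings are all hypothetical simplifications introduced here.

```python
from dataclasses import dataclass, field
from typing import Callable, Dict, Optional, Tuple

# Hypothetical domain labels; the framework described in the paper is more general.
DOMAINS = ("user_input", "sensor", "physics")


@dataclass
class SharedRepresentation:
    """Toy merger of per-domain observations into a single state.

    Each domain contributes one scalar feature for simplicity; cross-domain
    mappings are plain callables that estimate a missing domain's value
    from another domain's current value.
    """
    state: Dict[str, Optional[float]] = field(
        default_factory=lambda: {d: None for d in DOMAINS}
    )
    # mappings[(src, dst)] estimates dst's value from src's value
    mappings: Dict[Tuple[str, str], Callable[[float], float]] = field(default_factory=dict)

    def observe(self, domain: str, value: float) -> None:
        """Record a new observation for one domain."""
        self.state[domain] = value

    def merged(self) -> Dict[str, Optional[float]]:
        """Return a completed state, filling gaps via cross-domain mappings."""
        filled = dict(self.state)
        for dst, value in filled.items():
            if value is not None:
                continue
            # Compensate for the missing domain using any available mapping.
            for src, src_value in self.state.items():
                fn = self.mappings.get((src, dst))
                if src_value is not None and fn is not None:
                    filled[dst] = fn(src_value)
                    break
        return filled


if __name__ == "__main__":
    rep = SharedRepresentation()
    # Assume, purely for illustration, that the sensor reading roughly predicts
    # the physics state when no physics update arrives this frame.
    rep.mappings[("sensor", "physics")] = lambda x: 0.5 * x
    rep.observe("user_input", 1.0)
    rep.observe("sensor", 2.0)  # no physics observation this frame
    print(rep.merged())         # physics gap filled from the sensor mapping
```

In this toy setting, the mappings play the role of the formal relationships between domains mentioned above: when one input stream is incomplete, another stream's data are projected into the missing domain rather than leaving the representation undefined.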