TimeMatters

(c) 2010 Brian Mork. See ConditionsForUse.

I do research and consulting to leverage time to advantage. Interestingly, in the Middle Ages, time was considered property owned by God. For example, interest on money was never charged, because that would be doing commerce with God's property. In contemporary society, financial communities are the only ones that keenly appreciate the utility of time. Perhaps if others were interested in being good stewards of God's property, they could see the value of time in other areas!

Themes to ponder.

Each has been, or could be, developed into a complete study effort.
  • In the 2009 RAND Corp report titled "Cyberdeterrence and Cyberwar", author Martin Libicki identifies 4 or 5 foundational concepts. One is "Something that works today may not work tomorrow (indeed, precisely because it did work today)."
  • When data acquisition hardware isn't fast enough to slice up one transient event, thousands of time slices can instead be collected from thousands of different events. The assumption? All the events are the same. I first learned of this as boxcar averaging in Malmstadt's "Electronics for Scientists" course. (A small sampling sketch follows this list.)
  • Meta-web pages accumulate and/or present information from other web pages. When the meta-page relies on the underlying content rather than its mere presence, the quality of the page erodes over time with no change to the meta-page itself. (For example, a search engine doesn't depend on content, it just accommodates that other pages exist; a meta-textbook requires orchestration of the underlying content.) Time changes the underlying network-available resources (web pages, databases, user profiles, whatever), so such a page requires constant administration. In the field of education, an author cannot create something and put it on a shelf the way they can a conventional textbook. What are the personnel and dollar resources necessary to keep such a web presence from "rusting" into oblivion? (A link-rot/content-drift checker is sketched after this list.)
  • Correlation functions (sliding two time sequences against each other to find where they match) are at the heart of cell phone technology, distinguishing between different GPS satellites, computer WiFi technology, and aviation radar. (See the correlation sketch after this list.)
  • The IF (intermediate frequency) stage of any radio receiver can be thought of as a "single frequency" FFT - in other words, the large range of signals tangled together in the inbound signal is selectively processed for just one oscillation rate. (A numerical sketch of this follows the list.)
  • Low-band detection in the first Comm Quarterly cover article - similar to IF detection: if the baseband RF is low enough in frequency, it can be filtered as if it were an IF. That technical article uses a computer to do the signal multiplication, and gives construction instructions for making your own computer-controlled transceiver using tremendously low-band RF, similar to the frequencies used to reach nuclear submarines. Data rates are insanely low, but the links are famously reliable.
  • The 1/sqrt(N) article in Comm Quarterly - averaging multiple samples of something yields a more precise measurement, and, if you have eliminated systematic offsets, a more accurate one. The random noise of any measurement process can be reduced by a factor of sqrt(N) by taking N measurements and averaging them together. Given unlimited time, any measurement can be made arbitrarily precise. Ponder what this means about knowing God. (A short numerical demonstration follows this list.)
  • Defense News (July 14th, 2008, pg 42) reports on quantum lasers and quartz crystals being able to detect fingerprints laced with explosive material at 100 meters. The key philosophical design issue is synchronous excitation and detection. Sensitivity is micrograms per square inch (hundreds of times less material than a full fingerprint) for explosives such as TNT, RDX, and PETN. Laser pulses (scanned, or at least at multiple selected wavelengths) are modulated at 32 kHz, resonant with the crystal, and converted to sound and then to electrical current. Download the PDF (102 kB, 2 pages). (A lock-in detection sketch follows this list.)
  • Research "call-back command" architectures (a heartbeat-style sketch follows this list). Why?
    • Self-destruct commands where the command link might be destroyed before the test article.
    • Communication channel insensitivity
    • Timeliness of action (lack of latency)
    • Integrity of behavior (locality)
    • The B-52 portion of the U.S. nuclear triad during the Cold War.
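Regarding the boxcar-averaging item above: a minimal numerical sketch of the idea, assuming the event really is repeatable. The pulse shape, noise level, and sample counts are invented purely for illustration.

```python
# Boxcar-style sampling sketch: reconstruct a fast transient from many events,
# taking only ONE sample per event at a swept delay (illustrative numbers).
import numpy as np

rng = np.random.default_rng(0)

def transient(t):
    """The 'true' repeatable event: a fast-rising, exponentially decaying pulse."""
    return np.exp(-t / 2e-6) * (1 - np.exp(-t / 0.2e-6))

delays = np.linspace(0, 10e-6, 200)        # sample instants swept across the event
events_per_delay = 1000                    # how many separate events feed each slot

reconstruction = []
for d in delays:
    # One noisy sample from each of many nominally identical events.
    shots = transient(d) + rng.normal(0.0, 0.5, events_per_delay)
    reconstruction.append(shots.mean())    # boxcar average for this time slot

reconstruction = np.array(reconstruction)
# The averaged trace tracks the true waveform even though no single event was
# ever sampled more than once -- valid only if every event really is the same.
print(np.max(np.abs(reconstruction - transient(delays))))
```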
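Regarding the meta-page item: one hedged sketch of what "constant administration" might look like in code - a checker that notices when underlying resources vanish or change. The URLs and baseline hashes are placeholders, not real resources.

```python
# Content-drift check for a meta-page: detect when underlying resources
# disappear or change, even though the meta-page itself never changed.
import hashlib
import urllib.error
import urllib.request

# Placeholder URLs and baseline hashes recorded when the meta-page was built.
baseline = {
    "http://example.com/chapter1.html": "d41d8cd98f00b204e9800998ecf8427e",
    "http://example.com/dataset.csv":   "0cc175b9c0f1b6a831c399e269772661",
}

for url, expected_md5 in baseline.items():
    try:
        body = urllib.request.urlopen(url, timeout=10).read()
    except (urllib.error.URLError, OSError):
        print(f"GONE:    {url}")
        continue
    actual_md5 = hashlib.md5(body).hexdigest()
    status = "OK" if actual_md5 == expected_md5 else "CHANGED"
    print(f"{status}: {url}")
```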
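Regarding the correlation-function item: a small sketch of pulling timing out of noise by cross-correlating against a known pseudorandom code, loosely in the spirit of GPS ranging. The code length, delay, and noise level are arbitrary.

```python
# Cross-correlation sketch: find a known pseudorandom code buried in noise,
# the same trick GPS receivers and radars use to pull timing out of clutter.
import numpy as np

rng = np.random.default_rng(1)

code = rng.choice([-1.0, 1.0], size=1023)      # a GPS-like pseudorandom chip sequence
true_delay = 337                               # unknown to the receiver

received = np.zeros(4096)
received[true_delay:true_delay + code.size] += code
received += rng.normal(0.0, 2.0, received.size)   # the code sits well below the noise

# Slide the known code across the received signal and score the match at each lag.
correlation = np.correlate(received, code, mode="valid")
estimated_delay = int(np.argmax(correlation))

print(estimated_delay)   # recovers 337 despite the poor signal-to-noise ratio
```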
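Regarding the IF-as-single-frequency-FFT item (and the low-band computer-multiplication article two items later): a sketch showing that mixing with a local oscillator and averaging recovers just one oscillation rate, numerically the same as evaluating a single DFT bin. The sample rate and tone frequencies are made up.

```python
# "Single-frequency FFT" sketch: mixing an incoming signal with a local
# oscillator and low-pass filtering recovers just one oscillation rate.
import numpy as np

fs = 48_000                       # sample rate, Hz (arbitrary for illustration)
t = np.arange(fs) / fs            # one second of samples

# A tangle of inbound signals: three tones plus noise.
signal = (0.7 * np.sin(2 * np.pi * 1_000 * t)
          + 1.0 * np.sin(2 * np.pi * 5_000 * t)
          + 0.5 * np.sin(2 * np.pi * 9_000 * t)
          + np.random.default_rng(2).normal(0, 1, t.size))

f_target = 5_000                  # the one frequency we care about

# Multiply by a complex local oscillator and average: the average acts as a
# crude low-pass filter, leaving only the component at f_target.
lo = np.exp(-2j * np.pi * f_target * t)
bin_value = np.mean(signal * lo)

print(2 * abs(bin_value))         # ~1.0, the amplitude of the 5 kHz tone
```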
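Regarding the 1/sqrt(N) item: a quick numerical demonstration that the scatter of an N-sample average shrinks as 1/sqrt(N).

```python
# 1/sqrt(N) sketch: the scatter of an average of N noisy readings shrinks
# as 1/sqrt(N), so more time (more samples) buys more precision.
import numpy as np

rng = np.random.default_rng(3)
true_value, sigma = 10.0, 1.0

for n in (1, 4, 16, 64, 256, 1024):
    # Repeat the "average of n readings" experiment many times and
    # measure how much those averages scatter.
    averages = rng.normal(true_value, sigma, size=(10_000, n)).mean(axis=1)
    print(f"N={n:5d}  scatter of the average = {averages.std():.4f}  "
          f"(theory: {sigma / np.sqrt(n):.4f})")
```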
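Regarding the explosive-detection item: the "synchronous excitation and detection" idea is essentially lock-in detection. This is a generic sketch of that technique, not the instrument from the article; the 32 kHz figure is borrowed from the report, everything else is invented.

```python
# Synchronous (lock-in) detection sketch: a tiny response modulated at a known
# 32 kHz excitation rate is pulled out from noise ~100x larger, because only
# signals synchronous with the excitation survive the averaging.
import numpy as np

fs = 1_000_000                        # sample rate, Hz (illustrative)
f_mod = 32_000                        # excitation / modulation frequency, Hz
t = np.arange(fs) / fs                # one second of data

rng = np.random.default_rng(4)
response_amplitude = 0.01             # the tiny modulated response we want
phase = 0.6                           # its unknown phase

measured = (response_amplitude * np.sin(2 * np.pi * f_mod * t + phase)
            + rng.normal(0.0, 1.0, t.size))   # broadband noise ~100x larger

# Multiply by in-phase and quadrature references locked to the excitation,
# then average; only components synchronous with 32 kHz survive.
i = 2 * np.mean(measured * np.sin(2 * np.pi * f_mod * t))
q = 2 * np.mean(measured * np.cos(2 * np.pi * f_mod * t))

print(np.hypot(i, q))                 # ~0.01: the buried response amplitude
```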
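Regarding the call-back command item: a hedged sketch of one possible dead-man style architecture, where continued successful call-backs are what keep the preplanned action from firing; lose the link and the action still happens. The function names and timing constants are hypothetical, not any real range-safety system.

```python
# "Call-back" (dead-man) sketch: instead of waiting for a destruct command that
# a broken link could never deliver, the test article must keep successfully
# calling back; silence past a deadline triggers the preprogrammed action.
import random
import time

CALL_BACK_PERIOD_S = 1.0     # how often the article tries to call home
GRACE_PERIOD_S = 3.0         # how long silence is tolerated before acting

def check_in_with_range() -> bool:
    """Placeholder for the real uplink; here it randomly 'loses' the link."""
    return random.random() > 0.4

def terminate_flight() -> None:
    print("grace period expired with no contact: executing preplanned action")

def run_watchdog() -> None:
    last_contact = time.monotonic()
    while True:
        if check_in_with_range():
            last_contact = time.monotonic()
            print("call-back OK")
        elif time.monotonic() - last_contact > GRACE_PERIOD_S:
            terminate_flight()
            return
        else:
            print("call-back missed, still inside grace period")
        time.sleep(CALL_BACK_PERIOD_S)

if __name__ == "__main__":
    run_watchdog()
```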
When does the factor of time change what is happening?
  • I happily use the SMB file protocol between Mandriva 2008 (Linux) and Windows XP. Then, after some period of time, perhaps after one of the systems reboots, I am no longer able to access the Linux shares. If I reboot both computers, accessing the Linux computer works fine again, after it promptly queries me for a username and password.
  • Two foundational concepts for understanding computer and information security: value and time. Rather than build up value, a target can be made unattractive by distributing value. “The key is not to pour money into protecting information, but to develop a global approach to neutralizing its value. By creating secrets, we have created value, which is pursued by opportunists.”—John M. Brock. Instead of playing with value, one can improve security (or lower the effect of it failing) by making acquisition of the information very slow. Every security measure is defeatable; it's just an issue of time. Make them spend so much time (effort) on the problem that they want to go elsewhere. The other half of the two-parameter play space is to make the information itself lose value very quickly. An example would be changing your password each day, so that even if it's stolen, not much can be gotten with it. (A rough numerical sketch of this trade follows the list.)
    • Normal methods burden the authorized user: passwords are an operations burden; certificates are an infrastructure burden.
    • Let them have pieces of information that are wrong or useless alone. Move the intelligence from the file content to the file's meta-presence. Steganography is a nascent implementation of a similar concept, but think wider, across a global network. We could quickly move to meta table-of-contents schemes, but then we've just displaced the information value. How do we diversify value with subsumption, the way a GPS constellation offers something that no individual satellite can, or the way an anthill is unpredictable from studying individual ants?
  • From the Test Pilot School Flight Qualities manual: "Consider a reference frame that is located on the surface of the earth, on your desk for example, with axes that rotate with the earth. Is this an inertial reference frame? If you place a marble on your desk, apply a force to it, and measure its motion for ten seconds, you will find that [Newton's second law] describes that motion quite accurately. What can we deduce from this? Over the course of a day, it is clear that distant stars will move 360 degrees relative to observers in this desktop reference frame. But during the 10 seconds in which you measured the motion of the marble, the stars would not have appeared to move at all. Clearly, this is not a truly inertial reference frame, because it is rotating and accelerating in inertial space. Yet during the ten seconds when measurements were being taken, it served as an inertial reference frame." (A back-of-the-envelope check on this follows the list.)
  • Handling qualities determination of aircraft requires quantification of PIO (pilot-induced oscillation) characteristics. What if PIO occurs at one point in the task and not in a different part of the task? How is the airplane quantitatively evaluated? PIO ratings can be given for each point in time, but a Cooper-Harper rating is appropriate only for the entire task. How do we handle time-domain slices of truth when the desired evaluation is over the entire aggregate task?
  • Traditional handling qualities of aircraft show instabilities when there are time delays on the order of milliseconds. Now consider remotely piloted aircraft (RPA). When there is a 30-second time delay in cueing a sensor, is that a handling qualities issue or a flight mission management issue? (A toy delay-instability simulation follows this list.)
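Regarding the value-and-time item above: a rough numerical sketch of the rotation trade - how shortening a secret's useful life shrinks a brute-force attacker's odds. The keyspace and guess rate are illustrative assumptions only, not a real threat model.

```python
# Time-vs-value sketch: a stolen-or-guessed secret is only worth what can be
# done with it before it is rotated, so shrinking the rotation window shrinks
# the attacker's odds inside any one window.
def odds_of_cracking_in_window(keyspace: float, guesses_per_hour: float,
                               rotation_hours: float) -> float:
    """Chance a brute-force search covers the right key inside one window."""
    return min(1.0, guesses_per_hour * rotation_hours / keyspace)

keyspace = 62 ** 8              # e.g. an 8-character alphanumeric password
rate = 1e9 * 3600               # a hypothetical billion guesses per second

for rotation_hours in (24 * 365, 24 * 30, 24, 1):
    p = odds_of_cracking_in_window(keyspace, rate, rotation_hours)
    print(f"rotate every {rotation_hours:6.0f} h -> attacker odds per window ~ {p:.3f}")
```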
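Regarding the desktop-reference-frame quote above: a back-of-the-envelope check of how little Earth's rotation matters over ten seconds. The marble speed is an assumption; latitude and the centrifugal term are ignored.

```python
# How far does Earth's rotation deflect a rolling marble in 10 seconds?
import math

omega_earth = 2 * math.pi / 86164        # Earth's rotation rate, rad/s (sidereal day)
v_marble = 0.2                           # marble speed across the desk, m/s (assumed)
duration = 10.0                          # observation time, s

coriolis_accel = 2 * omega_earth * v_marble        # worst-case Coriolis acceleration
deflection = 0.5 * coriolis_accel * duration ** 2  # sideways drift if it acted the whole time

print(f"Coriolis acceleration ~ {coriolis_accel:.1e} m/s^2")
print(f"Deflection after {duration:.0f} s ~ {deflection * 1000:.1f} mm "
      f"over {v_marble * duration:.1f} m of travel")
```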
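Regarding the RPA time-delay item above: a toy simulation of a proportional "pilot" closing the loop on a rate-response vehicle through a pure transport delay. The gains and delays are arbitrary; with a gain of 2 this loop goes unstable once the delay exceeds roughly 0.8 seconds.

```python
# Delay-instability sketch: the same pilot gain that tracks nicely with a
# small time delay oscillates and diverges once the delay grows.

def simulate(delay_s: float, gain: float = 2.0, dt: float = 0.01,
             duration_s: float = 20.0) -> float:
    """Integrator plant, proportional 'pilot', pure transport delay.
    Returns the largest excursion while trying to capture and hold x = 1."""
    pipeline = [0.0] * int(round(delay_s / dt))   # commands still in transit
    x, peak = 0.0, 0.0
    for _ in range(int(duration_s / dt)):
        pipeline.append(gain * (1.0 - x))   # pilot reacts to the error seen now
        u = pipeline.pop(0)                 # vehicle receives a stale command
        x += u * dt                         # simple rate-response vehicle
        peak = max(peak, abs(x))
    return peak

for delay in (0.05, 0.2, 0.5, 1.0, 1.5):
    print(f"delay = {delay:4.2f} s -> largest excursion = {simulate(delay):10.2f}")
```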

Created by brian. Last Modification: Tuesday 23 of August, 2011 13:57:17 CDT by brian.