Quote:
Originally posted by Cerek the Barbaric:
I was researching another Biblical question when I came across an article explaining that the theory of the Earth being several million years old was actually a relatively new development. This idea is less than 100yrs old IIRC. The article went on to say that the scientific community never considered the Earth to be that old until the Theory of Evolution began to gain popularity. I forget what the prevailing theory of the Earth's age was at that time, but it certainly wasn't "millions of years". But, as the Theory of Evolution gained popularity, scientists realized it could not be true unless they "extended" the current hypothesis of the Earth's age by several million years. Carbon dating became the "accepted dating methodology" shortly thereafter.
Surely there are other elements that are present in Earth's strata that have the same reliable half-life progression as Carbon, but that are found in objects that could conceivably pre-date life on Earth. Why do we not use one of those elements instead?
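To the question in the quote first: there are indeed isotopes with far longer half-lives than carbon-14 (uranium-238 is around 4.5 billion years, potassium-40 around 1.25 billion), and those are what actually get used on rocks; carbon dating is only applied to once-living material. Here is a rough Python sketch of the arithmetic behind any such method, with made-up numbers, and note that it leans entirely on the assumption that the decay rate has never changed, which is exactly the sort of assumption I go on about below:

[code]
import math

def radiometric_age(parent_atoms, daughter_atoms, half_life_years):
    # Age from a parent/daughter isotope ratio, assuming (1) the decay
    # constant has never changed and (2) the sample started with no
    # daughter atoms at all -- both assumptions, not observations.
    decay_constant = math.log(2) / half_life_years
    return math.log(1 + daughter_atoms / parent_atoms) / decay_constant

# Made-up sample in which half the uranium-238 has become lead-206;
# U-238's half-life is roughly 4.47 billion years.
print(radiometric_age(parent_atoms=50, daughter_atoms=50,
                      half_life_years=4.47e9))  # ~4.47 billion years
[/code]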
Indeed, most theories relating to the age of the Earth/life/universe rely upon mathematical formulae and upon the assumption that the universal 'laws' of physics remain constant over time. For example, to determine how old a mountain range, desert or fjord might be, a geologist might look at the directions and speeds at which the tectonic plates move and then, through mathematical means, work out the amount of time needed to create such a landscape. Likewise, measuring and calculating the rate at which hydrogen 'condenses', burns and 'disperses' is used in equations to 'deduce' the ages of suns and stars, which are basically just great big balls of burning hydrogen (sort of along the lines of "well, they're in 'this' form now, so in order to end up like 'this' they must have done 'that' for such-and-such an amount of time". Does that make sense?). Most of it is indeed assumption-driven guess-work (for example: essentially assuming that hydrogen behaves the same way out in the vacuum of space as it does in a lab in Chicago [img]smile.gif[/img] ), and a lot of faith is put into the 'knowledge' of natural, physical law, i.e. the much-heralded 'laws of physics'.
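Spelled out, that "they must have done 'that' for such-and-such an amount of time" step is usually nothing more than dividing an observed total by an assumed constant rate. A toy Python sketch with made-up (though roughly plausible) figures, using plate movement since it is easier to put numbers on than burning hydrogen:

[code]
def age_from_constant_rate(observed_total_cm, rate_cm_per_year):
    # Elapsed time = what has accumulated / how fast it accumulates,
    # which silently assumes the rate has been the same the whole time.
    return observed_total_cm / rate_cm_per_year

# Made-up figures: an ocean basin about 3,000 km wide, with the plates
# on either side spreading apart at about 3 cm per year.
basin_width_cm = 3_000 * 1_000 * 100
print(age_from_constant_rate(basin_width_cm, 3))  # ~100 million years
[/code]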
Now, 'law' is much too stern a word to be using in relation to these sorts of matters, in my opinion. Perhaps 'rule' or 'guideline' of physics would be more appropriate. I mention this simply because, by and large, things just don't seem to stay the same over time. There is some degree of organization in the way the multiverse seems to operate, certainly, but to say that these methods of physical interaction and construction are unchanging 'laws' is where a lot of people perhaps get a little ahead of themselves. They may have remained unchanged since humans began consciously noting them down over the past couple of thousand years, or at least that is how it would appear. But they may, just may, have worked differently a long time ago in a galaxy far, far away.

Indeed, just how time itself operates is an entirely different topic of endless debate (as was demonstrated in a thread that Yorick began a month or two back [img]smile.gif[/img] ).
Ar-Cunin: yes, most definitely, the observation of various layers of mineral deposits has been accepted as a generally viable way of estimating the Earth's age. However, the question that remains is: how can you be sure of the actual, numerical date of the various layers? You can use mathematical techniques ("well, it takes such-and-such an amount of time for sedimentary silt to settle here; the silt is 'this' thick, so it must have taken 'this' long" sort of thing), but is there any other way of doing it? There needs to be a base assumption of how old a given layer must be (or how quickly it formed) for the layer-method to work. I am certainly no geologist, though, so please fill me in on the proper, established techniques; I'm interested in knowing.
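For what it's worth, the "this thick, so it must have taken this long" arithmetic I have in mind looks something like this in Python, with made-up numbers; the deposition rate is exactly the base assumption in question:

[code]
def layer_age_years(thickness_cm, deposition_rate_cm_per_year):
    # Thickness divided by an assumed constant rate of deposition --
    # the rate itself is the base assumption, not something read
    # directly off the rock.
    return thickness_cm / deposition_rate_cm_per_year

# Made-up figures: 500 metres of silt laid down at 0.1 cm per year.
print(layer_age_years(500 * 100, 0.1))  # 500,000 years
[/code]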