Evolution has many different meanings; you hold merely one. That is fine, since when I discuss evolution with Darwinists I am discussing their view of evolution. When you say "evolution involves neither" you are making evolution absolute: in the context of chance and randomness, the opposite of goal and direction. I only reiterate this so that everyone knows what we are discussing, and in which context.
A single-cell organism that "discovers" how to "steal" energy from its neighbors would no longer be limited to the fairly meager energy it could make itself but could greatly increase the energy available to it. The type III transport system is an admittedly advanced structure that could be used in perpetrating the "theft" of energy (it injects toxin into the victim). With an adequate supply of fuel and an extra part or two, the toxin transport device becomes an early form of flagellum, and the "robber bacterium" now has an "outboard motor" and thereby a much greater fuel supply (and also the need for it).
Listing the steps leading to a sophisticated toxin delivery system that could then become a flagellum is beyond my capabilities, but I have faith (yeah, "faith") in the scientific community and the scientific method that plausible steps do exist. A progression such as ...
- developing an enzyme that breaks down the victim's "skin", letting its insides leak out
- developing "stalks" tipped with the enzyme
- the stalks become active, "stabbing" the victim
- the stalks become hollow, allowing more effective transfer of the insides
- developing a suction device, so food transfer is now active rather than passive
- ...
... again, I stress this is just an off-the-top-of-my-head hypothesis, but it is one possible progression.
This is true, but it presupposes both a goal and a design. Evolution involves neither.
This is a hypothetical scenario. Like all hypotheses, it requires testing. Behe tested the IC hypothesis by reverse-engineering the bacterial flagellum: a part was removed, causing the function of the whole to cease. What Behe has actually done is reverse evolution, because at one point in time that "component" had not been part of the system. Since evolution was reversed to a prior state, we must ask: what is a rotor without a propeller, and vice versa? This must be factored into the equation, since it is the core of the IC argument.
The components of a bacterial flagellum are interdependent; they function as a "whole" to achieve a desired effect. The components exhibit strong asymmetry, which distinguishes them from symmetrical objects such as the repetitive and predictable patterns in snowflakes. Objects that function while exhibiting high asymmetry and low internal predictability are strong cases for design.
Listing the steps leading to a sophisticated toxin delivery system that could then become a flagellum is beyond my capabilities, but I have faith (yeah, "faith") in the scientific community and the scientific method that plausible steps do exist. A progression such as ...
No, the scientific method means considering all possible scenarios, and that includes design.
Several years ago, I saw a description of a computer simulation of a random collection of very simple "organisms". Each "organism" was a simple software module which had several distinct movable parts. The locations of the movable parts, the plane(s) of motion of each part, and the degree(s) of freedom of each were controlled by a "DNA" value that defined the "organism". The "reproduction" of the "organisms" was slightly imperfect - there was a very small possibility of random errors, so the "child" sometimes wasn't identical to its parent. The "rules" for "survival" were simple: the chance of an individual "organism" "surviving" to reproduce was determined by its "propulsion score", which was based on the ability of the "organism" to produce effective propulsion. The initial values of the "DNA" variables were set randomly and the simulation was left to run.
Well, I don't think we should concentrate on simulation software written by intelligent designers, not in this thread at least. Evolutionary simulations have proven to require significant amounts of preloaded information, such as the expected end result, which is inevitably "latched" onto given the algorithm's objective. Never mind that there is no such thing as true randomness within computing: in most computer software the "random" numbers are seeded from the current time, so that on each execution a new "random" number is generated within a range of values. This just shows that nothing within a computer is random; the most you will achieve is pseudo-randomness. Consider that the computer itself was created to run algorithms that can do more than one thing at a time, as many times as desired. These simulations take advantage of computing capabilities to make their own point, and that is cheating. As a rule of thumb, you don't take a designed computer and use it to prove chance and randomness by rewarding the digital organism through conditional statements and doing it enough times, using loop constructs, to reach the desired outcome(s). They know quite well what they are doing ahead of time; they can anticipate what the program is attempting to produce, because after all, the simulation is designed to do just that.
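To make the seeding point concrete, here is a minimal C sketch (my own illustration, not code from any actual simulation): the entire "random" stream is fixed the moment the seed is chosen, and the seed is just the clock.

```c
#include <stdio.h>
#include <stdlib.h>
#include <time.h>

int main(void)
{
    /* The whole "random" stream is determined by this one seed.
     * Seeding from the clock just picks a different deterministic
     * sequence each run; nothing here is truly random. */
    srand((unsigned int)time(NULL));

    for (int i = 0; i < 5; i++) {
        /* A pseudo-random number constrained to a range of values. */
        printf("%d\n", rand() % 100);
    }
    return 0;
}
```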
At first, the researchers had a population of floppers, twitchers, and do-nothings. After a number of generations, the population had evolved a number of quite effective (and in some cases quite unusual) methods of propulsion.
NASA had "evolved" a weird looking antenna which the reception that was produced outperformed all human designed antennas. Rest assured that they knew exactly what they were looking for ahead of time, the qualities they wanted in a antenna, what the program attempts to search for (ie: a better signal). The fact is that evolutionary simulations are nothing more then search algorithms, as they attempt to search through, buffer in and search from the point of the last buffered in result. I have programmed a few of these in c. One is atypical in that I wouldn't say it models evolution correctly, what it does show (never mind it takes on average 80 thousand trials to reach the an insignificant "target") nevertheless is that significant amounts of information are required before hand which must be preloaded into the program.
But I think this discussion goes beyond IC anyway, so I don't think it should be pursued here.
You might say this experiment had a goal - propulsion - and that would be valid, as the only criterion for survival was propulsion. But none of the resulting "organisms" were designed. In "real"-world evolution, the only goal is survival; if propulsion increases the chances of survival, so be it.
If it has a goal, then it is directed by the program itself. Survival is nothing but conditional constructs within the code that say: if there is propulsion, select the faster propulsion and filter out the less propulsive. That is selection, and that is a search that selects for more propulsion. The problem is that you have not explained propulsion itself. You have not explained propulsion itself because it was already specified in the program's objective (by a governing intelligence, i.e. the programmers). Nothing was "evolved" from scratch at all; at best, modifications (faster propulsion) were selected, not propulsion itself.
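Schematically, and with hypothetical names (this mirrors the kind of simulation described above, not its real code), the "survival" step is just a comparison the programmers wrote around a fitness function they also wrote:

```c
#include <stdio.h>

/* Hypothetical names for illustration; not taken from any real simulation. */
struct organism {
    double dna[4];   /* parameters controlling the movable parts */
};

/* The fitness criterion is supplied by the programmers, not evolved.
 * Here it is just a stand-in that sums the "DNA" values. */
static double propulsion_score(const struct organism *o)
{
    double s = 0.0;
    for (int i = 0; i < 4; i++)
        s += o->dna[i];
    return s;
}

/* "Survival" is literally a conditional the programmer wrote:
 * if the child propels better, keep it; otherwise filter it out. */
static struct organism select_fitter(struct organism parent, struct organism child)
{
    return (propulsion_score(&child) > propulsion_score(&parent)) ? child : parent;
}

int main(void)
{
    struct organism parent = { {1.0, 2.0, 3.0, 4.0} };
    struct organism child  = { {1.0, 2.5, 3.0, 4.0} };
    struct organism kept   = select_fitter(parent, child);

    printf("kept organism scores %.2f\n", propulsion_score(&kept));
    return 0;
}
```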
And there are certainly plenty of examples of vestigial structures as well.
Yup, and most have been proven not to be vestigial at all.
I will respond to your other comment when I find the time.