sci.geo.meteorology (Meteorology): For the discussion of meteorology and related topics.

#51
Old December 15th 09, 01:25 PM posted to alt.global-warming,alt.politics.libertarian,sci.geo.meteorology,sci.physics
 
First recorded activity by Weather-Banter: Feb 2009
Posts: 59
Can Global Warming Predictions be Tested with Observations of the Real Climate System?

I M @ good guy wrote:
On Mon, 14 Dec 2009 09:16:50 -0500, jmfbahciv jmfbahciv@aol wrote:

I M @ good guy wrote:
On Sat, 12 Dec 2009 09:53:43 -0500, jmfbahciv jmfbahciv@aol wrote:

snip clean off the tty

Which taxpayers do you think paid for the gathering of that data? Who
pays for the data the maritime business provides?
Don't know, don't care. Are you saying the IPCC is not tax-funded?

Where did our $50B go, then? I think grants are generally in the form
of contracts.


You don't even know how things get done.
Again, you might be surprised.
Not at all. You have no idea how much work is involved.

/BAH
You put way too much into what scientists
do and how little a non-professional can do and
understands.

Nothing has to be done to the data, just
make it available according to the law, and let
the recipients worry about the format.

That isn't what somebody insisted be done. I've been talking
about the suggestion that the data be prettied up and
documented with an explanation of the conclusions so that
a two-year-old can understand it. That last one is impossible
when the lab project is still in hypothesis-mode.

It isn't just the data that is subject to the
FOIA, it is the whole ball of wax that public
money paid for, professional work is supposed
to be notated, even within the text of papers,
hiding anything is either hiding something,
or some kind of perversion about importance.

The hiding is not the problem. The problem is politicians
using this as a basis for passing laws, tweaking economies,
stopping trade, and destroying nations and infrastructures.
another problem is a public who would rather believe in
conspiracies, unicorns, and outrageous fictions than
expend a tad of mental energy thinking.

If you want to solve this "cheating" problem, then solve
those two.

/BAH


I don't know what you mean by cheating, because
I can't believe that a professional would benefit from it.

Just following the law and complying with
FOIA requests should be enough.


Whose law?

This global warming thing, as defined by Al Gore, was a
world-wide scam. So which laws are you talking about?
The UN laws. Those designed to increase corruption,
not deal with real problems and their solutions which
will benefit normal populations.

/BAH

#52
Old December 15th 09, 05:44 PM posted to alt.global-warming,alt.politics.libertarian,sci.geo.meteorology,sci.physics
 
First recorded activity by Weather-Banter: Feb 2009
Posts: 197
Can Global Warming Predictions be Tested with Observations of the Real Climate System?

On Tue, 15 Dec 2009 08:22:09 -0500, jmfbahciv wrote:

Bill Ward wrote:
On Mon, 14 Dec 2009 09:57:19 -0500, jmfbahciv wrote:

Bill Ward wrote:
On Sat, 12 Dec 2009 09:53:43 -0500, jmfbahciv wrote:

Bill Ward wrote:
On Fri, 11 Dec 2009 09:03:37 -0500, jmfbahciv wrote:

Bill Ward wrote:
On Thu, 10 Dec 2009 08:01:19 -0500, jmfbahciv wrote:


snip clean my tty screen

I think that's your definition. I said,"I'm talking about
unadjusted digital versions of the 'raw data'", and you took
issue with it.
No, didn't talk about that. You want it prettied up and reformatted
so anybody can read it and understand what it is. That takes code
and massages the raw data.
"Adjusting", or "massaging" is different from "reformatting".
All reformatting is massaging. If you are not doing a bit-for-bit
copy, you are massaging the file.
You use strange definitions, but OK.
It was a term used in my biz, which was hard/software development.


By your definition they'd be useless:

And now for the scores: 7 to 3; 2 to 1; and 21 to 7.

There's your "raw data", but it's not all that useful.
That is not raw data. You've typed it in and its format is ASCII.
You seem to be straining at gnats here. When does the mercury
position become "raw data" to you? When does it stop being "raw
data"?
Raw data is the original collection of facts. Prettying numbers up
to be displayed on a TTY screen or hardcopy paper requires massaging
if those bits are stored on computer gear.
I didn't see an answer to my question there. At what point does the
value representing the position of the Hg meniscus become "raw data"?
When it is recorded the first time.


"[O]riginal collection of facts" is a bit ambiguous. Is it when the
observer reads it, when he initially writes it down on a form, when
he keys it into a computer memory, when he saves it to permanent
media, when a hardcopy is printed...? If you're going to make up
definitions, you at least need to be specific and consistent.
Raw data depends on when, where and how the fact is collected. It is
as varied as the subjects. Data can be recorded with pen and paper in
a bound notebook. It can be collected with an analog device. It can
be collected with a digital device. It can be things in boxes,
scribbles on paper, holes in cards, bits on magnetic tape, bits on
disks, DECtapes, cassettes, CDs, or pictures. (I'm missing some.. oh,
ticks on stone or in the sand).



I define "raw data" as any copy of the original reading that carries
exactly the same information as the original reading, no matter what
format it's in.
I would not. A binary datum, 1011101011, is not the same as the
number, 747, displayed in ASCII format on your TTY screen.


But carries the same, unadjusted information.


But the transformed bits are not the raw data. If there's been any kind
of error during the transforming or the second set of bits gets hit with
a cosmic ray, you can always go back to the raw data. That's why raw
data is kept raw. It's a sanity check.
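The binary-versus-ASCII distinction above can be made concrete. A minimal Python sketch (my illustration, not anything from the thread): the same measurement carried as raw binary bytes and as "prettied up" ASCII text holds identical information, yet the two are not bit-for-bit identical.

```python
# The value 747 as raw binary bytes vs. as ASCII text.
# Same information, different bits -- so the ASCII rendering is a
# reformatted copy, not the original raw recording.
raw = (747).to_bytes(2, "big")         # b'\x02\xeb', two binary bytes
ascii_form = str(747).encode("ascii")  # b'747', three ASCII bytes

assert raw != ascii_form                  # not bit-for-bit identical
assert int.from_bytes(raw, "big") == 747  # yet both decode to
assert int(ascii_form) == 747             # the same number
```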


If any information has changed, it's no longer raw data. If the
information is the same, but the data has been reformatted, labeled,
columnized, "prettied up", sorted, or any other information
preserving transformation, it's still raw data,
We never called that raw data.


OK, what do you want to call it? I'm easy.


Converted. Copied. The set of data which has not been touched to
provide a sanity check in case the set you are working with has an
error.


OK. I would call that the original version of the raw data, but it's
just semantics.

since the information is
unchanged.


The data has been processed through some code which changed the format
it is stored in. It is no longer raw; raw implies no changes have
been made. Any reformatting requires changes. If any of the
reformatting code over time has any bug, (say one that sets a bit
which isn't detected), the outcome of analyses decades later would be
affected.


I agree if the information is changed the data is no longer raw data.
I would call it corrupted data. What do you want to call media that
carry exactly the same information as the raw data but in a different
format?


Reformatted. That implies the raw data has been massaged into a
different format. This massaging happens all the time depending on the
usage and computer gear being used.


OK. Then post the reformatted, verified raw data, whatever you want to
call it, as long as it's usable and carries the same information as the
original raw data.

I would call it copies of the raw data, but you seem to prefer some
other unspecified term.


That's a much better phrase than insisting it's the raw data.


Do you see any problem with that?
Oh, yes. :-) Numbers are an especial problem. Think of data
storages that varied from 8 bits/word to 72/word over three decades.
And now things are measured in "bytes" which vary with the phase of
the sun and the setting of the moon.


You seem to be focusing on the problems in ensuring the data is
transcribed properly into digital form.


Yup. The suggestion was to make the raw data available to the public.
There are problems with that, and it also takes a lot of manpower.

I'm not disagreeing with that,
I'm just saying no matter who uses the data, it must be transcribed
into a usable format.


Then it is not the raw data. The suggestion was to provide the raw
data. That means that the original collection of bits has to be copied
bit for bit with no modification. A lot of copy operations insert
zero bits for alignment.
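One way to make the "bit for bit with no modification" requirement checkable is to compare cryptographic digests of the original and the copy. A minimal sketch, assuming nothing beyond the Python standard library; the filenames are hypothetical:

```python
import hashlib
import os
import tempfile

def file_digest(path):
    """SHA-256 a file in chunks, so large data sets need not fit in memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

# Demonstration with throwaway files (hypothetical names): a faithful
# copy matches the original's digest; a copy with one alignment fill
# byte appended does not.
with tempfile.TemporaryDirectory() as d:
    orig = os.path.join(d, "original.dat")
    copy = os.path.join(d, "copy.dat")
    padded = os.path.join(d, "padded.dat")
    data = bytes(range(256))
    for name, payload in [(orig, data), (copy, data), (padded, data + b"\x00")]:
        with open(name, "wb") as f:
            f.write(payload)
    assert file_digest(orig) == file_digest(copy)    # bit-for-bit copy
    assert file_digest(orig) != file_digest(padded)  # "massaged" copy
```

Matching digests give the sanity check the thread asks for without shipping the original media.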


I think the disagreement is simply semantic. You aren't considering
"copies of the raw data", "raw data". I do, as long as the copy is not
corrupt. I call the original raw data, "the original raw data", while
you insist it's the only "raw data".

If researchers are cutting corners on data integrity,
posting it on line would be one way to stop that. If they are doing it
right, then there should be no problems in making it available on line.


And I've been trying to tell you that it takes lots of time, money, and
babysitting to make that available. If you pass a rule that all data
has to be made public, resources, which would have normally been used
for the real research, will have to be used for babysitting those bits.
The real, science work will not get done.


Well, it wasn't posted, it was hidden, and now a lot of climate "science"
looks bogus. Perhaps reformatting, verifying and posting copies of the
raw data is a necessary part of the "real science". Think of it as
insurance.

If you want to call the verification and formatting "massaging",
fine, but if it's not done, the data is unusable.


Exactly. It's unusable to most people except those who run code
to use it as input (which is what scientists do).


And many others who might be seriously interested.


This thread has been talking about non-scientists having access to
any data which was collected; further constraints were declared that
the data had to be prettied up and completely described so that
anybody could access the data and know what it meant.


That would be what you were talking about, not me. All I insisted
was that the data be usable, which I think you are calling "prettied
up".


I think it was Eric who wanted stuff made public in an attempt to
prevent what happened with this global warming fiasco the politicians
have been milking for oodles of money.


Who wouldn't?


Sigh! Have I been wasting my time? Strawman.


Surely you're not happy with the picture the leaked emails reveal?

snip


That should be one of the
deliverables in the data collection contract.

You don't know what you're talking about.
And you're assuming facts not in evidence.
Actually, I'm not assuming anything. I'm talking about moving
bits and presenting them to non-expert readers. I know a lot
about this kind of thing because I did that kind of work for 25
years.
It's you that's worried about "non-expert" readers, not me. I
just want it accessible in a usable form. You don't need to
sugar coat it.
Your kind of usable form requires the raw data to be massaged
before storing it on a public forum.
I guess that depends on your definition of "massaging". As long as
it doesn't corrupt the data, I don't care what you call it, but the
simpler, the better.


You can't tell if the data's been corrupted if it's been
reformatted. You have to have a QA specialist checking.

[this is a thread drift alert]

Shouldn't that be a routine procedure?


By whom? If the data you're using was collected by Leonardo, QA is a
tad problematic.

Or do you expect to use invalid
data to get valid results?


Think of the log tables which were produced and printed. If there is
one typo, and somebody used that number to record a data set.


Why would you use a log table to record data? Logs would be used for
some sort of transformation, not raw data, and, unless you have an old
Pentium, not really an issue today.


You are wasting my time. This is a very serious matter and requires a
lot of thinking, discussion, and consideration. Your thinking style is
extremely short-term.


Now get
in your time machine and come back to today. The data set may be used
as input for a lot of analyses today. Now answer your question. My
answer would be yes; at some point you have to use what is available.


Then it would appear as an instrumental error, either as an outlier, or
buried in the noise.

These are aspects of bit recordings I've been trying to solve for
decades. All of my work was involved with shipping code to customers.
All of this discussion reminds me of the work I did. There are
CATCH-22s, deadly embraces, and impossibilities which are caused by
working with data which is invisible to the human eye.


It sounds like you may be too close to be objective.


I have no idea what you mean. I know a lot about bit integrity and
shipping it. I also know how much work is required.


Then you should also know how important reliable data and programs are.
Taxpayers shouldn't be paying for crappy work. That's far more expensive
than doing it right.

I'm definitely not talking about a contract.

Then who's paying for it? If it's not taxpayers, then I really
don't care how it's done. If it is from taxes, then there
better be an enforceable contract in place, or we'll be right
back where we are now.

Contract law is different in each and every country.
So? There are still enforceable contracts. How would you do
international business without them?
You sign a contract for each country or entity in which you want
to do business.
Exactly. Why were you trying to make an issue of such an obvious
point?
You're the one who started to talk about contracts.

Which taxpayers do you think paid for the gathering of that
data? Who pays for the data the maritime business provides?
Don't know, don't care. Are you saying the IPCC is not
tax-funded?

Where did our $50B go, then? I think grants are generally in the
form of contracts.


You don't even know how things get done.

Again, you might be surprised.


Not at all. You have no idea how much work is involved.


We paid for a lot of work that now appears useless.


It is useless because everybody seems to have depended on one, and
only one, entity for their sources. That is a bloody procedural
problem in the science biz. There aren't independent sources or
studies being used by the politicians, the UN, or the science
conclusions. With the advent of the thingie called the WWW, the myths
become the facts at light speed.


Exactly. Researchers might be a little more careful if they know
someone else is watching. In fact, I'd say the way they treated Steve
M is proof positive they would.


Huh?


Sorry. I was referring to Stephen McIntyre of Climateaudit.org, who has
been trying to keep the Jones, Mann, Briffa et al gang honest for a
decade or so. I assumed you'd be familiar with him and his debunking of
the Hockey Stick icon. The leaked emails give insight into the
attitudes involved in trying to hide the data and algorithms from him.

If you're really not familiar with the issue, I'd recommend checking out
climateaudit.org. It might help you to understand the need to make
copies of raw data available online.

I'd rather pay for
careful work done in an open, transparent manner. It's cheaper than
having to redo it.

But that open, transparent manner is expensive, difficult, and
impossible (unless you develop a time machine) in some cases. Storing
data is not trivial, whether it's public or private.

Take a look at all the problems the open source biz has for computer
code. That "open, transparent manner" is not a trivial
endeavour.


I didn't say it would be easy, just necessary, if we're going to get
any valid results from the clown brigade.


It isn't necessary. You're just trying to fix a symptom which will not
fix the real problem but hide the real problem and prevent work from
getting done.


Review the leaked emails and climateaudit.org and see if you still think
so.


#53
Old December 16th 09, 12:53 PM posted to alt.global-warming,alt.politics.libertarian,sci.geo.meteorology,sci.physics
 
First recorded activity by Weather-Banter: Feb 2009
Posts: 59
Can Global Warming Predictions be Tested with Observations of the Real Climate System?

Bill Ward wrote:
On Tue, 15 Dec 2009 08:22:09 -0500, jmfbahciv wrote:

Bill Ward wrote:
On Mon, 14 Dec 2009 09:57:19 -0500, jmfbahciv wrote:

Bill Ward wrote:
On Sat, 12 Dec 2009 09:53:43 -0500, jmfbahciv wrote:

Bill Ward wrote:
On Fri, 11 Dec 2009 09:03:37 -0500, jmfbahciv wrote:

Bill Ward wrote:
On Thu, 10 Dec 2009 08:01:19 -0500, jmfbahciv wrote:


snip clean my tty screen

I think that's your definition. I said,"I'm talking about
unadjusted digital versions of the 'raw data'", and you took
issue with it.
No, didn't talk about that. You want it prettied up and reformatted
so anybody can read it and understand what it is. That takes code
and massages the raw data.
"Adjusting", or "massaging" is different from "reformatting".
All reformatting is massaging. If you are not doing a bit-for-bit
copy, you are massaging the file.
You use strange definitions, but OK.
It was a term used in my biz, which was hard/software development.


By your definition they'd be useless:

And now for the scores: 7 to 3; 2 to 1; and 21 to 7.

There's your "raw data", but it's not all that useful.
That is not raw data. You've typed it in and its format is ASCII.
You seem to be straining at gnats here. When does the mercury
position become "raw data" to you? When does it stop being "raw
data"?
Raw data is the original collection of facts. Prettying numbers up
to be displayed on a TTY screen or hardcopy paper requires massaging
if those bits are stored on computer gear.
I didn't see an answer to my question there. At what point does the
value representing the position of the Hg meniscus become "raw data"?
When it is recorded the first time.


"[O]riginal collection of facts" is a bit ambiguous. Is it when the
observer reads it, when he initially writes it down on a form, when
he keys it into a computer memory, when he saves it to permanent
media, when a hardcopy is printed...? If you're going to make up
definitions, you at least need to be specific and consistent.
Raw data depends on when, where and how the fact is collected. It is
as varied as the subjects. Data can be recorded with pen and paper in
a bound notebook. It can be collected with an analog device. It can
be collected with a digital device. It can be things in boxes,
scribbles on paper, holes in cards, bits on magnetic tape, bits on
disks, DECtapes, cassettes, CDs, or pictures. (I'm missing some.. oh,
ticks on stone or in the sand).



I define "raw data" as any copy of the original reading that carries
exactly the same information as the original reading, no matter what
format it's in.
I would not. A binary datum, 1011101011, is not the same as the
number, 747, displayed in ASCII format on your TTY screen.
But carries the same, unadjusted information.

But the transformed bits are not the raw data. If there's been any kind
of error during the transforming or the second set of bits gets hit with
a cosmic ray, you can always go back to the raw data. That's why raw
data is kept raw. It's a sanity check.


If any information has changed, it's no longer raw data. If the
information is the same, but the data has been reformatted, labeled,
columnized, "prettied up", sorted, or any other information
preserving transformation, it's still raw data,
We never called that raw data.
OK, what do you want to call it? I'm easy.

Converted. Copied. The set of data which has not been touched to
provide a sanity check in case the set you are working with has an
error.


OK. I would call that the original version of the raw data, but it's
just semantics.

since the information is
unchanged.


The data has been processed through some code which changed the format
it is stored in. It is no longer raw; raw implies no changes have
been made. Any reformatting requires changes. If any of the
reformatting code over time has any bug, (say one that sets a bit
which isn't detected), the outcome of analyses decades later would be
affected.
I agree if the information is changed the data is no longer raw data.
I would call it corrupted data. What do you want to call media that
carry exactly the same information as the raw data but in a different
format?

Reformatted. That implies the raw data has been massaged into a
different format. This massaging happens all the time depending on the
usage and computer gear being used.


OK. Then post the reformatted, verified raw data, whatever you want to
call it, as long as it's usable and carries the same information as the
original raw data.

I would call it copies of the raw data, but you seem to prefer some
other unspecified term.

That's a much better phrase than insisting it's the raw data.


Do you see any problem with that?
Oh, yes. :-) Numbers are an especial problem. Think of data
storages that varied from 8 bits/word to 72/word over three decades.
And now things are measured in "bytes" which vary with the phase of
the sun and the setting of the moon.
You seem to be focusing on the problems in ensuring the data is
transcribed properly into digital form.

Yup. The suggestion was to make the raw data available to the public.
There are problems with that, and it also takes a lot of manpower.

I'm not disagreeing with that,
I'm just saying no matter who uses the data, it must be transcribed
into a usable format.

Then it is not the raw data. The suggestion was to provide the raw
data. That means that the original collection of bits has to be copied
bit for bit with no modification. A lot of copy operations insert
zero bits for alignment.


I think the disagreement is simply semantic.


No, it's not.

You aren't considering
"copies of the raw data", "raw data". I do, as long as the copy is not
corrupt.


How do you know that? You can't unless the person who copied it
did a BINCOM or something to verify that no bits were changed and
no fills were inserted.
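BINCOM was DEC's binary-compare utility. A rough modern analogue (my sketch, not the original tool) walks two copies byte by byte and reports the first changed bit pattern or inserted fill:

```python
def bincom(a: bytes, b: bytes) -> str:
    """Report the first difference between two byte strings, the way a
    binary-compare tool would: length mismatch first (inserted fills),
    then the first differing byte."""
    if len(a) != len(b):
        return f"length mismatch: {len(a)} vs {len(b)} bytes"
    for i, (x, y) in enumerate(zip(a, b)):
        if x != y:
            return f"first difference at offset {i}: {x:#04x} vs {y:#04x}"
    return "identical"

print(bincom(b"\x01\x02\x03", b"\x01\x02\x03"))  # identical
print(bincom(b"\x01\x02\x03", b"\x01\xff\x03"))  # first difference at offset 1
print(bincom(b"\x01\x02", b"\x01\x02\x00"))      # length mismatch (a fill byte)
```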

I call the original raw data, "the original raw data", while
you insist it's the only "raw data".


People are lazy and don't say "original raw data"; they say raw data.
The term has an implied meaning so that 5 hours of discussion doesn't
have to be done to clarify the meaning.

You sound like one of our bloody editors who insisted, until I raised
the roof, that all occurrences of CPU in our documentation be changed
to central processing unit.


If researchers are cutting corners on data integrity,
posting it on line would be one way to stop that. If they are doing it
right, then there should be no problems in making it available on line.


And I've been trying to tell you that it takes lots of time, money, and
babysitting to make that available. If you pass a rule that all data
has to be made public, resources, which would have normally been used
for the real research, will have to be used for babysitting those bits.
The real, science work will not get done.


Well, it wasn't posted, it was hidden, and now a lot of climate "science"
looks bogus. Perhaps reformatting, verifying and posting copies of the
raw data is a necessary part of the "real science". Think of it as
insurance.


But there existed other scientists who did not agree and said so. That
one place not providing their data when requested is not the problem.
Fixing them and their anal retention will not fix the real problem.


If you want to call the verification and formatting "massaging",
fine, but if it's not done, the data is unusable.

Exactly. It's unusable to most people except those who run code
to use it as input (which is what scientists do).
And many others who might be seriously interested.
This thread has been talking about non-scientists having access to
any data which was collected; further constraints were declared that
the data had to be prettied up and completely described so that
anybody could access the data and know what it meant.


That would be what you were talking about, not me. All I insisted
was that the data be usable, which I think you are calling "prettied
up".


I think it was Eric who wanted stuff made public in an attempt to
prevent what happened with this global warming fiasco the politicians
have been milking for oodles of money.
Who wouldn't?

Sigh! Have I been wasting my time? Strawman.


Surely you're not happy with the picture the leaked emails reveal?


I didn't need to wait for those emails to hit the net. I've been
suspicious for years. When the phrase "majority of scientists"
or "scientists believe" is used to promote a political agenda,
I smell a big rat. That's been happening since (what?) the mid-90s.

Supporting the assertion with weather facts was another red flag.
When the J.Q. Public begins to make jokes about the weather
with comments about Global Warming, I become extremely skeptical
about the hypothesis.



snip


That should be one of the
deliverables in the data collection contract.

You don't know what you're talking about.
And you're assuming facts not in evidence.
Actually, I'm not assuming anything. I'm talking about moving
bits and presenting them to non-expert readers. I know a lot
about this kind of thing because I did that kind of work for 25
years.
It's you that's worried about "non-expert" readers, not me. I
just want it accessible in a usable form. You don't need to
sugar coat it.
Your kind of usable form requires the raw data to be massaged
before storing it on a public forum.
I guess that depends on your definition of "massaging". As long as
it doesn't corrupt the data, I don't care what you call it, but the
simpler, the better.
You can't tell if the data's been corrupted if it's been
reformatted. You have to have a QA specialist checking.
[this is a thread drift alert]

Shouldn't that be a routine procedure?


By whom? If the data you're using was collected by Leonardo, QA is a
tad problematic.

Or do you expect to use invalid
data to get valid results?


Think of the log tables which were produced and printed. If there is
one typo, and somebody used that number to record a data set.
Why would you use a log table to record data? Logs would be used for
some sort of transformation, not raw data, and, unless you have an old
Pentium, not really an issue today.

You are wasting my time. This is a very serious matter and requires a
lot of thinking, discussion, and consideration. Your thinking style is
extremely short-term.


Now get
in your time machine and come back to today. The data set may be used
as input for a lot of analyses today. Now answer your question. My
answer would be yes; at some point you have to use what is available.
Then it would appear as an instrumental error, either as an outlier, or
buried in the noise.

These are aspects of bit recordings I've been trying to solve for
decades. All of my work was involved with shipping code to customers.
All of this discussion reminds me of the work I did. There are
CATCH-22s, deadly embraces, and impossibilities which are caused by
working with data which is invisible to the human eye.
It sounds like you may be too close to be objective.

I have no idea what you mean. I know a lot about bit integrity and
shipping it. I also know how much work is required.


Then you should also know how important reliable data and programs are.


Yes, I do know.

Taxpayers shouldn't be paying for crappy work.


How are your taxes paying for East Anglia?

That's far more expensive
than doing it right.


Define "right". So far, you've insisted that the public be the
sanity check. That won't work with any science endeavor.
In fact, it will stop all science because the "rules" will
eventually state that no science work can be done unless the
public approves. That puts the decisions right back into
Congress and other legislative bodies. The only countries
who will be doing science are the communist-based countries
because they don't give a **** about public opinion at the
detail level.


I'm definitely not talking about a contract.

Then who's paying for it? If it's not taxpayers, then I really
don't care how it's done. If it is from taxes, then there
better be an enforceable contract in place, or we'll be right
back where we are now.

Contract law is different in each and every country.
So? There are still enforceable contracts. How would you do
international business without them?
You sign a contract for each country or entity in which you want
to do business.
Exactly. Why were you trying to make an issue of such an obvious
point?
You're the one who started to talk about contracts.

Which taxpayers do you think paid for the gathering of that
data? Who pays for the data the maritime business provides?
Don't know, don't care. Are you saying the IPCC is not
tax-funded?

Where did our $50B go, then? I think grants are generally in the
form of contracts.


You don't even know how things get done.

Again, you might be surprised.


Not at all. You have no idea how much work is involved.


We paid for a lot of work that now appears useless.


It is useless because everybody seems to have depended on one, and
only one, entity for their sources. That is a bloody procedural
problem in the science biz. There aren't independent sources or
studies being used by the politicians, the UN, or the science
conclusions. With the advent of the thingie called the WWW, the myths
become the facts at light speed.
Exactly. Researchers might be a little more careful if they know
someone else is watching. In fact, I'd say the way they treated Steve
M is proof positive they would.

Huh?


Sorry. I was referring to Stephen McIntyre of Climateaudit.org, who has
been trying to keep the Jones, Mann, Briffa et al gang honest for a
decade or so. I assumed you'd be familiar with him and his debunking of
the Hockey Stick icon. The leaked emails give insight into the
attitudes involved in trying to hide the data and algorithms from him.


That is one place and one small group of people. Are you implying that
all groups in the world are similar?


If you're really not familiar with the issue, I'd recommend checking out
climateaudit.org. It might help you to understand the need to make
copies of raw data available online.


I don't need to do that; I am able to think.


I'd rather pay for
careful work done in an open, transparent manner. It's cheaper than
having to redo it.
But that open, transparent manner is expensive, difficult, and
impossible (unless you develop a time machine) in some cases. Storing
data is not trivial, whether it's public or private.

Take a look at all the problems the open source biz has for computer
code. That "open, transparent manner" is not a trivial
endeavour.
I didn't say it would be easy, just necessary, if we're going to get
any valid results from the clown brigade.

It isn't necessary. You're just trying to fix a symptom which will not
fix the real problem but hide the real problem and prevent work from
getting done.


Review the leaked emails and climateaudit.org and see if you still think
so.


Which "leaded" emails. You're assuming conspiracies all over. Why so
seem to be suddenly surprised that their hypothesis about global warming
may not be valid?

/BAH

#54
Old December 16th 09, 11:43 PM posted to alt.global-warming,alt.politics.libertarian,sci.geo.meteorology,sci.physics
 
First recorded activity by Weather-Banter: Feb 2009
Posts: 197
Can Global Warming Predictions be Tested with Observations of the Real Climate System?

On Wed, 16 Dec 2009 07:53:38 -0500, jmfbahciv wrote:

Bill Ward wrote:
On Tue, 15 Dec 2009 08:22:09 -0500, jmfbahciv wrote:

Bill Ward wrote:
On Mon, 14 Dec 2009 09:57:19 -0500, jmfbahciv wrote:

Bill Ward wrote:
On Sat, 12 Dec 2009 09:53:43 -0500, jmfbahciv wrote:

Bill Ward wrote:
On Fri, 11 Dec 2009 09:03:37 -0500, jmfbahciv wrote:

Bill Ward wrote:
On Thu, 10 Dec 2009 08:01:19 -0500, jmfbahciv wrote:


snip clean my tty screen

I think that's your definition. I said, "I'm talking about
unadjusted digital versions of the 'raw data'", and you took
issue with it.
No, didn't talk about that. You want prettied up and
reformatted so anybody can read it and understand what it is.
That takes code and massages the raw data.
"Adjusting", or "massaging" is different from "reformatting".
All reformatting is massaging. If you are not doing a bit-for-
bit copy, you are massaging the file.
You use strange definitions, but OK.
It was a term used in my biz, which was hard/software development.


By your definition they'd be useless:

And now for the scores: 7 to 3; 2 to 1; and 21 to 7.

There's your "raw data", but it's not all that useful.
That is not raw data. You've typed it in and its format is
ASCII.
You seem to be straining at gnats here. When does the mercury
position become "raw data" to you? When does it stop being "raw
data"?
Raw data is the original collection of facts. Prettying numbers
up to be displayed on a TTY screen or hardcopy paper requires
massaging if those bits are stored on computer gear.
I didn't see an answer to my question there. At what point does
the value representing the position of the Hg meniscus become "raw
data"?
When it is recorded the first time.


"[O]riginal collection of facts" is a bit ambiguous. Is it when
the observer reads it, when he initially writes it down on a form,
when he keys it into a computer memory, when he saves it to
permanent media, when a hardcopy is printed...? If you're going
to make up definitions, you at least need to be specific and
consistent.
Raw data depends on when, where, and how the fact is collected. It
is as varied as the subjects. Data can be recorded with pen and
paper in a bound notebook. It can be collected with an analog
device. It can be collected with a digital device. It can be things
in boxes, scribbles on paper, holes in cards, bits on magnetic tape,
bits on disks, DECtapes, cassettes, CDs, or pictures. (I'm missing
some.. oh, ticks on stone or in the sand).



I define "raw data" as any copy of the original reading that
carries exactly the same information as the original reading, no
matter what format it's in.
I would not. A binary datum, 111100111, is not the same as the
number, 747, displayed in ASCII format on your TTY screen.
But carries the same, unadjusted information.
But the transformed bits are not the raw data. If there's been any
kind of error during the transforming or the second set of bits gets
hit with a cosmic ray, you can always go back to the raw data. That's
why raw data is kept raw. It's a sanity check.


If any information has changed, it's no longer raw data. If the
information is the same, but the data has been reformatted,
labeled, columnized, "prettied up", sorted, or any other
information preserving transformation, it's still raw data,
We never called that raw data.
OK, what do you want to call it? I'm easy.
Converted. Copied. The set of data which has not been touched to
provide a sanity check in case the set you are working with has an
error.


OK. I would call that the original version of the raw data, but it's
just semantics.

since the information is
unchanged.


The data has been processed through some code which changed the
format it is stored in. It is no longer raw; raw implies no changes
have been made. Any reformatting requires changes. If any of the
reformatting code over time has any bug, (say one that sets a bit
which isn't detected), the outcome of analyses decades later would
be affected.
I agree if the information is changed the data is no longer raw data.
I would call it corrupted data. What do you want to call media that
carry exactly the same information as the raw data but in a different
format?
Reformatted. That implies the raw data has been massaged into a
different format. This massaging happens all the time depending on
the usage and computer gear being used.


OK. Then post the reformatted, verified raw data, whatever you want to
call it, as long as it's usable and carries the same information as the
original raw data.

I would call it copies of the raw data, but you seem to prefer some
other unspecified term.
That's a much better phrase than insisting it's the raw data.


Do you see any problem with that?
Oh, yes. :-) Numbers are an especial problem. Think of data
storages that varied from 8 bits/word to 72 bits/word over three decades.
And now things are measured in "bytes" which vary with the phase of
the sun and the setting of the moon.
You seem to be focusing on the problems in ensuring the data is
transcribed properly into digital form.
Yup. The suggestion was to make the raw data available to the public.
There are problems with that, and it also takes a lot of manpower.

I'm not disagreeing with that,
I'm just saying no matter who uses the data, it must be transcribed
into a usable format.
Then it is not the raw data. The suggestion was to provide the raw
data. That means that the original collection of bits has to be
copied bit for bit with no modification. A lot of copy operations
insert zero bits for alignment.
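What alignment fill looks like in practice can be sketched with Python's struct module (a modern stand-in, not the copy utilities of the era): the same two fields occupy different byte counts depending on whether the layout is padded to native alignment.

```python
import struct

# A record holding one 16-bit integer and one 32-bit integer.
aligned = struct.calcsize("@hi")  # native layout: pad bytes inserted so the int is aligned
packed = struct.calcsize("=hi")   # standard layout: no padding, 2 + 4 = 6 bytes

# The padded copy is larger; the extra bytes are fill, not data,
# so a byte-for-byte comparison against the unpadded original would fail.
```

On a typical machine the aligned size is 8 bytes against 6 packed; those extra zero bytes are exactly the fills at issue.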


I think the disagreement is simply semantic.


No, it's not.

You aren't considering
"copies of the raw data", "raw data". I do, as long as the copy is not
corrupt.


How do you know that? You can't unless the person who copied it did a
BINCOM or something to verify that no bits were changed and no fills
were inserted.
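A minimal sketch of that kind of check (BINCOM was a DEC binary file-compare utility; the hash comparison below is a hypothetical modern equivalent, not its actual algorithm):

```python
import hashlib


def file_digest(path, chunk_size=65536):
    """Hash a file's exact byte stream; any changed bit or inserted fill
    byte changes the digest."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()


def verify_copy(original, copy):
    """True only if the copy is byte-for-byte identical to the original."""
    return file_digest(original) == file_digest(copy)
```

A copy that has been reformatted, padded, or truncated fails this check even if it "looks" the same when printed.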


"Read after write" verification has been around quite a while. My system
does it automatically. Doesn't yours?

I call the original raw data, "the original raw data", while you insist
it's the only "raw data".


People are lazy and don't say "original raw data"; they say raw data.
The term has an implied meaning so that 5 hours of discussion doesn't
have to be done to clarify the meaning.

You sound like one of our bloody editors who insisted, until I raised
the roof, that all occurrences of CPU in our documentation be changed to
central processing unit.


As I said, it's just semantics.

If researchers are cutting corners on data integrity, posting it on
line would be one way to stop that. If they are doing it right, then
there should be no problems in making it available on line.


And I've been trying to tell you that it takes lots of time, money,
and babysitting to make that available. If you pass a rule that all
data has to be made public, resources, which would have normally been
used for the real research, will have to be used for babysitting those
bits. The real, science work will not get done.


Well, it wasn't posted, it was hidden, and now a lot of climate
"science" looks bogus. Perhaps reformatting, verifying and posting
copies of the raw data is a necessary part of the "real science".
Think of it as insurance.


But there existed other scientists who did not agree and said so. That
one place not providing their data when requested is not the problem.
Fixing them and their anal retention will not fix the real problem.


What do you think is the "real problem", then? Can it be solved without
changing human nature?

If you want to call the verification and formatting
"massaging", fine, but if it's not done, the data is unusable.

Exactly. It's unusable to most people except those who run code
to use it as input (which is what scientists do).
And many others who might be seriously interested.
This thread has been talking about non-scientists having access to
any data which was collected; further constraints were declared
that the data had to be prettied up and completely described so
that anybody could access the data and know what it meant.


That would be what you were talking about, not me. All I insisted
was that the data be usable, which I think you are calling
"prettied up".


I think it was Eric who wanted stuff made public in an attempt to
prevent what happened with this global warming fiasco the
politicians have been milking for oodles of money.


Who wouldn't?


Sigh! Have I been wasting my time? Strawman.


Surely you're not happy with the picture the leaked emails reveal?


I didn't need to wait for those emails to hit the net. I've been
suspicious for years. When the phrase "majority of scientists" or
"scientists believe" is used to promote a political agenda, I smell a
big rat. That's been happening since (what?) the mid-90s.

Supporting the assertion with weather facts was another red flag. When
J.Q. Public begins to make jokes about the weather with comments
about Global Warming, I become extremely skeptical about the hypothesis.



snip


That should be one of the
deliverables in the data collection contract.

You don't know what you're talking about.


And you're assuming facts not in evidence.


Actually, I'm not assuming anything. I'm talking about moving
bits and presenting them to non-expert readers. I know a lot
about this kind of thing because I did that kind of work for
25 years.


It's you that's worried about "non-expert" readers, not me. I
just want it accessible in a usable form. You don't need to
sugar coat it.


Your kind of usable form requires the raw data to be massaged
before storing it on a public forum.


I guess that depends on your definition of "massaging". As long
as it doesn't corrupt the data, I don't care what you call it,
but the simpler, the better.


You can't tell if the data's been corrupted if it's been
reformatted. You have to have a QA specialist checking.


[this is a thread drift alert]

Shouldn't that be a routine procedure?


By whom? If the data you're using was collected by Leonardo, QA is
a tad problematic.

Or do you expect to use invalid
data to get valid results?


Think of the log tables which were produced and printed. If there
is one typo, and somebody used that number to record a data set.


Why would you use a log table to record data? Logs would be used for
some sort of transformation, not raw data, and, unless you have an
old Pentium, not really an issue today.


You are wasting my time. This is a very serious matter and requires a
lot of thinking, discussion, and consideration. Your thinking style
is extremely short-term.


Now get
in your time machine and come back to today. The data set may be
used as input for a lot of analyses today. Now answer your question.
My answer would be yes; at some point you have to use what is
available.


Then it would appear as an instrumental error, either as an outlier,
or buried in the noise.

These are aspects of bit recordings I've been trying to solve for
decades. All of my work was involved with shipping code to
customers. All of this discussion reminds me of the work I did.
There are CATCH-22s, deadly embraces, and impossibilities which are
caused by working with data which is invisible to the human eye.


It sounds like you may be too close to be objective.


I have no idea what you mean. I know a lot about bit integrity and
shipping it. I also know how much work is required.


Then you should also know how important reliable data and programs are.


Yes, I do know.

Taxpayers shouldn't be paying for crappy work.


How are your taxes paying for East Anglica?


Grants from the US government.

That's far more expensive
than doing it right.


Define "right". So far, you've insisted that the public be the sanity
check. That won't work with any science endeavor. In fact, it will
stop all science because the "rules" will eventually state that no
science work can be done unless the public approves.


You know, I try to be just as cynical as I can, but I think you have me
beat there.

That puts the
decisions right back into Congress and other legislative bodies. The
only countries who will be doing science are the communist-based
countries because they don't give a **** about public opinion at the
detail level.


Dictatorships are always more efficient in the short run. Long term,
freedom wins.

I'm definitely not talking about a contract.

Then who's paying for it? If it's not taxpayers, then I
really don't care how it's done. If it is from taxes, then
there better be an enforceable contract in place, or we'll be
right back where we are now.

Contract law is different in each and every country.
So? There are still enforceable contracts. How would you do
international business without them?
You sign a contract for each country or entity in which you want
to do business.
Exactly. Why were you trying to make an issue of such an obvious
point?
You're the one who started to talk about contracts.

Which taxpayers do you think paid for the gathering of that
data? Who pays for the data the maritime business provides?
Don't know, don't care. Are you saying the IPCC is not
tax-funded?

Where did our $50B go, then? I think grants are generally in
the form of contracts.


You don't even know how things get done.

Again, you might be surprised.


Not at all. You have no idea how much work is involved.


We paid for a lot of work that now appears useless.


It is useless because everybody seems to have depended on one, and
only one, entity for their sources. That is a bloody procedural
problem in the science biz. There aren't independent sources nor
studies being used by the politicians nor the UN nor, science
conclusions. With the advent of the thingie called the WWW, the
myths become the facts at light speed.
Exactly. Researchers might be a little more careful if they know
someone else is watching. In fact, I'd say the way they treated
Steve M is proof positive they would.
Huh?


Sorry. I was referring to Stephen McIntyre of Climateaudit.org, who
has been trying to keep the Jones, Mann, Briffa et al gang honest for a
decade or so. I assumed you'd be familiar with him and his debunking
of the Hockey Stick icon. The leaked emails give insight into
the attitudes involved in trying to hide the data and algorithms from
him.


That is one place and one small group of people. Are you implying that
all groups in the world are similar?


In the climate area, yes. The IPCC is the dictator, and the US has been
following along, paying the bills. No serious discussion has been
allowed.

If you're really not familiar with the issue, I'd recommend checking
out climateaudit.org. It might help you to understand the need to make
copies of raw data available online.


I don't need to do that; I am able to think.


Climateaudit will give you something to think about. It doesn't try to
substitute for your own thoughts.

I'd rather pay for
careful work done in an open, transparent manner. It's cheaper
than having to redo it.


But that open, transparent manner is expensive, difficult, and
impossible (unless you develop a time machine) in some cases.
Storing data is not trivial, whether it's public or private.

Take a look at all the problems the open source biz has for computer
code. That "open, transparent manner" is not a trivial
endeavour.


I didn't say it would be easy, just necessary, if we're going to get
any valid results from the clown brigade.


It isn't necessary. You're just trying to fix a symptom which will
not fix the real problem but hide the real problem and prevent work
from getting done.


Review the leaked emails and climateaudit.org and see if you still
think so.


Which "leaked" emails? You're assuming conspiracies all over. Why do you
seem to be suddenly surprised that their hypothesis about global warming
may not be valid?


Well, the leaked emails seem to provide evidence for the chicanery we've
all suspected. Some of it looks illegal, perhaps felonious.

  #55   Report Post  
Old December 17th 09, 01:20 PM posted to alt.global-warming,alt.politics.libertarian,sci.geo.meteorology,sci.physics
external usenet poster
 
First recorded activity by Weather-Banter: Feb 2009
Posts: 59
Default Can Global Warming Predictions be Tested with Observations of the Real Climate System?

Bill Ward wrote:
On Wed, 16 Dec 2009 07:53:38 -0500, jmfbahciv wrote:

Bill Ward wrote:
On Tue, 15 Dec 2009 08:22:09 -0500, jmfbahciv wrote:

Bill Ward wrote:
On Mon, 14 Dec 2009 09:57:19 -0500, jmfbahciv wrote:

Bill Ward wrote:
On Sat, 12 Dec 2009 09:53:43 -0500, jmfbahciv wrote:

Bill Ward wrote:
On Fri, 11 Dec 2009 09:03:37 -0500, jmfbahciv wrote:

Bill Ward wrote:

snip

I'm not disagreeing with that,
I'm just saying no matter who uses the data, it must be transcribed
into a usable format.
Then it is not the raw data. The suggestion was to provide the raw
data. That means that the original collection of bits has to be
copied bit for bit with no modification. A lot of copy operations
insert zero bits for alignment.
I think the disagreement is simply semantic.

No, it's not.

You aren't considering
"copies of the raw data", "raw data". I do, as long as the copy is not
corrupt.

How do you know that? You can't unless the person who copied it did a
BINCOM or something to verify that no bits were changed and no fills
were inserted.


"Read after write" verification has been around quite a while. My system
does it automatically. Doesn't yours?


Does yours include the nulls when comparing one file to the other or
skip them? I'm stating that you have to be careful. If you're
moving a binary data file from a 16-bit to 72-bit machine, you'll
have problems. You'll have a lot more problems if you're moving
a binary data file from a 72-bit to 16-bit machine. You'll have
even more problems if some of the collection was done using
single-word floating point format and later collections were done using
double-word floating point format. Mixed-mode data collections mean
that the raw data had better not be something that had been modified,
and this includes null fills.
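The width problem can be illustrated with today's IEEE 754 formats (a hedged analogy, since the machines in question used other word sizes): narrowing a reading from double-word to single-word floating point changes both the byte count and, usually, the value, so the converted file can no longer serve as the untouched sanity check.

```python
import struct

reading = 0.1  # a value originally recorded in double-word floating point

double_bits = struct.pack(">d", reading)  # 8-byte representation
single_bits = struct.pack(">f", reading)  # the same reading "copied" into 4 bytes
narrowed = struct.unpack(">f", single_bits)[0]

# Different byte counts, and the narrowed value no longer equals the original.
```

Reading the single-word copy back gives a value close to, but not equal to, the recorded 0.1; decades later an analysis has no way to tell that drift from a real signal.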



I call the original raw data, "the original raw data", while you insist
it's the only "raw data".

People are lazy and don't say "original raw data"; they say raw data.
The term has an implied meaning so that 5 hours of discussion doesn't
have to be done to clarify the meaning.

You sound like one of our bloody editors who insisted, until I raised
the roof, that all occurrences of CPU in our documentation be changed to
central processing unit.


As I said, it's just semantics.


It is not just semantics. The terms we use in the computer biz imply
strict specifications. If they didn't, nothing would have gotten done.
The same thing happens in math and science when you say the word
derivative.


If researchers are cutting corners on data integrity, posting it on
line would be one way to stop that. If they are doing it right, then
there should be no problems in making it available on line.


And I've been trying to tell you that it takes lots of time, money,
and babysitting to make that available. If you pass a rule that all
data has to be made public, resources, which would have normally been
used for the real research, will have to be used for babysitting those
bits. The real, science work will not get done.
Well, it wasn't posted, it was hidden, and now a lot of climate
"science" looks bogus. Perhaps reformatting, verifying and posting
copies of the raw data is a necessary part of the "real science".
Think of it as insurance.

But there existed other scientists who did not agree and said so. That
one place not providing their data when requested is not the problem.
Fixing them and their anal retention will not fix the real problem.


What do you think is the "real problem", then? Can it be solved without
changing human nature?


Political corruption that is out of control. It can be solved but
the side effects are unappetizing.

snip

Taxpayers shouldn't be paying for crappy work.

How are your taxes paying for East Anglica?


Grants from the US government.


Huh?


That's far more expensive
than doing it right.

Define "right". So far, you've insisted that the public be the sanity
check. That won't work with any science endeavor. In fact, it will
stop all science because the "rules" will eventually state that no
science work can be done unless the public approves.


You know, I try to be just as cynical as I can, but I think you have me
beat there.


I'm not being cynical. I'm stating what happens when this kind of fit
hits the shan.


That puts the
decisions right back into Congress and other legislative bodies. The
only countries who will be doing science are the communist-based
countries because they don't give a **** about public opinion at the
detail level.


Dictatorships are always more efficient in the short run. Long term,
freedom wins.



No, it doesn't. Unchecked freedom produces anarchies and destruction
of civilizations.

snip

Sorry. I was referring to Stephen McIntyre of Climateaudit.org, who
has been trying to keep the Jones, Mann, Briffa et al gang honest for a
decade or so. I assumed you'd be familiar with him and his debunking
of the Hockey Stick icon. The leaked emails give insight into
the attitudes involved in trying to hide the data and algorithms from
him.

That is one place and one small group of people. Are you implying that
all groups in the world are similar?


In the climate area, yes. The IPCC is the dictator, and the US has been
following along, paying the bills. No serious discussion has been
allowed.


I haven't noticed that the articles I've read in _Science News_ were
all vetted by those people. How did the US pay those bills? Who
did they write the check to?


If you're really not familiar with the issue, I'd recommend checking
out climateaudit.org. It might help you to understand the need to make
copies of raw data available online.

I don't need to do that; I am able to think.


Climateaudit will give you something to think about. It doesn't try to
substitute for your own thoughts.


Are you really suggesting that I not use analytical thinking and allow
somebody to do that work for me? Your attitude is part of the problem.

Which "leaked" emails? You're assuming conspiracies all over. Why do you
seem to be suddenly surprised that their hypothesis about global warming
may not be valid?


Well, the leaked emails seem to provide evidence for the chicanery we've
all suspected. Some of it looks illegal, perhaps felonious.


I thought I'd read here that they weren't leaked. Are you trying to
make a conspiracy out of the mess? That just makes more messes
and doesn't deal with the original.

/BAH


  #56   Report Post  
Old December 17th 09, 02:20 PM posted to alt.global-warming,alt.politics.libertarian,sci.geo.meteorology,sci.physics
external usenet poster
 
First recorded activity by Weather-Banter: Nov 2003
Posts: 935
Default Can Global Warming Predictions be Tested with Observations of the Real Climate System?

jmfbahciv wrote:
Bill Ward wrote:
On Wed, 16 Dec 2009 07:53:38 -0500, jmfbahciv wrote:

Bill Ward wrote:
On Tue, 15 Dec 2009 08:22:09 -0500, jmfbahciv wrote:

Bill Ward wrote:
On Mon, 14 Dec 2009 09:57:19 -0500, jmfbahciv wrote:

Bill Ward wrote:
On Sat, 12 Dec 2009 09:53:43 -0500, jmfbahciv wrote:

Bill Ward wrote:
On Fri, 11 Dec 2009 09:03:37 -0500, jmfbahciv wrote:

Bill Ward wrote:

snip

I'm not disagreeing with that,
I'm just saying no matter who uses the data, it must be transcribed
into a usable format.
Then it is not the raw data. The suggestion was to provide the raw
data. That means that the original collection of bits has to be
copied bit for bit with no modification. A lot of copy operations
insert zero bits for alignment.
I think the disagreement is simply semantic.
No, it's not.

You aren't considering
"copies of the raw data", "raw data". I do, as long as the copy is not
corrupt.
How do you know that? You can't unless the person who copied it did a
BINCOM or something to verify that no bits were changed and no fills
were inserted.


"Read after write" verification has been around quite a while. My
system does it automatically. Doesn't yours?


Does yours include the nulls when comparing one file to the other or
skip them? I'm stating that you have to be careful. If you're
moving a binary data file from a 16-bit to 72-bit machine, you'll
have problems. You'll have a lot more problems if you're moving
a binary data file from a 72-bit to 16-bit machine. You'll have
even more problems if some of the collection was done using
single-word floating point format and later collections were done using
double-word floating point format. Mixed-mode data collections mean
that the raw data had better not be something that had been modified,
and this includes null fills.


In the old days raw binary floating point was a nightmare to transport
since almost every manufacturer had a slightly different machine
representation before IEEE standardisation. I recall the pain and
suffering deciding what to do about certain states that could occur in
the original raw data and could not be safely represented on the
destination machine (i.e. loading the fp data would cause a denorm error).
They were very small numbers so we settled reluctantly on zero.
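The reluctant compromise described there, zeroing values the destination machine could not load safely, can be sketched in modern terms (hypothetical: IEEE 754 subnormals stand in for the pre-standard denorms):

```python
import sys

# Smallest positive normal double, roughly 2.2e-308 for IEEE 754.
SMALLEST_NORMAL = sys.float_info.min


def flush_to_zero(values):
    """Zero out readings too small for the destination format to load
    safely, keeping everything else bit-exact."""
    return [0.0 if v != 0.0 and abs(v) < SMALLEST_NORMAL else v for v in values]
```

For example, flush_to_zero([1.0, 5e-324, -0.25]) keeps the representable values and zeroes the subnormal one; the point of the anecdote is that even this "safe" conversion means the transported file is no longer the raw data.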

Taxpayers shouldn't be paying for crappy work.
How are your taxes paying for East Anglica?


Grants from the US government.


Huh?


International research funding means it is quite possible that CRU had
the odd US researcher there working on an NSF grant or part funded by
them. Same as UK researchers may visit US facilities to make
observations or conduct experiments. CRU is an internationally renowned
research centre (BTW it is in East Anglia).

A lot of CRU data is online at CISL at UCAR too. eg
http://dss.ucar.edu/datasets/ds579.0/

Many of the FOI enquiries are done for the sole purpose of harassing the
researchers and preventing them from doing their jobs. The same thing
happens to local councils and other institutions though more usually by
organised green campaigners in the UK.

That's far more expensive
than doing it right.
Define "right". So far, you've insisted that the public be the sanity
check. That won't work with any science endeavor. In fact, it will
stop all science because the "rules" will eventually state that no
science work can be done unless the public approves.


You know, I try to be just as cynical as I can, but I think you have
me beat there.


I'm not being cynical. I'm stating what happens when this kind of fit
hits the shan.


There has been a failure to communicate the real science to the general
public though. The Exxon funded denialist think tanks have been allowed
to muddy the water for far too long without being properly challenged.

Scientists are not used to having to deal with public relations and
media spin so CRU and the University of East Anglia didn't make a good
fist of handling the initial enquiries about their data breach. The
first defence of the researchers was actually made by a researcher (not
a climate scientist at CRU) who did an eloquent job on BBC Radio 4 PM
against the express instructions of the University authorities who were
still hoping it would blow over.

The former UK Chief Science Advisor David King last night on BBC
Newsnight said that the hack against CRU was an extraordinarily
sophisticated piece of work, typical of a government agency. I didn't
think he was all that good in the interview, and communicating science
research to the public is a serious problem. People simply do not trust
scientists now and several guests made completely dishonest claims about
AGW based on what they have read online. These went unchallenged since
the scientists were not present for the audience discussion.

The Newsnight piece is online at:
http://news.bbc.co.uk/1/hi/programme...ht/8418356.stm

Unsure if you can watch it online outside of the UK.

That puts the
decisions right back into Congress and other legislative bodies. The
only countries who will be doing science are the communist-based
countries because they don't give a **** about public opinion at the
detail level.


Dictatorships are always more efficient in the short run. Long term,
freedom wins.


No, it doesn't. Unchecked freedom produces anarchies and destruction
of civilizations.
snip


Benign dictatorships are the most efficient, but unfortunately they do
not stay that way for long. Democracy is the least bad alternative
although it helps if you have at least three political parties. The US
style bipolar disorder in politics makes it impossible to avoid a
situation where if the Democrats are for something the Republicans are
automatically against it and vice-versa. A recipe for deadlock.

Sorry. I was referring to Stephen McIntyre of Climateaudit.org, who
has been trying to keep the Jones, Mann, Briffa et al gang honest for a
decade or so. I assumed you'd be familiar with him and his debunking
of the Hockey Stick icon. The leaked emails provide give insight into
the attitudes involved in trying to hide the data and algorithms from
him.
That is one place and one small group of people. Are you implying that
all groups in the world are similar?


In the climate area, yes. The IPCC is the dictator, and the US has
been following along, paying the bills. No serious discussion has
been allowed.


I haven't noticed that the articles I've read in _Science News_ were
all vetted by those people. How did the US pay those bills? Who
did they write the check to?


The IPCC collates the science and distils it into a summary form where
policy makers can understand it without having to read all the primary
literature. It is actually a well balanced piece of work and highlights
the uncertainties and areas still needing more research as well as the
conclusions that can be drawn from the existing data. Online at:

http://ipcc-wg1.ucar.edu/wg1/wg1-report.html

Have a look and see what you think. There are references into the
primary literature if you want to take it further.


If you're really not familiar with the issue, I'd recommend checking
out climateaudit.org. It might help you to understand the need to make
copies of raw data available online.
I don't need to do that; I am able to think.


Climateaudit will give you something to think about. It doesn't try
to substitute for your own thoughts.


Are you really suggesting that I not use analytical thinking and allow
somebody to do that work for me? Your attitude is part of the problem.


He starts from the result his politics insists must be right and then
looks for cherry picked data to support that position. Climateaudit is
one of those sites that provides the unthinking denialist with ammunition.

Which "leaked" emails? You're assuming conspiracies all over. Why do you
seem to be suddenly surprised that their hypothesis about global warming
may not be valid?


Well, the leaked emails seem to provide evidence for the chicanery
we've all suspected. Some of it looks illegal, perhaps felonious.


I thought I'd read here that they weren't leaked. Are you trying to
make a conspiracy out of the mess? That just makes more messes
and doesn't deal with the original.


They were hacked and by the sounds of it by a very professional team.
Unclear as yet whether it was a national security service or a loner
looking for UFOs (like the unfortunate McKinnon who is being extradited
to the USA as a terrorist for hacking secure DOD computers with
UID=guest/pw=guest etc.). I am inclined to think it is the DOD sysadmins
deserving the jail terms.

Regards,
Martin Brown
  #57   Report Post  
Old December 17th 09, 02:30 PM posted to alt.global-warming,alt.politics.libertarian,sci.geo.meteorology,sci.physics
external usenet poster
 
First recorded activity by Weather-Banter: Feb 2009
Posts: 197
Default Can Global Warming Predictions be Tested with Observations of the Real Climate System?

On Thu, 17 Dec 2009 08:20:38 -0500, jmfbahciv wrote:

Bill Ward wrote:
On Wed, 16 Dec 2009 07:53:38 -0500, jmfbahciv wrote:

Bill Ward wrote:
On Tue, 15 Dec 2009 08:22:09 -0500, jmfbahciv wrote:

Bill Ward wrote:
On Mon, 14 Dec 2009 09:57:19 -0500, jmfbahciv wrote:

Bill Ward wrote:
On Sat, 12 Dec 2009 09:53:43 -0500, jmfbahciv wrote:

Bill Ward wrote:
On Fri, 11 Dec 2009 09:03:37 -0500, jmfbahciv wrote:

Bill Ward wrote:

snip

I'm not disagreeing with that,
I'm just saying no matter who uses the data, it must be transcribed
into a usable format.
Then it is not the raw data. The suggestion was to provide the raw
data. That means that the original collection of bits has to be
copied bit for bit with no modification. A lot of copy operations
insert zero bits for alignment.
I think the disagreement is simply semantic.
No, it's not.

You aren't considering
"copies of the raw data", "raw data". I do, as long as the copy is
not corrupt.
How do you know that? You can't unless the person who copied it did
a BINCOM or something to verify that no bits were changed and no
fills were inserted.


"Read after write" verification has been around quite a while. My
system does it automatically. Doesn't yours?


Does yours include the nulls when comparing one file to the other, or
skip them? I'm stating that you have to be careful. If you're moving a
binary data file from a 16-bit to a 72-bit machine, you'll have problems.
You'll have a lot more problems if you're moving a binary data file from
a 72-bit to a 16-bit machine. You'll have even more problems if some of
the collection was done using single-word floating-point format and
later collections were done using double-word floating-point format.
Mixed-mode data collections mean that the raw data had better not be
something that has been modified, and this includes null fills.


Perhaps that explains the popularity of text files.
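The bit-for-bit verification being argued over above can be sketched with a checksum comparison. This is a minimal Python illustration, with placeholder file names, showing that even a single zero-fill byte inserted during a copy changes the digest:

```python
import hashlib

def sha256_of(path, chunk=65536):
    """Return the SHA-256 digest of a file, read in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        while block := f.read(chunk):
            h.update(block)
    return h.hexdigest()

# Hypothetical files: a 4-byte original and a "copy" that picked up
# one alignment null along the way.
with open("original.bin", "wb") as f:
    f.write(b"\x01\x02\x03\x04")
with open("copy.bin", "wb") as f:
    f.write(b"\x01\x02\x03\x04\x00")

# The single inserted null is enough to change the digest.
assert sha256_of("original.bin") != sha256_of("copy.bin")
```

Comparing whole-file digests catches inserted fill bytes that a naive record-by-record comparison might silently skip.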

I call the original raw data, "the original raw data", while you
insist it's the only "raw data".
People are lazy and don't say "original raw data"; they say raw data.
The term has an implied meaning so that 5 hours of discussion doesn't
have to be done to clarify the meaning.

You sound like one of our bloody editors who insisted, until I raised
the roof, that all occurrences of CPU in our documentation be changed
to central processing unit.


As I said, it's just semantics.


It is not just semantics. The terms we use in the computer biz imply
strict specifications. If they didn't, nothing would have gotten done.
The same thing happens in math and science when you say the word
derivative.


Not to mention finance. It's still semantics, just context sensitive.

If researchers are cutting corners on data integrity, posting it on
line would be one way to stop that. If they are doing it right,
then there should be no problems in making it available on line.


And I've been trying to tell you that it takes lots of time, money,
and babysitting to make that available. If you pass a rule that all
data has to be made public, resources, which would have normally
been used for the real research, will have to be used for
babysitting those bits. The real, science work will not get done.


Well, it wasn't posted, it was hidden, and now a lot of climate
"science" looks bogus. Perhaps reformatting, verifying and posting
copies of the raw data is a necessary part of the "real science".
Think of it as insurance.


But there existed other scientists who did not agree and said so.
That one place not providing their data when requested is not the
problem. Fixing them and their anal retention will not fix the real
problem.


What do you think is the "real problem", then? Can it be solved
without changing human nature?


Political corruption that is out of control. It can be solved but the
side effects are unappetizing.


It's been five thousand years and counting, yet we still have the same
problems. I suspect human nature has something to do with it. The
operating system has to fit the processors using it. We may be making
progress, but slowly.

snip

Taxpayers shouldn't be paying for crappy work.
How are your taxes paying for East Anglica?


Grants from the US government.


Huh?


Yup, check the unexpected data dump from CRU for more information.


That's far more expensive
than doing it right.
Define "right". So far, you've insisted that the public be the sanity
check. That won't work with any science endeavor. In fact, it
will stop all science because the "rules" will eventually state that
no science work can be done unless the public approves.


You know, I try to be just as cynical as I can, but I think you have me
beat there.


I'm not being cynical. I'm stating what happens when this kind of fit
hits the shan.


That puts the
decisions right back into Congress and other legislative bodies. The
only countries who will be doing science are the communist-based
countries because they don't give a **** about public opinion at the
detail level.


Dictatorships are always more efficient in the short run. Long term,
freedom wins.


No, it doesn't. Unchecked freedom produces anarchies and destruction of
civilizations.


And who exactly is qualified to "check freedom"? Right now it seems "We
the People" is the best answer.

snip

Sorry. I was referring to Stephen McIntyre of Climateaudit.org, who
has been trying to keep the Jones, Mann, Briffa et al gang honest for
a decade or so. I assumed you'd be familiar with him and his
debunking of the Hockey Stick icon. The leaked emails give
insight into the attitudes involved in trying to hide the data and
algorithms from him.


That is one place and one small group of people. Are you implying
that all groups in the world are similar?


In the climate area, yes. The IPCC is the dictator, and the US has
been following along, paying the bills. No serious discussion has been
allowed.


I haven't noticed that the articles I've read in _Science News_ were all
vetted by those people. How did the US pay those bills?


They printed money, as usual.

Who did they write the check to?


Presumably the UEA, but I'm not really sure. It might have gone through
the UN and IPCC.

If you're really not familiar with the issue, I'd recommend checking
out climateaudit.org. It might help you to understand the need to
make copies of raw data available online.


I don't need to do that; I am able to think.


Climateaudit will give you something to think about. It doesn't try to
substitute for your own thoughts.


Are you really suggesting that I not use analytical thinking and allow
somebody to do that work for me?


Not really. I'm suggesting you're going off half-cocked, without enough
background on the AGW issue. ClimateAudit or WUWT would provide some
context for you to think about. Read RealClimate also, just for grins,
keeping in mind the emails show it was part of the scam.

Analytical thinking is only as good as the data to which you apply it.

Your attitude is part of the problem.


Yeah, everyone is always telling me that. You can see how effective that
is. ;-)

Which "leaked" emails? You're assuming conspiracies all over. Why do you
seem to be suddenly surprised that their hypothesis about global
warming may not be valid?


Well, the leaked emails seem to provide evidence for the chicanery
we've all suspected. Some of it looks illegal, perhaps felonious.


I thought I've read here that they weren't leaked.


Don't believe everything you read here.

Are you trying to make a conspiracy out of the mess?


No, that would be the emails that were unexpectedly released. I'm just
pointing it out.

That just makes more messes and doesn't deal with the original.


Ignoring a scandal seldom makes it better. It just encourages the
scammers.


  #58
Old December 17th 09, 03:22 PM posted to alt.global-warming,alt.politics.libertarian,sci.geo.meteorology,sci.physics
external usenet poster
 
First recorded activity by Weather-Banter: Jul 2009
Posts: 438
Default Can Global Warming Predictions be Tested with Observations of the Real Climate System?

On Thu, 17 Dec 2009 14:20:43 +0000, Martin Brown
wrote:

jmfbahciv wrote:
Bill Ward wrote:
On Wed, 16 Dec 2009 07:53:38 -0500, jmfbahciv wrote:

Bill Ward wrote:
On Tue, 15 Dec 2009 08:22:09 -0500, jmfbahciv wrote:

Bill Ward wrote:
On Mon, 14 Dec 2009 09:57:19 -0500, jmfbahciv wrote:

Bill Ward wrote:
On Sat, 12 Dec 2009 09:53:43 -0500, jmfbahciv wrote:

Bill Ward wrote:
On Fri, 11 Dec 2009 09:03:37 -0500, jmfbahciv wrote:

Bill Ward wrote:

snip

I'm not disagreeing with that,
I'm just saying no matter who uses the data, it must be transcribed
into a usable format.
Then it is not the raw data. The suggestion was to provide the raw
data. That means that the original collection of bits has to be
copied bit for bit with no modification. A lot of copy operations
insert zero bits for alignment.
I think the disagreement is simply semantic.
No, it's not.

You aren't considering
"copies of the raw data", "raw data". I do, as long as the copy is not
corrupt.
How do you know that? You can't unless the person who copied it did a
BINCOM or something to verify that no bits were changed and no fills
were inserted.

"Read after write" verification has been around quite a while. My
system does it automatically. Doesn't yours?


Does yours include the nulls when comparing one file to the other, or
skip them? I'm stating that you have to be careful. If you're
moving a binary data file from a 16-bit to a 72-bit machine, you'll
have problems. You'll have a lot more problems if you're moving
a binary data file from a 72-bit to a 16-bit machine. You'll have
even more problems if some of the collection was done using
single-word floating-point format and later collections were done using
double-word floating-point format. Mixed-mode data collections mean
that the raw data had better not be something that has been modified,
and this includes null fills.


In the old days raw binary floating point was a nightmare to transport
since almost every manufacturer had a slightly different machine
representation before IEEE standardisation. I recall the pain and
suffering deciding what to do about certain states that could occur in
the original raw data and could not be safely represented on the
destination machine (ie loading the fp data would cause a denorm error).
They were very small numbers so we settled reluctantly on zero.
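The denormal problem described above can be illustrated with today's IEEE 754 tooling. A minimal Python sketch, assuming the policy Martin describes (flush values below the smallest normal double to zero):

```python
import struct

# Smallest positive normal IEEE 754 double, 2**-1022.
SMALLEST_NORMAL = 2.2250738585072014e-308

def clamp_subnormal(x):
    """Flush subnormal doubles to zero, the reluctant choice described above."""
    return 0.0 if 0.0 < abs(x) < SMALLEST_NORMAL else x

value = 1.5e-310                      # a subnormal double
packed = struct.pack(">d", value)     # big-endian IEEE 754 bytes
restored = struct.unpack(">d", packed)[0]
assert restored == value              # the byte round-trip is exact

assert clamp_subnormal(value) == 0.0  # would-be denorm flushed to zero
assert clamp_subnormal(1.0) == 1.0    # normal values pass through untouched
```

With every machine agreeing on the byte layout, the only remaining policy question is what to do with values the destination cannot load, which is exactly the choice described above.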

Taxpayers shouldn't be paying for crappy work.
How are your taxes paying for East Anglica?

Grants from the US government.


Huh?


International research funding means it is quite possible that CRU had
the odd US researcher there working on an NSF grant or part funded by
them. Same as UK researchers may visit US facilities to make
observations or conduct experiments. CRU is an internationally renowned
research centre (BTW it is in East Anglia).

A lot of CRU data is online at CISL at UCAR too. eg
http://dss.ucar.edu/datasets/ds579.0/

Many of the FOI enquiries are done for the sole purpose of harassing the
researchers and preventing them from doing their jobs. The same thing
happens to local councils and other institutions though more usually by
organised green campaigners in the UK.

That's far more expensive
than doing it right.
Define "right". So far, you've insisted that the public be the sanity
check. That won't work with any science endeavor. In fact, it will
stop all science because the "rules" will eventually state that no
science work can be done unless the public approves.

You know, I try to be just as cynical as I can, but I think you have
me beat there.


I'm not being cynical. I'm stating what happens when this kind of fit
hits the shan.


There has been a failure to communicate the real science to the general
public though. The Exxon funded denialist think tanks have been allowed
to muddy the water for far too long without being properly challenged.

Scientists are not used to having to deal with public relations and
media spin so CRU and the University of East Anglia didn't make a good
fist of handling the initial enquiries about their data breach. The
first defence of the researchers was actually made by a researcher (not
a climate scientist at CRU) who did an eloquent job on BBC Radio 4 PM
against the express instructions of the University authorities who were
still hoping it would blow over.

The former UK Chief Science Advisor David King last night on BBC
Newsnight said that the hack against CRU was an extraordinarily
sophisticated piece of work, typical of a government agency. I didn't
think he was all that good in the interview, and communicating science
research to the public is a serious problem. People simply do not trust
scientists now and several guests made completely dishonest claims about
AGW based on what they have read online. These went unchallenged since
the scientists were not present for the audience discussion.

The Newsnight piece is online at:
http://news.bbc.co.uk/1/hi/programme...ht/8418356.stm

Unsure if you can watch it online outside of the UK.

That puts the
decisions right back into Congress and other legislative bodies. The
only countries who will be doing science are the communist-based
countries because they don't give a **** about public opinion at the
detail level.

Dictatorships are always more efficient in the short run. Long term,
freedom wins.


No, it doesn't. Unchecked freedom produces anarchies and destruction
of civilizations.
snip


Benign dictatorships are the most efficient, but unfortunately they do
not stay that way for long. Democracy is the least bad alternative
although it helps if you have at least three political parties. The US
style bipolar disorder in politics makes it impossible to avoid a
situation where if the Democrats are for something the Republicans are
automatically against it and vice-versa. A recipe for deadlock.

Sorry. I was referring to Stephen McIntyre of Climateaudit.org, who
has been trying to keep the Jones, Mann, Briffa et al gang honest for a
decade or so. I assumed you'd be familiar with him and his debunking
of the Hockey Stick icon. The leaked emails give insight into
the attitudes involved in trying to hide the data and algorithms from
him.
That is one place and one small group of people. Are you implying that
all groups in the world are similar?

In the climate area, yes. The IPCC is the dictator, and the US has
been following along, paying the bills. No serious discussion has
been allowed.


I haven't noticed that the articles I've read in _Science News_ were
all vetted by those people. How did the US pay those bills? Who
did they write the check to?


The IPCC collates the science and distils it into a summary form where
policy makers can understand it without having to read all the primary
literature. It is actually a well balanced piece of work and highlights
the uncertainties and areas still needing more research as well as the
conclusions that can be drawn from the existing data. Online at:

http://ipcc-wg1.ucar.edu/wg1/wg1-report.html

Have a look and see what you think. There are references into the
primary literature if you want to take it further.


If you're really not familiar with the issue, I'd recommend checking
out climateaudit.org. It might help you to understand the need to make
copies of raw data available online.
I don't need to do that; I am able to think.

Climateaudit will give you something to think about. It doesn't try
to substitute for your own thoughts.


Are you really suggesting that I not use analytical thinking and allow
somebody to do that work for me? Your attitude is part of the problem.


He starts from the result his politics insists must be right and then
looks for cherry picked data to support that position. Climateaudit is
one of those sites that provides the unthinking denialist with ammunition.

Which "leaked" emails? You're assuming conspiracies all over. Why do you
seem to be suddenly surprised that their hypothesis about global warming
may not be valid?

Well, the leaked emails seem to provide evidence for the chicanery
we've all suspected. Some of it looks illegal, perhaps felonious.


I thought I've read here that they weren't leaked. Are you trying to
make a conspiracy out of the mess? That just makes more messes
and doesn't deal with the original.


They were hacked and by the sounds of it by a very professional team.
Unclear as yet whether it was a national security service or a loner
looking for UFOs (like the unfortunate McKinnon who is being extradited
to the USA as a terrorist for hacking secure DOD computers with
UID=guest/pw=guest etc.). I am inclined to think it is the DOD sysadmins
deserving the jail terms.

Regards,
Martin Brown


Your opinion is bizarre; there is no comparison
between military security and climate information.

Will all climate scientists please resign and
get a job doing something useful.








  #59
Old December 18th 09, 01:14 PM posted to alt.global-warming,alt.politics.libertarian,sci.geo.meteorology,sci.physics
external usenet poster
 
First recorded activity by Weather-Banter: Feb 2009
Posts: 59
Default Can Global Warming Predictions be Tested with Observations of the Real Climate System?

Martin Brown wrote:
jmfbahciv wrote:
Bill Ward wrote:
On Wed, 16 Dec 2009 07:53:38 -0500, jmfbahciv wrote:

Bill Ward wrote:
On Tue, 15 Dec 2009 08:22:09 -0500, jmfbahciv wrote:

Bill Ward wrote:
On Mon, 14 Dec 2009 09:57:19 -0500, jmfbahciv wrote:

Bill Ward wrote:
On Sat, 12 Dec 2009 09:53:43 -0500, jmfbahciv wrote:

Bill Ward wrote:
On Fri, 11 Dec 2009 09:03:37 -0500, jmfbahciv wrote:

Bill Ward wrote:

snip

I'm not disagreeing with that,
I'm just saying no matter who uses the data, it must be transcribed
into a usable format.
Then it is not the raw data. The suggestion was to provide the raw
data. That means that the original collection of bits has to be
copied bit for bit with no modification. A lot of copy operations
insert zero bits for alignment.
I think the disagreement is simply semantic.
No, it's not.

You aren't considering
"copies of the raw data", "raw data". I do, as long as the copy is
not
corrupt.
How do you know that? You can't unless the person who copied it did a
BINCOM or something to verify that no bits were changed and no fills
were inserted.

"Read after write" verification has been around quite a while. My
system does it automatically. Doesn't yours?


Does yours include the nulls when comparing one file to the other, or
skip them? I'm stating that you have to be careful. If you're
moving a binary data file from a 16-bit to a 72-bit machine, you'll
have problems. You'll have a lot more problems if you're moving
a binary data file from a 72-bit to a 16-bit machine. You'll have
even more problems if some of the collection was done using
single-word floating-point format and later collections were done using
double-word floating-point format. Mixed-mode data collections mean
that the raw data had better not be something that has been modified,
and this includes null fills.


In the old days raw binary floating point was a nightmare to transport
since almost every manufacturer had a slightly different machine
representation before IEEE standardisation. I recall the pain and
suffering deciding what to do about certain states that could occur in
the original raw data and could not be safely represented on the
destination machine (ie loading the fp data would cause a denorm error).
They were very small numbers so we settled reluctantly on zero.



In those kinds of cases, the raw data had to remain unmodified.
A lot of networking software would zero-fill. OSes that were
RMS-based could wreak havoc with any file that was copied from,
to, or through them. I'm extremely concerned that quite a few people
here think that any old set of data can be thrown onto a server
with no human caring for it over time, and that the file(s) would still
represent accurate raw data.

Taxpayers shouldn't be paying for crappy work.
How are your taxes paying for East Anglica?

Grants from the US government.


Huh?


International research funding means it is quite possible that CRU had
the odd US researcher there working on an NSF grant or part funded by
them.


But the US taxpayer did not provide all of the funding, which is
what [whathisname] wanted to imply.

Same as UK researchers may visit US facilities to make
observations or conduct experiments. CRU is an internationally renowned
research centre (BTW it is in East Anglia).


Thanks for the spelling correction. I appreciate it :-)


A lot of CRU data is online at CISL at UCAR too. eg
http://dss.ucar.edu/datasets/ds579.0/

Many of the FOI enquiries are done for the sole purpose of harassing the
researchers and preventing them from doing their jobs. The same thing
happens to local councils and other institutions though more usually by
organised green campaigners in the UK.


So why don't the conspiracy nuts start foaming at the mouth about these
people whose goal is to prevent all useful work being done? That's one
of the ironies I don't understand in the real world.


That's far more expensive
than doing it right.
Define "right". So far, you've insisted that the public be the sanity
check. That won't work with any science endeavor. In fact, it
will
stop all science because the "rules" will eventually state that no
science work can be done unless the public approves.

You know, I try to be just as cynical as I can, but I think you have
me beat there.


I'm not being cynical. I'm stating what happens when this kind of fit
hits the shan.


There has been a failure to communicate the real science to the general
public though. The Exxon funded denialist think tanks have been allowed
to muddy the water for far too long without being properly challenged.


It wouldn't matter iff scam artists such as Al Gore didn't get into the
mix. Using a presidential party platform is one of the tactics to make
big huge messes.


Scientists are not used to having to deal with public relations and
media spin so CRU and the University of East Anglia didn't make a good
fist of handling the initial enquiries about their data breach. The
first defence of the researchers was actually made by a researcher (not
a climate scientist at CRU) who did an eloquent job on BBC Radio 4 PM
against the express instructions of the University authorities who were
still hoping it would blow over.

The former UK Chief Science Advisor David King last night on BBC
Newsnight said that the hack against CRU was an extraordinarily
sophisticated piece of work, typical of a government agency. I didn't
think he was all that good in the interview, and communicating science
research to the public is a serious problem. People simply do not trust
scientists now


But the general public trusts the politicians. That makes no sense
to me.

and several guests made completely dishonest claims about
AGW based on what they have read online. These went unchallenged since
the scientists were not present for the audience discussion.


Which makes me smell the bias of the BBC.


The Newsnight piece is online at:
http://news.bbc.co.uk/1/hi/programme...ht/8418356.stm

Unsure if you can watch it online outside of the UK.

That puts the
decisions right back into Congress and other legislative bodies. The
only countries who will be doing science are the communist-based
countries because they don't give a **** about public opinion at the
detail level.

Dictatorships are always more efficient in the short run. Long term,
freedom wins.


No, it doesn't. Unchecked freedom produces anarchies and destruction
of civilizations.
snip


Benign dictatorships are the most efficient, but unfortunately they do
not stay that way for long.


That's because humans are put in charge :-). The meanest sociopath
acquires the power within 2 decades of work.

Democracy is the least bad alternative


Democracy is not 100% freedom; it is a mixture of freedom and equality.
The one rein checks the other.

although it helps if you have at least three political parties. The US
style bipolar disorder in politics makes it impossible to avoid a
situation where if the Democrats are for something the Republicans are
automatically against it and vice-versa. A recipe for deadlock.


Which is a feature.


Sorry. I was referring to Stephen McIntyre of Climateaudit.org, who
has been trying to keep the Jones, Mann, Briffa et al gang honest
for a
decade or so. I assumed you'd be familiar with him and his debunking
of the Hockey Stick icon. The leaked emails give insight into
the attitudes involved in trying to hide the data and algorithms from
him.
That is one place and one small group of people. Are you implying that
all groups in the world are similar?

In the climate area, yes. The IPCC is the dictator, and the US has
been following along, paying the bills. No serious discussion has
been allowed.


I haven't noticed that the articles I've read in _Science News_ were
all vetted by those people. How did the US pay those bills? Who
did they write the check to?


The IPCC collates the science and distills it into a summary form where
policy makers can understand it without having to read all the primary
literature. It is actually a well balanced piece of work and highlights
the uncertainties and areas still needing more research as well as the
conclusions that can be drawn from the existing data. Online at:

http://ipcc-wg1.ucar.edu/wg1/wg1-report.html

Have a look and see what you think. There are references into the
primary literature if you want to take it further.


So the demand made within this thread was smoke. I'll try to get
to the library and take a look at it.



If you're really not familiar with the issue, I'd recommend checking
out climateaudit.org. It might help you to understand the need to
make
copies of raw data available online.
I don't need to do that; I am able to think.

Climateaudit will give you something to think about. It doesn't try
to substitute for your own thoughts.


Are you really suggesting that I not use analytical thinking and allow
somebody to do that work for me? Your attitude is part of the problem.


He starts from the result his politics insists must be right and then
looks for cherry picked data to support that position. Climateaudit is
one of those sites that provides the unthinking denialist with ammunition.

Which "leaked" emails? You're assuming conspiracies all over. Why do you
seem to be suddenly surprised that their hypothesis about global warming
may not be valid?

Well, the leaked emails seem to provide evidence for the chicanery
we've all suspected. Some of it looks illegal, perhaps felonious.


I thought I've read here that they weren't leaked. Are you trying to
make a conspiracy out of the mess? That just makes more messes
and doesn't deal with the original.


They were hacked and by the sounds of it by a very professional team.


New hard/software was also getting released within the same time frame.

Unclear as yet whether it was a national security service or a loner
looking for UFOs (like the unfortunate McKinnon who is being extradited
to the USA as a terrorist for hacking secure DOD computers with
UID=guest/pw=guest etc.). I am inclined to think it is the DOD sysadmins
deserving the jail terms.


I've been working with systems since the late 60s. Security is
extremely difficult to maintain, and the OS whose primary
goal was 100% security isn't available as a primary OS
anymore.

/BAH
  #60
Old December 18th 09, 01:16 PM posted to alt.global-warming,alt.politics.libertarian,sci.geo.meteorology,sci.physics
external usenet poster
 
First recorded activity by Weather-Banter: Feb 2009
Posts: 59
Default Can Global Warming Predictions be Tested with Observations of the Real Climate System?

Bill Ward wrote:
On Thu, 17 Dec 2009 08:20:38 -0500, jmfbahciv wrote:

Bill Ward wrote:
On Wed, 16 Dec 2009 07:53:38 -0500, jmfbahciv wrote:

Bill Ward wrote:
On Tue, 15 Dec 2009 08:22:09 -0500, jmfbahciv wrote:

Bill Ward wrote:
On Mon, 14 Dec 2009 09:57:19 -0500, jmfbahciv wrote:

Bill Ward wrote:
On Sat, 12 Dec 2009 09:53:43 -0500, jmfbahciv wrote:

Bill Ward wrote:
On Fri, 11 Dec 2009 09:03:37 -0500, jmfbahciv wrote:

Bill Ward wrote:

snip

I'm not disagreeing with that,
I'm just saying no matter who uses the data, it must be transcribed
into a usable format.
Then it is not the raw data. The suggestion was to provide the raw
data. That means that the original collection of bits has to be
copied bit for bit with no modification. A lot of copy operations
insert zero bits for alignment.
I think the disagreement is simply semantic.
No, it's not.

You aren't considering
"copies of the raw data", "raw data". I do, as long as the copy is
not corrupt.
How do you know that? You can't unless the person who copied it did
a BINCOM or something to verify that no bits were changed and no
fills were inserted.
"Read after write" verification has been around quite a while. My
system does it automatically. Doesn't yours?

Does yours include the nulls when comparing one file to the other, or
skip them? I'm stating that you have to be careful. If you're moving a
binary data file from a 16-bit to a 72-bit machine, you'll have problems.
You'll have a lot more problems if you're moving a binary data file from
a 72-bit to a 16-bit machine. You'll have even more problems if some of
the collection was done using single-word floating-point format and
later collections were done using double-word floating-point format.
Mixed-mode data collections mean that the raw data had better not be
something that has been modified, and this includes null fills.


Perhaps that explains the popularity of text files.


Which cannot be used as data. Period.
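For what it's worth on the text-file question: with modern IEEE 754 doubles, a shortest-round-trip decimal representation makes text interchange exactly lossless. A minimal Python sketch:

```python
# repr() of a Python float yields the shortest decimal string that
# parses back to the identical IEEE 754 double, so a plain-text dump
# of doubles loses nothing.
values = [0.1, 1/3, 2.2250738585072014e-308]
text_lines = [repr(v) for v in values]      # what would go into the file
restored = [float(s) for s in text_lines]   # what a reader gets back
assert restored == values
```

This only settles the representation problem; it says nothing about whether a decimal dump still counts as "raw" data in the sense argued above.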


I call the original raw data, "the original raw data", while you
insist it's the only "raw data".
People are lazy and don't say "original raw data"; they say raw data.
The term has an implied meaning so that 5 hours of discussion doesn't
have to be done to clarify the meaning.

You sound like one of our bloody editors who insisted, until I raised
the roof, that all occurrences of CPU in our documentation be changed
to central processing unit.
As I said, it's just semantics.

It is not just semantics. The terms we use in the computer biz imply
strict specifications. If they didn't, nothing would have gotten done.
The same thing happens in math and science when you say the word
derivative.


Not to mention finance. It's still semantics, just context sensitive.


You're nuts. I'm giving up trying to talk about this with you. The
subject deserves serious, careful thought.

snip...very reluctantly

/BAH

