Multimedia
Oct 12, 12:00 PM
The one I ordered the other day shipped yesterday, and I'm expecting delivery on Monday. I requested the forum coupon and will see if they'll credit me, but I don't know. I'm not planning on going through the brain damage of ordering another monitor with the coupon and sending one back just to save ~$100.
I currently have a 30" Dell that I bought last year when Dell first introduced them. I love the thing... My only gripe is one stuck pixel, but Dell requires something like 7 or more before they'll replace it, and I didn't swap the monitor within my 30-day window because the pixel didn't show up until nearly 3 months in. :(
I have an Apple 30" on my other G5 Quad, and though I've never had the two side by side, I think I like the Dell better. I use a Gefen 4x1 DVI-DL switcher and have the G5 and two PC systems connected to the Dell, with an extra cable for my MBP or whatnot if I want to connect that. I ordered the second 30" because I'm going to expand my desktop to dual 30" displays. :D I had to order another Gefen switcher for the second monitor too, since the G5 and one of my PC boxes both support dual-link DVI out of both DVI ports, as will the Mac Pro I'm planning to buy in the near future.
Wow, I didn't even know such an accessory existed:
Gefen 4x1 DVI DL Switcher (Parallel Control) $899 (http://www.gefen.com/kvm/product.jsp?prod_id=3499)
But the price is almost that of another screen! Holy moly. Do you have a better place to buy it for less? A link, please?
So are you going to go with the ATI dual dual-link DVI card in your Mac Pro? What card do you have in your Quad? I bought mine refurbished, and Apple doesn't sell a dual dual-link video card for it as a post-purchase upgrade, that I know of. Do you? I could just buy another cheap NVIDIA GeForce 6600 card that's missing the noisy fan. I don't do 3D or games.
AriX
May 2, 09:40 AM
I haven't seen this malware first-hand, but a ZIP file can be made with absolute paths, so "unzipping" the file puts everything where it needs to be to start up automatically on the next login/reboot.
Who's the brainiac who made ZIP files "safe"?
Archive Utility will not extract these types of ZIP files to their absolute system paths; I believe it forces the use of relative paths. I really doubt any reports that this malware can be installed without user interaction.
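To make the absolute-path point concrete, here's a minimal Python sketch. It builds a ZIP whose stored entry name is an absolute path (the LaunchAgents filename is a made-up example, not the actual malware's payload), then extracts it the sanitizing way, stripping the leading slash and refusing any entry that would escape the destination, which is roughly the behavior attributed to Archive Utility above:

```python
import os
import zipfile

def build_malicious_zip(path):
    # Store an entry whose name is an absolute path; a naive extractor
    # would write it straight into /Library. (Illustrative name only.)
    with zipfile.ZipFile(path, "w") as zf:
        zf.writestr("/Library/LaunchAgents/com.example.evil.plist", "<plist/>")

def safe_extract(zip_path, dest):
    # Force every entry onto a relative path under dest.
    dest_abs = os.path.abspath(dest)
    with zipfile.ZipFile(zip_path) as zf:
        for info in zf.infolist():
            relative = info.filename.lstrip("/")
            target = os.path.normpath(os.path.join(dest_abs, relative))
            if not target.startswith(dest_abs):  # also refuses ../ escapes
                raise ValueError("unsafe path: " + info.filename)
            os.makedirs(os.path.dirname(target), exist_ok=True)
            with open(target, "wb") as f:
                f.write(zf.read(info))

build_malicious_zip("evil.zip")
with zipfile.ZipFile("evil.zip") as zf:
    print(zf.namelist())  # the stored entry name really is absolute
safe_extract("evil.zip", "extracted")
```

With the safe extractor, the file lands under `extracted/Library/...` instead of the real `/Library`, which is why absolute paths alone shouldn't be enough for a drive-by install.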
Howdr
Mar 18, 08:35 AM
OMG, you still don't get it:
Let's try explaining it this way...
When you subscribe to cable, you pick a package that provides you with the channels that you want. There are various packages, but ultimately it's all just video streaming over a cable (bits in this day and age, not analog)...
Based on your and others' arguments, why can't we all just pay for basic cable and get all 500+ channels plus the premium channels for free? Very simply, you're paying for a package with specific features....
No no, as long as you abide by the amount of data in the plan it should not matter how you use it.
You can't steal what you paid for; you buy 100 cable channels, and that is what you get and use.
You buy 2GB and use 1GB, you have used 1GB no matter if it's on the phone or the laptop. 1GB = 1GB.
With your cellular service, you choose a package that meets your needs. You have 3 options for data plans at this point... well, 4 technically...
1) Your grandfathered unlimited plan
2) 250MB
3) Data Pro 2GB
4) Data Pro 2GB + Tethering 2GB, for a total of 4GB....
OK? The tethering gives you 2GB for the money, I see that, and I have read that the Tethering and Data Pro allotments are added to total 4GB for the charge. So you and AT&T prove my point, thank you! Data = data; they add it together and it is the same.
Tethering is not the same as using the data on your device; essentially, tethering is using your phone as a modem. Your data plan (which I'm assuming is either unlimited or 250MB) does not include the feature of using your phone as a modem, and that's what the extra charge is for....
If you want to tether, you need to pay for the appropriate package. Just like if you want HBO, Showtime, or HDTV you need to pay for the appropriate cable package...
LOL, no, it's the same use of data as on the phone.
Tethering does not do something different to AT&T; it's just using data.
You may not understand how data is used from the source, but I assure you there is no difference to AT&T between when you tether and when you surf YouTube on the phone.
To AT&T, data = data, and those have been their words, not mine, every time they print it.
So far I have not seen an argument that proves otherwise.:rolleyes:
kuwisdelu
Apr 12, 10:57 PM
I don't claim to know anything at all about professional video editing. I only listened to the live feed. And I can say that the FCP pros at NAB sounded like teenage girls at a Justin Bieber concert.
So I'm going to assume it's good.
AidenShaw
Oct 8, 07:54 AM
By Quad do you mean each slower Clovertown or a pair of faster Woodies?
I meant quad-core package (socket) - be it Clovertown/Woodcrest or Kentsfield/Conroe.
On a multi-threaded workflow, twice as many somewhat slower threads are better than half as many somewhat faster threads.
Of course, many desktop applications can't use four cores (or 8), and many feel "snappier" with fewer, faster cores.
_______________
In one demo at IDF, Intel showed a dual Woodie against the top Opteron.
The Woody was about 60% faster, using 80% of the power.
On stage, they swapped the Woodies with low-voltage Clovertowns which matched the power envelope of the Woodies that they removed. I think they said that the Clovertowns were 800 MHz slower than the Woodies.
With the Clovertowns, the system was 20% faster than the Woodies (even at 800 MHz slower per core), at almost exactly the same wattage (1 or 2 watts more). This made it 95% faster than the Opterons, still at 80% of the power draw.
You can see the demo at http://www.intel.com/idf/us/fall2006/webcast.htm - look for Gelsinger's keynote the second day.
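The numbers in that demo can be sanity-checked with a little arithmetic: taking the Woodcrest system as 1.6x the Opteron's performance at 0.8x its power, and the Clovertown swap as a further 1.2x speedup at essentially the same wattage, the compounded result is about 1.92x the Opteron (so the "95% faster" claim is in the right ballpark, if a touch generous) and roughly 2.4x its performance per watt. A throwaway sketch of that calculation:

```python
# Relative figures as reported from the IDF demo (Opteron = 1.0 baseline).
opteron_perf, opteron_power = 1.0, 1.0

woodcrest_perf = 1.60            # "about 60% faster"
woodcrest_power = 0.80           # "using 80% of the power"

clovertown_perf = woodcrest_perf * 1.20   # "20% faster than the Woodies"
clovertown_power = woodcrest_power        # "almost exactly the same wattage"

speedup_vs_opteron = clovertown_perf / opteron_perf
perf_per_watt = (clovertown_perf / clovertown_power) / (opteron_perf / opteron_power)

print(f"{speedup_vs_opteron:.2f}x the Opteron's performance")
print(f"{perf_per_watt:.2f}x the Opteron's performance per watt")
```

That's 1.92x and 2.40x respectively; the per-core clock being 800 MHz lower doesn't enter into it, since the quoted speedups are whole-system figures.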
Thunderhawks
Apr 21, 07:24 AM
Wondering why Android users are on a Mac forum?
The discussion of who has the better device is useless.
Whatever works for you is fine. Whatever works for me is fine.
The day something really good comes out on either platform, the media will report it, we will see advertising, and we can read reviews, check things out, and decide what to buy next.
Do I feel GHz or chip envy over standby time, camera megapixels, or app availability?
Couldn't care less, if my device does what I want it to do.
So, Android guys, you have the best device if you decide so.
No need to look at what Apple does. It will come to your device too, just a little later when the copies are ready.
Blue Velvet
Mar 12, 03:46 AM
The main island of Japan, the complete land mass, has moved sideways by eight feet (about 2.5 metres). And the earth, the entire planet, has shifted on its axis by about four inches (10cm)... according to geophysicists reported over at CNN. (http://edition.cnn.com/2011/WORLD/asiapcf/03/12/japan.earthquake.tsunami.earth/index.html)
dgree03
Apr 28, 02:06 PM
By the "real world" you are ignoring the vast majority of users who need nothing like the power of a standard desktop today, and won't need software requiring a decacore processor in 10 years. Power users will always have PCs. The other 90% of humanity will do the majority of their work on tablets.
Software might not need that powerful a processor, but what about the OS? Heck, iTunes stutters on my bro's 2008 MacBook Pro, and that's basic software. Flash can barely run on his computer either.
Liquorpuki
Mar 14, 06:20 PM
I beg to differ: your electricity consumption is shocking too. It's all that AC. We Brits always made do with punkah wallahs. Useful local employment opportunities and saves on polluting the atmosphere, too. You have a ready supply of "illegals" who would jump at the chance.
Then you're probably more shocked at the Canadians, Norwegians, and Swedes, who consume more power per person than Americans do. Iceland consumes twice as much per person as we do. And they don't even use AC.
HecubusPro
Sep 12, 06:25 PM
I am dying to see what this thing looks like. Does anyone have an image of it?
Please?!
http://www.gizmodo.com/assets/resources/2006/09/img3679.jpg
nick9191
Apr 22, 11:44 PM
I disagree.
For a start, atheism (as I see it) is not a belief system. I don't even like to use the term atheist, because it grants religion(s) a much higher status than I think it deserves. The term atheism gives the impression that I have purposefully decided NOT to believe in god or religion.
I have not chosen not to believe in god or god(s). I just have no reason to believe that they exist because I have seen nothing which suggests their existence.
I don't claim to understand how the universe/matter/energy/life came to be, but the ancient Greeks didn't understand lightning. The fact that they didn't understand lightning made Zeus no more real and electricity no less real. The fact that I do not understand abiogenesis (the formation of living matter from non-living matter) does not mean that it is beyond understanding.
The fact that there is much currently beyond the scope of human understanding in no way suggests the existence of god.
In much the same way that one's inability to see through a closed door doesn't suggest that the room beyond is filled with leprechauns.
A lack of information does not arbitrarily suggest the nature of the lacking knowledge. Any speculation which isn't based upon available information is simply meaningless speculation, nothing more.
I don't think atheism is a belief system, but it requires belief. Not believing in a god requires believing there isn't a god. You could say I'm just twisting words there.
I agree on all your points. I just can't bring myself to completely deny the existence of god, not through fear, but through fear... of insulting my own intelligence. We can't prove god exists or doesn't exist, and it seems impossible that we ever will. So I don't deny the existence of god; I do think it's unlikely and illogical, which is why I lean towards atheism (agnostic atheist).
munkery
May 2, 05:41 PM
What is "an installer" but an executable file, and what prevents me from writing "an installer" that does more than just "installing"?
My response: why bother worrying about this when the attacker can do the same thing via shellcode generated in the background by exploiting a running process, so that the user is unaware that code is being executed on the system?
I don't know of any Javascript DOM manipulation that lets you have write/read access to the local filesystem. This is already sandboxed.
The scripting engine in the current Safari is not yet sandboxed.
Here is a list of Javascript vulnerabilities:
http://cve.mitre.org/cgi-bin/cvekey.cgi?keyword=Mac+OS+X+Javascript
The issue is Safari is launching an executable file that sits outside the browser sandbox.
In the current Safari, only some plugins are sandboxed, so this wasn't execution outside the sandbox.
All that having been said, UAC has really leveled the playing field for Windows Vista and 7 (more so in 7, after the usability tweaks Microsoft put in to stop people from disabling it). I see no functional security difference between the OS X authorization scheme and the Windows UAC scheme.
Except this:
Switching off or turning down UAC in Windows equally impacts the strength of MIC (Windows' sandboxing mechanism), because it functions based on inherited permissions. Unix DAC in Mac OS X functions via inherited permissions, but MAC (mandatory access control -> the OS X sandbox) does not. Windows does not have a sandbox like OS X's.
UAC, by default, does not use a unique identifier (password), so it is more susceptible to attacks that rely on spoofing prompts that appear to be unrelated to UAC to steal authentication. If a password is attached to authentication, these spoofed prompts fail to work.
Unix DAC is turned off in OS X in the root user account.
hexonxonx
Jun 13, 06:25 PM
Me too. It's been a lot worse recently. I always said AT&T was fine, but I'm being made to look like a liar. Why are we going in the wrong direction here?
It's gotten a lot better for us since September, when they announced 850MHz or whatever that is. I think I have only had one dropped call in all these months. Our download speeds have also increased to just under 3Mbps. :)
iJohnHenry
Apr 23, 03:54 PM
You don't understand because you can't see the big picture.
You have to step back, in order to see the big picture.
He could be standing in the middle of the Andromeda galaxy, and it would be of no value.
I think ancient Jews thought each day began at dawn and ended at sunset.
So, all biblical days are Solar days?
Perhaps God goes by a much longer passage of time for His days. ;)
KnightWRX
May 2, 04:11 PM
No one is pointing fingers or bickering. I'm responding to your question. The only technical requirement that was satisfied is that the user had "Open "safe" files after downloading" selected. An app installer is not unsafe. Whether the app to be installed is safe or not is another matter, but the installer cannot harm your system or your user files, simply by launching. If you don't want apps... installers or otherwise... to launch after downloading, simply deselect that box.
Wait, the "Open "safe" files" bit was for the ZIP archive, which runs it through Archive Utility. What then auto-executes an installer? You're suggesting Safari somehow knows that the ZIP archive contains an installer, knows that it is indeed an installer, and then executes it.
Do you have any proof of this? I've been trying to get my hands on the ZIP archive itself to inspect it, but no luck, as Google is now swamped with "news" about this thing that just rehashes what you just said.
Basically, the details you provide here are nothing I don't already know about the current situation; I am asking for more here. Not just "deselect that box", but rather what else can be auto-executed and what else is considered "safe".
I don't use Safari, I'm not at risk, but I'd still like to know the details of this.
That's why I say you purposefully ignore my point. My point is: let's dissect and understand this thing, not gloss over it like the current news outlets do; heck, even Intego's description does. That's why I don't like Intego: they just spread FUD without ever explaining anything and mark everything as a "virus" (their VirusBarrier X says VIRUS FOUND! when it finds malware that isn't a virus...).
1. First, the file would need to be considered "safe" to be allowed to auto-download and auto-open, AND the browser would need to be set to allow this.
2. Then, like the case with the installer above, it would need to seek the user's permission to be installed. This, again, requires the complicity of the user, who would still need the administrator's password.
How can anything be considered safe in this scenario? We have a compressed archive and an executable file. Both are rather unsafe, especially the executable file. I don't care that it is an installer; no executable file is safe. What if the "installer" had some payload code that ran on launch, before privilege escalation?
This is what I'm interested in knowing: how is this thing packaged so that it gets auto-executed? You aren't answering my question either. I'm technical enough, I think, that I already understood what you and the Studios guy are "trying to explain to me", but you both fail to understand the underlying question:
Why is this thing auto-executing? I know it's because Safari considers it safe since the user checked the "safe" box; that's in the article. I want to know why an executable file is being launched after a ZIP file was uncompressed, and how Safari knows this is "safe".
Both of you are only repeating the same stuff that's in the media. I want the details, not the media overview. I want the archive itself, if possible. Let's find it, dissect it, understand it. If Apple needs to modify some defaults, let's ask for that.
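In the spirit of "let's dissect it": if anyone does get hold of the sample, a few lines of Python will show what's inside without extracting anything, listing each entry's stored path and whether its Unix mode bits mark it executable (an executable entry inside an auto-opened "safe" ZIP is exactly the suspicious part). The archive name below is obviously a placeholder, not the real sample:

```python
import stat
import zipfile

def dissect(zip_path):
    """Return (name, mode, is_executable) for each entry in the archive."""
    findings = []
    with zipfile.ZipFile(zip_path) as zf:
        for info in zf.infolist():
            # For archives made on Unix-like systems, the file mode lives
            # in the high 16 bits of the external attributes field.
            mode = info.external_attr >> 16
            executable = bool(mode & (stat.S_IXUSR | stat.S_IXGRP | stat.S_IXOTH))
            findings.append((info.filename, oct(mode), executable))
    return findings

# Usage sketch (substitute the actual sample's filename if you find it):
# for name, mode, is_exec in dissect("sample.zip"):
#     print(("EXEC " if is_exec else "     ") + mode.rjust(8) + " " + name)
```

Inspecting the entries this way is read-only, so it's a safer first step than letting anything unpack, let alone run.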
nixd2001
Oct 8, 04:38 PM
Originally posted by jefhatfield
one thing is certain: the Athlon is faster than the Duron, the Pentium 4 is faster than the Celeron, and the G4 is faster (in Photoshop) than the G3... but beyond that, it is hard to get a perfect reading
True, but hardly going to provoke torrents of heated debate and disagreement - surely a necessity in modern society :p
my overclocked 2 cents;)
So that's 2 cents of irrational exuberance, then?
edifyingGerbil
Apr 27, 02:31 PM
You can give a god any attributes you want.
lol...
Look, in philosophy (and, by proxy, theology), definite descriptions are used in debates and arguments. Definite descriptions serve as shorthand to refer to complex ideas so that we do not need to descend into meta-linguistics and logical symbolism, which is quite arcane.
Now with regards to the ontological argument for the existence of God, and the "Problem of Evil" and any other argument propounded by a Christian theologian trying to prove God's existence using reason, the definite description "God" is used as shorthand for:
There is an entity such that this entity possesses certain attributes, which are defined in a certain religious text called the Bible.
The fact that the Judaeo-Christian God is really the chief of the Ugaritic pantheon doesn't matter because the Ugaritic god doesn't have his attributes listed in the Bible, unlike the Judaeo-Christian god.
You can't give the Judaeo-Christian god any attributes you want, otherwise we would have solved the problem of evil long ago. You can in your imagination give any being any attributes you want but its definite description will include "there is a fictional being such that..." etc.
I hope I'm not being condescending. Maybe you know about definite descriptions and I'm preaching to the converted...
supmango
Mar 18, 10:48 AM
+11
The whole "it's MY data, I can do what I want with it!" argument is countered by your perfect analogy with a buffet. I tip my hat to you on that one. If you're at an all-you-can-eat buffet, it doesn't mean you can share your food with your entire family.
I've always believed that unlimited data, on a smartphone, enables you to connect to the internet as much as you want on the device you're contracted to. It's not like home internet where you can share the connection, nor have I ever imagined it would be.
I think that people just like to get "angry at the man" when they don't get things the way they want. AT&T is trying to improve its network; good for them.
If AT&T let you keep your "unlimited" data plan AND add tethering, his analogy would work. As it stands right now, AT&T forces you to downgrade to a capped data plan and add tethering to it, which essentially doubles your data cap to 2 GB.
The analogy is more accurately like a traditional restaurant where you order an entrée that is not "all you can eat". But in this case, they don't allow you to share it with another person, even though you could never possibly eat all of it by yourself (use your existing data allotment). However, they are more than happy to let you buy another entrée. Oh, and you can't take home your leftovers either (rollover). That does a little better job of highlighting exactly how AT&T is being greedy in this scenario.
Bottom line, what people are doing is sticking with unlimited data and tethering (using some other means), and then downloading gigabytes of data, which does affect network performance for other users. That is how AT&T sees it. If you are careful about what you do while "illegally" tethering, and how often you do it, I seriously doubt they will figure it out. They really aren't that put together on this, as anyone who has spoken to "customer service" can attest.
lifeinhd
Apr 12, 10:21 PM
This is what iMovie after iMovie '06 should have been, if only because it has a PROPER FRICKIN' TIMELINE!
Was really hoping for $199, but $299 isn't bad. I might just upgrade from iMovie '06 (I'm not really a 'pro' editor, but I love my timelines!).
thejadedmonkey
Apr 12, 11:43 PM
I was just sitting at work with 3 co-workers today. We were looking at a cut of footage I had from when my organization visited the Capitol. Tweak this... 4 minutes later... good, but try moving that there instead... 4 minutes later...
That alone has me all psyched. This was a brand new i5 machine, too. Got in about a week ago. Being able to save 12 minutes moving a clip back and forth by .08 seconds is a lifesaver.
For what I use FCS for, FCPX looks great! With the price drop, somehow it's less money for a MBP than a comparable PC and Adobe... I'm psyched for a new portable setup come next fall!
mpstrex
Aug 30, 10:32 AM
And for the record, of the 12+ Apples and 3+ iPods I've owned, I've:
1. Donated my 1994 Apple Performa (?), which I got a lot of mileage out of, to a company that fixed it, removed my data for me, and gave the computer to women who were abused.
2. I've sold all my other Apples to new owners who used them for school, work, etc.
3. I have an old PowerBook I sold to my old roommate, whose new roomies dropped it (and his new PC notebook, whoops), and I have it back. I may just sell it to an Apple guru who can repair and use it.
4. My old iPod (Gen 2, 2002) is about to become a special OS X bootable disk; my wife's mini now belongs to her Dad; my other iPod (gen 3 or 4--last black and white one) my wife uses; and I love my iPod video.
No need to throw any of it away, no need to recycle it if others can use it, and I can take the money and buy new Apples or pay some bills, etc.
maccompaq
Nov 10, 02:51 PM
I have the iPhone 3GS, and AT&T has never been able to get its act together with the iPhone, but with the OS upgrades, service seems to keep getting worse.
Do you think the problems will be resolved when/if Verizon gets access to the iPhone (effectively lowering the burden on AT&T, even though they probably still won't be able to keep up)?
It is the fault of AT&T, not the iPhone. Every call I make gets dropped. It makes no difference if I use my iPhone 4 or my LG phone.
iMikeT
Sep 12, 04:14 PM
A sneak peek of a rumored product from Apple? :eek:
markcres
May 2, 10:52 AM
What an amazing coincidence that this is being publicised by Intego... who just happen to sell AV software!