Chupa Chupa
Apr 28, 07:52 AM
Very true. Plus it could be a fad to own the latest toy. We won't know until some time passes. Anything new from Apple gets a lot of attention.
Wait til the newness wears off.
Well, you have a point there. The iPod was called a fad too. It took 8 or 9 years for the novelty to wear off and for fickle consumers to switch to the next fads, the iPhone and iPad. iPad-like devices may be a fad, but they're more likely to die out because they're replaced by a next-generation device than out of boredom. Apple is already showing its cards by melding OS X with hints of iOS.
fleggy
Mar 18, 01:58 PM
When are you all going to realize that this is marketing fluff?
Let me give you a possible scenario...(something to lighten the mood)
AT&T Infrastructure: Wow - these new smart phones use a lot of data. We need to restrict it.
AT&T Marketing: Yes, well, we can't tell customers the restrictions - it will lose us business. I want to tell them it is unlimited!
AT&T Infrastructure: No way...it will kill us - especially with tethering! I'd be happy with it restricted to the smart phone only.
AT&T Legal: We can insert a clause...restricting to this device only...no tethering.
AT&T Marketing: Yes, yes! I can just mention and promote unlimited, and the actual usage can be buried in the ToS. I like it.
AT&T releases "unlimited data for the iPhone" knowing full well that even if your iPhone downloaded 24x7, their network could handle it (although this will never happen in reality).
Everyone flocks to buy it and SIGN UP.
Selecting which part of the service to market IS misleading; however, it is pretty clear - "this device only".
Everything in America is like this. Marketing is a black art form here!! You can't pick and choose which parts of the marketing and ToS you like!
jiggie2g
Jul 12, 04:18 PM
we are not saying conroe is crap it just is not suitable for a mac pro.
My point exactly... Mac snobbery at its finest.
ddtlm
Oct 12, 03:30 PM
Wow I missed a lot by spending all of Friday away from this board. I am way behind in posts here, and I'm sure I'll miss a lot of things worth comment. But anyway, the code fragment:
int x1, x2, x3;
for (x1 = 1; x1 <= 20000; x1++) {
    for (x2 = 1; x2 <= 20000; x2++) {
        x3 = x1 * x2;
    }
}
is a very poor benchmark. Compilers may be able to really dig into that and make the resulting executable perform the calculation radically differently. In fact, I can tell you the answer outright: x1 = 20000, x2 = 20000, x3 = 400000000. It took me 2 seconds or so. Does this mean that I am a better computer than a G4 and a P4? No, it means I realized that the loop can be reduced to simple data assignments. I have a better compiler, that's it.
Anyway, let's pretend that for whatever reason compilers did not simplify that loop AT ALL. Note that this would be a stupid, stupid compiler. At each stage, x1 is something, we do x2++, and we set x3 = x1 * x2. Now notice that we cannot set x3 until the result of x2++ is known. On a pipelined processor that cannot execute instructions out of order, this means that I have a big "bubble" in the pipeline as I wait for the new x2 before I can multiply. However, after the x3 is started into the pipe, the next instruction is just another x2++ which does not depend on x3, so I can do it immediately. On a 7-stage in-order chip like a G4, this means that I fill two stages of the pipe and then have to wait for the results on the other end before I can continue. You see that this is very inefficient (28% or so). However, the G3 is a 4-stage design, so 2/4 of the stages can stay busy, resulting in 50% efficiency (so a 700MHz G3 is "the same as" a 350MHz G3 at 100%, and an 800MHz G4 is "the same as" a 210MHz G4 at 100%). These are of course simplified cases; the actual result may vary a bit for some obscure reason.
Actually, the above stuff is inaccurate. The G3 sports 2 integer units AFAIK, so it can do x3 = x1*x2 at the same time as it is doing x2++ (for the next loop, of course, not this one). This means that both pipes start one bit of work, then wait for it to get out the other end, then do one bit of work again. So this is 25% efficiency. A hypothetical single-pipe G3 would do x3 = x1*x2 and then do x2++; however, it could not do x3 = x1*x2 again until the x2++ was out the other end, which takes 4 cycles and started one cycle after the previous x3 = x1*x2, which should mean 3 "bubble" stages and an efficiency of 20%.
Actually, it may be worse than that. Remember that this is in a loop. The loop means a compare instruction (are we done yet?) followed by a jump depending on the results of the compare. We therefore have 4 instructions per loop in PPC, I think, and we can't compare x2 to 20000 until x2++ has gone through all the pipe stages. (Oh no!) And we can't jump until we know the result of the compare (oh no!). Seeing the pattern? Wanna guess what the efficiency is for a really stupid compiled version of this "benchmark"? A: really freaking low.
I'll see about adding more thoughts later.
Cox Orange
Apr 20, 06:28 AM
as said before...
remove programs
apple + <--
how to easily open a new tab
apple + T
Doctor Q
Mar 18, 03:54 PM
I'm not pleased with this development, because Apple's DRM is necessary to maintain the compromise they made with the record labels and allow the iTunes Music Store to exist in the first place. If the labels gets the jitters about how well Apple is controlling distribution, that threatens a good part of our "supply" of music, even though I wouldn't expect a large percentage of mainstream customers to actually use a program like PyMusique.
Will Apple be able to teach the iTunes Music Store to distinguish the real iTunes client from PyMusique with software changes only on the server side? If not, I imagine that only an iTunes update (which people would have to install) could stop the program from working.
Suppose iTunes is updated to use a new "secret handshake" with the iTunes Music Store in order to stop other clients from spoofing iTunes. Will iTunes have any way to distinguish tunes previously purchased through PyMusique from tunes acquired from other sources, i.e., ripped from CDs? Perhaps the tags identify them as coming from iTMS and iTunes could apply DRM after the fact. Then again, tags can be removed.
Cape Cod Rick
Jul 7, 06:00 AM
I love my new iPhone 4. However, I am dropping many more calls with the iPhone 4 than I did with the iPhone 3G! Yesterday, my phone dropped 3 calls - even when I was holding the phone with only two fingers and away from the bottom!
flopticalcube
Apr 22, 10:58 PM
On other forums, people complain about the word agnostic.
>agnostic theist - I believe in god, but have no knowledge of him.
>agnostic atheist - I don't believe in god, but I don't claim a special source of knowledge for that disbelief.
>gnostic theist - I know there is a god!
>gnostic atheist - I know there is no god, with the same degree of certainty that the theist knows there is one.
I don't think that many would call themselves a gnostic atheist, I certainly don't.
Dawkins might. As I said before, most atheists are agnostic atheists.
MykeHamilton
Apr 28, 08:15 AM
This is because they have continued to put time and money into iOS and not the Mac. They have been lazy and done practically nothing with their desktops and notebooks. They need to start putting emphasis on the Mac now.
skunk
Mar 26, 01:31 PM
relationships built on love in general are less stable, cf. US divorce rate.
Do you have a source for this extraordinary claim?
Mr. Gates
May 2, 03:59 PM
Macs are more vulnerable than people think.
They just have such a lower market share and percentage of users than Microsoft that it's not worth it to write malware and viruses for them.
As Apple and OS X grow, this kind of thing will become more common and Apple will be more at risk.
ffakr
Oct 6, 12:00 AM
I must love punishment because I scanned this whole thread. We need some sort of system to gather the correct info into one location. :-)
Multimedia, you're so far out of the mainstream that your comments make no sense to all but .01% of computer users.
Seriously... most people don't rip 4 videos to H.264 while they are creating 4 disk images and browsing the web.
I work at a wealthy research university; I set up a new Mac every week (and too many PCs). A 1st-gen dual 2.0GHz G5 is plenty fast for nearly all users. I'm still surprised how nicely ours runs considering it's 3 years old. In my experience the dual-cores are more responsive (UI latency), but a slightly faster dual-proc will run intensive tasks faster.
The reality is, a dual-core system... any current dual-core system... is a fantastic machine for 95% of computer users. The Core 2 Duo (Merom) iMacs are extremely fast. The 24" iMac with 2GB of RAM runs nearly everything instantaneously.
The dual dual-core systems are ridiculously fast. I've set up several 2.66GHz models and I had to invent tasks to slow the thing down. Ripping DVD to H.264 does take some time with HandBrake (half playback speed; that's ripping 1 hour of DVD in 30 minutes), but the machine is still very responsive while you're doing that, installing software, and having Mathematica calculate Pi to 100,000 places. During normal use (Office, web, mail, chats...) it's unusual to see any of the CPU cores bump up past 20%.
I'm sure Apple will have 4-core CPUs eventually, but I don't expect it will happen immediately. Maybe they'll have one top-end version, but it'd certainly be a mistake to move the line to all quad cores.
Here's the reality...
- fewer cores running faster will be much better for most people
- there are relatively few tasks that really lend themselves well to massive parallelization. Video and image editing are obvious because there are a number of ways to slice jobs up (render multiple frames, break images into sections, modify in parallel, reassemble...).
- though multimedia is a core Apple market, not everyone runs a full video shop or rendering farm off of one desktop computer. Seriously guys, we don't.
- Games are especially difficult to thread for SMP systems. Even games that do support SMP, like Quake and UT, do it fairly poorly. UT only splits off audio work onto the 2nd CPU. The real-time nature of games means you can't have 7 or 8 independent threads on an 8-core system without running into issues where the game hangs up on a lagging thread. They simply work better in a more serial paradigm.
- The first quad-core chips will be much hotter than current Core 2 chips. Most people, even people who want the power of towers, don't want a desktop machine that actually pulls 600W from the wall because of the two 120-130W CPUs inside. Also, goodbye silent Mac Pros in this config.
- The systems will be far too I/O-bound in an 8-core configuration. The memory system does have lots of bandwidth, but the benchmarks indicate it will be bus- and memory-constrained. It'll certainly be hard to feed data from the SATA drives unless you've got gobs of memory and you're not working on large streams of data (like video).
http://www.tomshardware.com/2006/09/10/four_cores_on_the_rampage/
Finally, Apple is all about perception. Apple has held back CPU releases because they wouldn't let a lower-end CPU clock higher than a higher-end chip. They did it with the PPC 603 & 604, and I think they did it with the G3 & G4.
It's against everything Apple has ever done to have 3.0GHz dual dual-core towers in the mid-range and 2.33GHz quad-core CPUs in the high end.
I see some options here..
Maybe we'll get the dual 2.66 quad cores in one high end system. The price will go up.
Alternately... this could finally be the rumored Mac Station... or Apple has yet to announce a cluster-node version of the Intel Xserve.
Geez.. almost forgot.
For most people, the Core 2 desktop systems bench better than the 4-core systems or even the dual Core 2 Xeon systems, because DDR2 has lower latency than FB-DIMMs. To all the gamers: you don't want slower-clocked quad-core chips, not even on the desktop. You want a speed bump of the Core 2 Duo.
slotcarbob
Feb 23, 02:23 PM
Android is going to do what Windows did. Those who like the Windows experience (read "cheap") are going to go in that direction. Those who want the elegant, minimalistic, rock-solid OS will continue to stay with the iPhone.
One thing I did notice, though, in any numbers comparison: Apple sells one phone, with one OS, and currently with one carrier (a hated one, btw). Android is running on several phones and many carriers. The actual comparison is flawed. Let me suggest this: if one gets a choice of 'Droid or iPhone (from a carrier that offers both), the iPhone will win out, even if the iPhone is a bit more expensive.
On the subject of price, there is a good chance that Apple may be able to undercut others because they could be using their own chips, soon.
Lastly, I have tried both types of phones. Are you kidding me? 'Droid's software is absolutely awful.
xStep
Apr 13, 03:40 AM
You can find some (not great) video of the event here: http://www.youtube.com/user/selfsponsored05
ct2k7
Oct 7, 03:27 PM
What are you guys talking about?
Didn't Adobe just show a new Flash IDE that generates native iPhone Apps ?
edifyingGerbil
Apr 26, 12:32 PM
Christianity, especially Catholicism, has its own colorful (blood-red) history.
As I said elsewhere there is no moral equivalence. It took Augustine's and Aquinas' great rambling treatises to justify warfare, for instance.
In the Qur'an and the Hadith war is encouraged and its virtues extolled.
I wish people would stop trying to equate the wars of Christianity (and of that mainly Western Christianity) with Islam's modern terrorism and calls for warfare against the infidel.
In Islamic law, non-Muslims are considered najis, which means ritually impure, down to our souls, our essences. Christians are reviled especially because they practice "shirk", the joining of others to Allah. Jews are designated as apes and pigs in the Qur'an.
there is no equivalence between Islam and Christianity.
spetznatz
Jul 13, 11:24 AM
[The majority of Mac users use Adobe products] Sad but true and I wish Apple would release something to go up against Photoshop.
Well, you could try this...
http://www.kanzelsberger.com/pixel/?page_id=12
It's still a bit flaky in beta, and the interface is a Windows/Linux clone, but at least it's a Universal Binary! :D
Oh, yeah, and it's only $32 if you buy now.
Now, would I be stirring up a hornets' nest if I asked whether it was too much to hope that the lower-end Pros would have a single Woodcrest and an open socket?
Right, where did I put my tin helmet?....
slinger1968
Nov 2, 08:24 PM
Don't know if you saw this article, I thought I would provide it for your review.
http://reviews.cnet.com/Intel_Core_2_Extreme_QX6700/4505-3086_7-32136314.html?tag=cnetfd.mt
That's the Kentsfield chip, not the Clovertown (Xeon) CPU, but the benchmarks are interesting.
Just as expected, the quad cores are only going to be a big improvement for software that can utilize them. Software will catch up with multicore, hopefully by Q2 '07 when I'll be buying a new machine.
D4F
Apr 28, 09:44 AM
Isn't this misleading? It says 'shipped', not 'sold', so I assume it's basically a bogus report. You can ship all the crappy tablets you want... doesn't mean they sold.
I'm trying to find more on it, but as far as I've read, Apple's data is always on units shipped, including those used as warranty replacements (pretty much they count one as two in this case), for example. Waaay stretched, in my opinion.
rasmasyean
Mar 11, 10:17 PM
Wikipedia seems to be kept up to date. If you have something new, maybe you guys can add it to this...if someone didn't beat you to it. ;)
http://en.wikipedia.org/wiki/2011_Sendai_earthquake_and_tsunami
kupua
Oct 16, 09:00 AM
Ballmer should consider giving a marketing contract to Gartner!
edifyingGerbil
Apr 24, 06:44 PM
You and I have terribly different definitions of ruins, I suppose. I consider a place ruins when it's not even inhabitable.
Well, if you were to look at world history, rather than just look at the world through a religious lens, you'd know the reasons for ongoing conflicts in much of that section of the world. Hint: it tends to do with imperialist powers' tampering.
Also, where is the biggest muslim population in the world? ;)
Most Islamic countries are not inhabitable by homosexuals or religious minorities, your mileage may vary.
The biggest Muslim population right now is Indonesia's, and they tried banning Christians from using "Allah" to describe their God. They're also trying to ban the Ahmadiyah sect...
I don't think France or Britain are responsible for Iran's strict implementation of Islamic law and ruthless persecution of dissidents, and to claim that they are responsible is insulting to Muslims because it implies they're far too reactionary to deal with anything using Reason. Just like people who want to ban qur'an burnings and blasphemy because they're afraid of how muslims might react. Are Muslims animals who are so easily goaded? No, they're human beings so they should be expected to act responsibly and not go on rampages at the slightest provocation.
Red-red
Apr 9, 08:26 PM
Sorry I have such a small brain.
I never said you had a small brain at any point nor did I ever insinuate you were stupid.
Not understanding something doesn't make anyone stupid.
Apple really messed up hiring those 2 guys with years of experience working in the gaming industry. They could have just hired you. A person who has all the answers and can see the future.
They've hired two people who work in PR. Probably for their contacts and influence. Their hiring has little to do with Apple's direction into gaming.
They're already well established and have their direction planned out. All you need to do is connect the dots to see where they're heading.
In all seriousness, I am a gamer and a consumer, and if Apple wants to make gaming a MORE serious part of their business, then I want a controller with buttons and a console, or some way to stream off of the Internet.
You're not getting a controller with buttons. It isn't happening.
You have to look at it not by what "you" want "now". It's typical of tech forums because people find it very hard to distinguish between the two.
Is it what I want? I'm not so sure. It's an interesting concept, and the potential is certainly huge, not only for gaming but for apps.
Whether we'll see fully fledged games like we're used to, or whether it'll continue to be a foray into the casual, is something else we don't really know yet. It's up to developers to create the apps and games that drive the platform and ecosystem forward, but the potential is certainly there, and as more and more people shift toward these devices, the desire for more complex forms of entertainment will increase.
We've only scratched the surface.
WiiDSmoker
Apr 20, 10:04 PM
Also built into the OS: just go to Settings --> Personal Hotspot and flick the switch to on, after heeding the advice that additional charges may apply; consult your carrier.
A file system could be useful, better notifications I can really understand.
"real" multitasking no-one has every been able to define a real world use that suggests that Apple's take on mobile multi-tasking means I'm missing out of function.
I know it's not "real" ie programme has free-rain to do what it pleases in the background. But how is it anymore than a marketing tag for geeks?
Outside of Apple's own apps and music apps, all other applications go into a saved state, i.e. not running in the background.