A lot of people might be confused about the court injunction against Microsoft and the Word 2003/2007 application. Many thought that Microsoft could no longer sell Word at all, but that's untrue; the explanation can be read at InternetNews.com. I have included an excerpt and a link to the article.
“We expect to have copies of Microsoft Word 2007 and Office 2007, with this feature removed, available for U.S. sale and distribution by the injunction date,” Kevin Kutz, Microsoft director of public affairs, said in a statement Tuesday. (Office 2010, which is due for release in June, does not contain the offending code.)
When asked why the company is touting a two-month-old update, however, Microsoft clammed up.
“This is the only comment we have to offer at this time,” a second Microsoft spokesperson said in an e-mail to InternetNews.com.
On Microsoft’s OEM Partner Center site, the update is labeled as “required.”
“After this patch is installed, Word will no longer read the Custom XML elements contained within DOCX, DOCM, or XML files. These files will continue to open, but any Custom XML elements will be removed,” the update’s description says.
Check out the full story here.
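As a rough illustration of what "removing Custom XML elements" means in practice, here's a toy sketch: it unwraps any element that isn't in the WordprocessingML namespace while keeping its children, so the document still opens but the custom markup is gone. The namespaces and helper below are my own illustration, not Microsoft's actual patch.

```python
# Toy sketch of stripping Custom XML elements from WordprocessingML.
# NOT Microsoft's patch -- just an illustration of the described behavior.
import xml.etree.ElementTree as ET

W = "http://schemas.openxmlformats.org/wordprocessingml/2006/main"

def strip_custom(parent):
    """Unwrap any element outside the w: namespace, hoisting its children
    into its place (simplified: any text directly inside the custom
    element is dropped)."""
    i = 0
    while i < len(parent):
        child = parent[i]
        strip_custom(child)
        if child.tag.startswith("{" + W + "}"):
            i += 1  # regular Word element: keep it and move on
        else:
            parent.remove(child)
            for j, grand in enumerate(list(child)):
                parent.insert(i + j, grand)

doc = ET.fromstring(
    '<w:body xmlns:w="%s" xmlns:c="urn:custom">'
    '<c:invoice><w:p><w:t>hello</w:t></w:p></c:invoice>'
    '</w:body>' % W
)
strip_custom(doc)
print(ET.tostring(doc, encoding="unicode"))  # custom tag gone, text kept
```

The custom `<c:invoice>` wrapper disappears, but the `w:p`/`w:t` content it contained survives, which matches the update's "files will continue to open, but any Custom XML elements will be removed" description.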
Well, I spent the better part of the night fighting with my BlackBerry, thinking the issue was on my end. After speaking with fellow engineers, we did some quick Google searches and stumbled upon Engadget's website. What have we here? People saying that their data services are down as of about 4:30 PM today. Right now it's 9:32 CST, and I still cannot use BlackBerry Messenger or Google Maps, and Hotmail/BES isn't forwarding messages to my phone. This does not look good for RIM; a lot of people depend on their phones as their sole point of communication. Hopefully they fix this by morning.
We all know what it's for, and most of us don't care. Sure, let a PC go to sleep, but hibernation? I don't need it. Here is a quick and easy way to get rid of the 3-4GB hiberfil.sys file.
- Go to the Start menu, type "cmd", and open Command Prompt [make sure you run it as an Administrator]
- Type "powercfg.exe -h off"
- Type “exit”
- Restart your PC and you’ll be good to go!
After restarting, the file should be gone; defragment your hard disk to reclaim the unused space.
The Microsoft Exchange Remote Connectivity Analyzer lets you run all sorts of tests against your Exchange Server implementation. Released alongside Microsoft Exchange 2010, it is one of the best utilities I have seen for Exchange so far.
Say someone is having issues connecting via RPC over HTTPS: run the check, and if it finds an error, it will tell you how to fix it!
The same goes for outgoing mail: you enter the external IP of the Exchange server that SMTP is sent from and an email address within the domain, then run a quick check; it will flag any errors and suggest fixes. It helped me fix an issue for a client just today.
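To get a feel for what the analyzer's outbound-mail test starts with, here's a minimal sketch of its very first step: connect to the server on port 25 and confirm it greets you with an SMTP 220 banner. The host name is a placeholder, and the real tool goes far beyond this (MX lookups, reverse DNS, open-relay checks, and so on).

```python
# Minimal sketch of the first step of an outbound-SMTP check:
# connect and verify the 220 greeting. A placeholder, not the real analyzer.
import socket

def smtp_banner_ok(banner: str) -> bool:
    """An SMTP server must greet with a 220 reply code."""
    return banner.startswith("220")

def check_smtp(host: str, port: int = 25, timeout: float = 10.0) -> bool:
    """Connect to host:port, read the greeting, and sanity-check it."""
    with socket.create_connection((host, port), timeout=timeout) as sock:
        banner = sock.recv(512).decode("ascii", errors="replace")
    return smtp_banner_ok(banner)

if __name__ == "__main__":
    # Real usage (hypothetical host): check_smtp("mail.example.com")
    print(smtp_banner_ok("220 mail.example.com ESMTP ready"))  # True
```

If even this step fails, nothing downstream (mail flow, relaying, certificates) can work, which is why tools like the analyzer run it first.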
This is the URL: https://www.testexchangeconnectivity.com
I wanted to leave it as-is, since it's from Microsoft and you need to remember the URL 🙂
There is currently no third-party Exchange-aware backup software, but using Windows Server Backup you are able to back up the data store and flush the logs. Over at Exchange.org, Henrik explains how to use Windows Server Backup to perform backups and restores using the new Exchange volume snapshot plug-in included with Exchange 2010. I have included an excerpt to pique your interest.
But what about Exchange 2010? Will a similar plug-in be available with this Exchange Server version, which is nearing RTM? The answer is yes. And in fact this is exactly what I am going to cover in this article. So sit back and grab a nice cup of fresh coffee and carry on reading.
Check out the article here.
The guardians of the SD memory card specification announced version 3.0 of the technology less than a year ago, but that hasn’t stopped them talking up version 4.0 already.
SD 4.0 cards will have ten pins, one of which will handle the serialised data stream that runs at 300MB/s. The remainder will operate in parallel to deliver backwards compatibility. The size and shape of the card will not change.
Wow, with up to a 2TB maximum size and 300MB/s transfers, we're talking about a huge step in technology. Toshiba had the first 64GB SDXC card (60MB/s read, 35MB/s write); check out the story here. Once SDXC devices start popping up around the globe, we'll be looking at more storage, better-quality video, and bigger and better pictures from cameras; the possibilities are endless. Need more storage for that laptop? Does it have an SDXC slot? Just pop in an extra 128GB card (they don't exist at the moment) and you're good to go.
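To put that 300MB/s figure in perspective, a little back-of-the-envelope math (using decimal megabytes and ignoring real-world overhead):

```python
# Back-of-the-envelope math on the SD 4.0 bus speed quoted above.
def transfer_seconds(size_gb: float, rate_mb_s: float = 300.0) -> float:
    """Seconds to move size_gb gigabytes at rate_mb_s megabytes/second."""
    return size_gb * 1000 / rate_mb_s

print(round(transfer_seconds(64)))              # ~213 s to fill a 64GB card
print(round(transfer_seconds(2000) / 3600, 1))  # ~1.9 h to read a full 2TB card
```

So even at the spec's theoretical maximum, a full 2TB card would take almost two hours to read end to end; actual cards (like that 60MB/s Toshiba) are far slower still.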
I myself might hold onto my 2GB cards for now, until these cards become mainstream. As each new size gets announced, the price drops, and that's good for everyone!
Check out SDcard.org for the SD 3.0 Specs.
I was rummaging around one of my clients' servers when I noticed multiple host records pointing to the same IP address. This can get confusing if left alone long enough, so Microsoft provides DNS Scavenging to help clean up that mess. With scavenging enabled, when a record reaches a certain age it gets purged from DNS. If a good DNS entry gets purged (say someone went on vacation for a couple of weeks), then the next time that person logs into the domain, their DNS entry will be re-registered. Below are a few tips about DNS Scavenging.
- Verify that DNS Scavenging is enabled in the server Advanced properties. See KB article 932464 (Server 2003) or this link (Server 2008)
- Verify that the zone in question has scavenging/aging enabled.
- Verify that the record(s) have a timestamp. In the DNS MMC, select View\Advanced, then right-click the record and select Properties.
- Record time stamp must be older than the combination of the No-refresh + Refresh intervals to be subject to scavenging. Be aware that automatic scavenging of the zone will not occur until the DNS Server service has been running for a period of time equal to the Refresh Interval set on the zone.
- To initiate a scavenge manually, in the DNS MMC, right-click on the DNS server and select “Scavenge stale resource records”.
- If no one updates a record within the No-refresh + Refresh intervals, the record will be marked as stale and removed from the DNS MMC, but it will still exist under the MicrosoftDNS container. The “dNSTombstone” attribute changes to “True” when the record becomes stale.
- If a large number of records do not have a timestamp and need one set (to be subject to scavenging), the dnscmd utility can be used to accomplish this. Note: using this utility to force the aging of all records in a zone will cause records for hosts that are not dynamically updated to eventually be scavenged from the zone. USE THIS WITH CAUTION: the /AgeAllRecords switch affects all records in the zone, even manually added records.
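To make the No-refresh + Refresh rule above concrete, here's a small sketch. The 7-day values are the Windows DNS defaults; substitute your zone's actual intervals, and remember the server must also have been running for a full Refresh interval before automatic scavenging kicks in.

```python
# Rough model of the scavenging eligibility rule described above.
# 7-day intervals are the Windows DNS defaults; yours may differ.
from datetime import datetime, timedelta

def is_scavengeable(timestamp, now, no_refresh, refresh):
    """A record is eligible for scavenging once its timestamp is older
    than the No-refresh and Refresh intervals combined."""
    return now - timestamp > no_refresh + refresh

no_refresh = timedelta(days=7)
refresh = timedelta(days=7)

now = datetime(2009, 12, 1)
fresh = datetime(2009, 11, 25)  # 6 days old  -> kept
stale = datetime(2009, 11, 1)   # 30 days old -> purged

print(is_scavengeable(fresh, now, no_refresh, refresh))  # False
print(is_scavengeable(stale, now, no_refresh, refresh))  # True
```

This is also why the vacation scenario works out: two weeks away is right at the 14-day default boundary, and even if the record is purged, it is re-registered at the next logon.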
Okay, so vSphere 4 Update 1 was released late last month, and some people went ahead and downloaded the ISO, not knowing that the vSphere Host Update Utility would fail to recognize the ISO as usable media. The simple answer is to update the vSphere Host Update Utility to its current version. Once that is done, you can use the vSphere 4.0 Update 1 ISO: put the ESX host into Maintenance Mode and update the host from 3.x and up. Good luck.
Hey everyone, I was working on an Exchange 2003 to 2010 migration and ran into an error similar to this…
Microsoft.Exchange.MailboxReplicationService.MailboxReplicationTransientException: Exception details: MailboxReplicationTransientException (80040111): Mailbox database ‘a0dd10be-fa72-47bd-afb4-0bf3ce7175e’ is offline
I went to Exchange System Manager on the Exchange 2003 server, right-clicked First Storage Group\Mailbox Store, opened the Security tab, then added a computer object (EXG01) and gave it Full Control. I was then able to use Exchange Server 2010's “New Local Move Request” to move the mailbox to the Exchange 2010 database…
How did I figure this out? I found this link on Social.Technet.Microsoft.com and, based on other users' posts, worked out the solution. I also posted the answer there 🙂
I was then able to log in to Outlook Web App and check my mail…
Hope this helps out some people!
Intel's new processor uses Atom-based cores; put 48 of them together and they become one powerful collective.
Pushing several steps farther in the multicore direction, Intel on Wednesday demonstrated a fully programmable 48-core processor it thinks will pave the way for massive data computers powerful enough to do more of what humans can.
The processor was even shown booting Windows and Linux. Check out the full story here.