Licensing for SQL Server 2012

http://www.mssqltips.com/sqlservertip/2942/understanding-the-sql-server-2012-licensing-model/

http://sqlmag.com/sql-server-2012/sql-server-2012-editions

Editions and Components of SQL Server 2012

http://technet.microsoft.com/en-us/library/ms144275.aspx

Compute Capacity Limits by Edition of SQL Server

http://technet.microsoft.com/en-us/library/ms143760.aspx

Features Supported by the Editions of SQL Server 2012

http://technet.microsoft.com/en-us/library/cc645993.aspx


What is IOPS and how to calculate IOPS for an application (Exchange, SQL, SharePoint, etc.)

What is it and how to calculate IOPS

This question comes up frequently, especially in my work as a solutions consultant at Dell, a hardware manufacturer we all know.

What are IOPS?

IOPS is the number of I/O operations per second that an individual disk can deliver. For example, a 10K SAS disk achieves on average 140 IOPS.

These speeds are fairly standard across the industry, with variations between models, so we have a baseline for what is acceptable, and the disk manufacturer can tell you the exact number for a given model.

Note, however, that the differences can be very large, especially once the new SSD drives are taken into account. For example, the Intel X25-E disk (see the datasheet at http://download.intel.com/design/flash/nand/extreme/extreme-sata-ssd-datasheet.pdf) reaches numbers 30 times higher than SAS and SATA disks.

 

[Table: typical IOPS by disk model]

Why is IOPS so important?

The question sounds obvious, but the explanation may not be so simple. In most cases we tend to minimize the issue by calling it “performance” or “user perception,” but in fact IOPS can directly impact how an application runs, and in some cases can even bring the application to a halt.

For example, an Exchange 2003 environment with 2,000 mailboxes needs about 1,500 IOPS, and that number is not easy to reach: at roughly 140 IOPS per 10K SAS disk, that is the raw throughput of about 11 disks before any RAID penalty. The SQL Server database behind a SharePoint farm can need 5,000 IOPS to work well.

How do you calculate IOPS?

Multiply the total number of disks by the per-disk IOPS, adjusting for the RAID type, and you get your number. Here are some examples:

[Table: example IOPS totals by disk count and RAID level]

RAID 0, RAID 1, and RAID 10 give you the largest proportional number of IOPS possible; with RAID 5 the calculation discounts one disk for parity, and with RAID 50 it discounts two parity disks.
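This rule of thumb is easy to put into a small PowerShell function. A minimal sketch, assuming the simplistic model above (the function name Get-RaidIops and the 140 IOPS default for a 10K SAS disk are illustrative; real arrays also pay RAID write penalties that this model deliberately ignores):

function Get-RaidIops {
    param(
        [int]$DiskCount,
        [int]$IopsPerDisk = 140,   # average for a 10K SAS disk, per the text above
        [ValidateSet('RAID0','RAID1','RAID10','RAID5','RAID50')]
        [string]$RaidLevel
    )
    switch ($RaidLevel) {
        'RAID5'  { ($DiskCount - 1) * $IopsPerDisk }   # discount one disk for parity
        'RAID50' { ($DiskCount - 2) * $IopsPerDisk }   # discount two parity disks
        default  { $DiskCount * $IopsPerDisk }         # RAID 0/1/10: full proportional IOPS
    }
}

# Example: eight 10K SAS disks
Get-RaidIops -DiskCount 8 -RaidLevel RAID10   # 1120
Get-RaidIops -DiskCount 8 -RaidLevel RAID5    # 980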

How to achieve the highest IOPS possible with greater capacity?

We have three ways to do this:

  1. Use high-performance disks such as 15K SAS or SSD; however, they are expensive, and in the case of SSD they only come in 32/50/64/100 GB sizes
  2. Use the RAID type appropriate for the performance you need and not just for the capacity you want, as many do today. That often means using RAID 10 for maximum performance rather than RAID 50: we lose capacity, but we gain performance
  3. Buy a storage array that works with virtual LUNs, i.e. one that allocates data on the disks according to how that data is used, without you needing to specify the RAID type

What are virtual LUNs?

Let’s not get too technical, since this subject is far more complex, but we can understand what this new technology is without becoming storage experts.

Using Dell arrays as an example, the MD3200i works with LUNs in the usual way we know: you indicate that disks X through Y form a RAID 0, Z through W a RAID 5, and so on. In other words, we map LUNs directly onto disks and depend on each disk’s individual I/O capability.

With the EqualLogic series, on the other hand, we can define the size of the LUN without indicating the disks, and the storage will automatically place the most frequently accessed data on the faster disks. You may be thinking this is a joke or just a “concept,” but it is not!

The new arrays sold by Dell, EMC, IBM and others are smart and let you mix disk types. For example, I can put SSD disks in one storage drawer and 24 15K SAS disks in an additional drawer, and not worry about whether a LUN was created on the higher-performing disks; the storage does that job.

Even more interesting, when the storage “perceives” that certain data (a LUN) is accessed more than another, it relocates that data to faster disks, and it makes the shift without performance loss or manual intervention, since it works automatically in the background.

Interesting references

How to calculate IOPS for Exchange 2003 http://technet.microsoft.com/en-us/library/bb125019(EXCHG.65).aspx

How to calculate IOPS for Exchange 2010 http://technet.microsoft.com/en-us/library/ee832791.aspx

How to calculate IOPS for the SharePoint 2010 and SQL http://technet.microsoft.com/en-us/library/cc298801.aspx

Utility to measure IOPS for SQL Server (SQLIO) http://www.microsoft.com/download/en/details.aspx?displaylang=br&id=20163

References of EqualLogic S6000 http://www.equallogic.com/products/default.aspx?id=9511

This Blog was copied from:
http://msincic.wordpress.com/2011/07/03/what-is-it-and-how-to-calculate-iops-exchange-sql-sharepoint-others/

Practical tips on simplifying GPOs and OU organization


Summary: Deep paths within Active Directory can complicate Organizational Units and Group Policy Objects organization. IT pro Rick Vanover shares his approach to managing the complexity of GPOs.

One of the most powerful centralized administration tasks for Windows Servers and PCs is deploying Group Policy Objects (GPOs). So much so, in fact, that I could argue Group Policy is one of the best solutions Microsoft has ever provided.

While I’m very fond of GPOs and their flexibility to configure user and computer settings centrally, we can easily get out of control with conflicting rules and overly complicated implementations. I’m sure we’ve all seen a domain that has a very ugly configuration of GPOs, and let’s not even get started on the security groups.

In my Active Directory practice, I go back and forth in determining how deep the GPOs and Organizational Units (OUs) should go. I frequently don’t do more than three GPOs flowing in series with the OUs.

By series I mean one GPO in a parent OU and another GPO in a child OU, like Figure A where the green GPO applies to the parent OU and the red GPO applies to the child OU (as well as the green GPO).

Figure A


OUs are great for granular classification of various Active Directory objects, and I don’t have a big issue with going fairly deep (within reason) in terms of OU levels. GPOs, on the other hand, are not good candidates for being applied at every OU level as the tree goes deeper.

It is too complicated to keep the configuration rules in mind for planning and quick thinking. To help simplify how GPOs are organized, here are some tips:

  • Leverage GPO filtering by security group to make more GPOs at a higher OU instead of more GPOs in deeper OUs
  • Never add individual users or computer accounts (always use the group trick above)
  • Combine user and computer settings by role, rather than separate GPOs
  • Self-document the names of the GPOs to be intuitive to the role and location
  • Use a consistent GPO nomenclature, including renaming existing GPOs to get there (see the audit sketch after this list)
  • Scour around for GPOs that carry only one setting and consolidate them into other GPOs
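For the nomenclature tip, here is a hedged audit sketch using the GroupPolicy PowerShell module (available on Windows Server 2008 R2 or with RSAT; the naming pattern is an illustrative assumption, not a standard):

# List GPOs whose display names do not follow the naming convention
Import-Module GroupPolicy

$convention = '^(User|Computer|Both)-'   # illustrative pattern: role prefix
Get-GPO -All |
    Where-Object { $_.DisplayName -notmatch $convention } |
    Sort-Object DisplayName |
    Select-Object DisplayName, CreationTime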

GPOs are great, but the tools require organization and thought. These tips are general guidelines, and any of you keeping score will note that my screenshot from my personal lab is not exactly following all of these recommendations. It’s fine for a lab, but in production, that’s a different story.

Rick Vanover is an IT infrastructure manager for a financial services organization in Columbus, Ohio. He has years of IT experience and focuses on virtualization, Windows-based server administration and system hardware.

Also read:

 

Configure server, desktop, laptop power options with Group Policy

Disable Server Manager and Initial Configuration Tasks via Group Policy

VHDX

With Windows Server 2012, Microsoft released a new virtual disk format called VHDX. VHDX improves on the VHD format in a number of ways.

Back in October I wrote a blog post on the improvements of the VHDX format in the Windows Server 8 Developer Preview. Back then VHDX supported a maximum size of 16TB; with the release of the Windows Server 8 Beta (the Windows Server 2012 beta), the new maximum size changed to 64TB.

Some of the VHDX improvements:

  • Support for sizes up to 64TB
  • Support for larger block sizes
  • Improved performance
  • Improved corruption resistance
  • The ability to store custom metadata

You can download the VHDX Format Specification.

To use these new features you have to convert your existing VHDs into the new VHDX format. You can do this in two different ways: with the Hyper-V Manager or with Windows PowerShell.

Convert VHD to VHDX via Windows PowerShell

To convert a VHD to a VHDX with Windows PowerShell, you can simply use this command (Convert-VHD infers the target format from the destination file’s extension):

Convert-VHD -Path TestVHD.vhd -DestinationPath C:\temp\VHDs\TestVHDX.vhdx -DeleteSource

Of course you can convert the VHDX back to a VHD using the following command:

Convert-VHD -Path TestVHDX.vhdx -DestinationPath C:\temp\VHDs\TestVHD.vhd -DeleteSource
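If you have a whole folder of VHDs, a short loop converts them in one go (a sketch with illustrative paths; the disks must not be attached to a running VM):

# Convert every VHD in a folder to a VHDX with the same base name
Get-ChildItem C:\temp\VHDs\*.vhd | ForEach-Object {
    $target = [System.IO.Path]::ChangeExtension($_.FullName, 'vhdx')
    Convert-VHD -Path $_.FullName -DestinationPath $target
}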


Convert VHD to VHDX via Hyper-V Manager

  1. Start the Hyper-V Manager and click on “Edit Disk…”
  2. Now select the VHD you want to convert
  3. Select “Convert”
  4. Select the target format, in this case VHDX
  5. Select the new location for your new VHDX
  6. Check the summary and click Finish

Just as with the PowerShell command, you can also convert a VHDX back to a VHD, but you have to make sure that the VHDX is not bigger than 2TB, the VHD format’s limit.

Aviraj Ajgekar already did a post on his TechNet blog about how you can convert a VHD to a VHDX via Hyper-V Manager.

Export-CSV -Append #powershell

Export-CSV -Append

 

Here’s the solution for those who need to append rows to existing CSV files (and cannot do that with the native cmdlet): I have just used PowerShell 2.0 code snippets to create a proxy cmdlet – a function which wraps the standard Export-CSV cmdlet but adds handling of the -Append parameter.

So you can do something like:

Get-Process | Export-Csv -Path 'c:\Temp\processes.csv' -Append -Delimiter ';'

As you can see, other parameters – such as -Delimiter – still function as well. If the file does not exist, the cmdlet will essentially ignore -Append and create the file as normal. If you specify -Append and the file is present, the function will turn the objects into CSV strings, remove the first row with the property names, and append to the existing file.
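The heart of that append path is a pipeline like this one (a standalone sketch of the same idea, using Select-Object -Skip rather than the ForEach-Object trick the function itself uses; the path is illustrative):

# Append without the header row: convert to CSV strings, skip line 1, append
Get-Process |
    ConvertTo-Csv -NoTypeInformation |
    Select-Object -Skip 1 |
    Out-File -FilePath 'c:\Temp\processes.csv' -Append -Encoding ASCII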

For your convenience, I have posted the source code to PoshCode. Here it is as well for those interested:

#Requires -Version 2.0

<#
  This Export-CSV behaves exactly like native Export-CSV
  However it has one optional switch -Append
  Which lets you append new data to existing CSV file: e.g.
  Get-Process | Select ProcessName, CPU | Export-CSV processes.csv -Append
  
  For details, see

http://dmitrysotnikov.wordpress.com/2010/01/19/export-csv-append/

  (c) Dmitry Sotnikov  
#>

function Export-CSV {
[CmdletBinding(DefaultParameterSetName='Delimiter',
  SupportsShouldProcess=$true, ConfirmImpact='Medium')]
param(
 [Parameter(Mandatory=$true, ValueFromPipeline=$true,
           ValueFromPipelineByPropertyName=$true)]
 [System.Management.Automation.PSObject]
 ${InputObject},

 [Parameter(Mandatory=$true, Position=0)]
 [Alias('PSPath')]
 [System.String]
 ${Path},
 
 #region -Append (added by Dmitry Sotnikov)
 [Switch]
 ${Append},
 #endregion 

 [Switch]
 ${Force},

 [Switch]
 ${NoClobber},

 [ValidateSet('Unicode','UTF7','UTF8','ASCII','UTF32',
                  'BigEndianUnicode','Default','OEM')]
 [System.String]
 ${Encoding},

 [Parameter(ParameterSetName='Delimiter', Position=1)]
 [ValidateNotNull()]
 [System.Char]
 ${Delimiter},

 [Parameter(ParameterSetName='UseCulture')]
 [Switch]
 ${UseCulture},

 [Alias('NTI')]
 [Switch]
 ${NoTypeInformation})

begin
{
 # This variable will tell us whether we actually need to append
 # to existing file
 $AppendMode = $false
 
 try {
  $outBuffer = $null
  if ($PSBoundParameters.TryGetValue('OutBuffer', [ref]$outBuffer))
  {
      $PSBoundParameters['OutBuffer'] = 1
  }
  $wrappedCmd = $ExecutionContext.InvokeCommand.GetCommand('Export-Csv',
    [System.Management.Automation.CommandTypes]::Cmdlet)
        
        
 #String variable to become the target command line
 $scriptCmdPipeline = ''

 # Add new parameter handling
 #region Dmitry: Process and remove the Append parameter if it is present
 if ($Append) {
  
  $PSBoundParameters.Remove('Append') | Out-Null
    
  if ($Path) {
   if (Test-Path $Path) {        
    # Need to construct new command line
    $AppendMode = $true
    
    if ($Encoding.Length -eq 0) {
     # ASCII is default encoding for Export-CSV
     $Encoding = 'ASCII'
    }
    
    # For Append we use ConvertTo-CSV instead of Export
    $scriptCmdPipeline += 'ConvertTo-Csv -NoTypeInformation '
    
     # Inherit other CSV conversion parameters
    if ( $UseCulture ) {
     $scriptCmdPipeline += ' -UseCulture '
    }
    if ( $Delimiter ) {
     $scriptCmdPipeline += " -Delimiter '$Delimiter' "
    } 
    
    # Skip the first line (the one with the property names) 
    $scriptCmdPipeline += ' | Foreach-Object {$start=$true}'
    $scriptCmdPipeline += '{if ($start) {$start=$false} else {$_}} '
    
    # Add file output
    $scriptCmdPipeline += " | Out-File -FilePath '$Path'"
    $scriptCmdPipeline += " -Encoding '$Encoding' -Append "
    
    if ($Force) {
     $scriptCmdPipeline += ' -Force'
    }

    if ($NoClobber) {
     $scriptCmdPipeline += ' -NoClobber'
    }   
   }
  }
 } 
  

  
 $scriptCmd = {& $wrappedCmd @PSBoundParameters }
 
 if ( $AppendMode ) {
  # redefine command line
  $scriptCmd = $ExecutionContext.InvokeCommand.NewScriptBlock(
      $scriptCmdPipeline
    )
 } else {
  # execute Export-CSV as we got it because
  # either -Append is missing or file does not exist
  $scriptCmd = $ExecutionContext.InvokeCommand.NewScriptBlock(
      [string]$scriptCmd
    )
 }

 # standard pipeline initialization
 $steppablePipeline = $scriptCmd.GetSteppablePipeline(
        $myInvocation.CommandOrigin)
 $steppablePipeline.Begin($PSCmdlet)
 
 } catch {
   throw
 }
    
}

process
{
  try {
      $steppablePipeline.Process($_)
  } catch {
      throw
  }
}

end
{
  try {
      $steppablePipeline.End()
  } catch {
      throw
  }
}
<#

.ForwardHelpTargetName Export-Csv
.ForwardHelpCategory Cmdlet

#>

}

Hope this helps! ;)

 

 

20 Responses to “Export-CSV -Append”

  1. Jason Archer, January 19, 2010 at 10:25 pm

    Reader beware, you can have some problems if you append a different kind of object (or number of columns) than what is already in the file.

    • Dmitry Sotnikov, January 20, 2010 at 7:02 am

      Yes, good point Jason! In my code I am NOT checking whether the property names are the same. Or that the delimiter is the same. Or that the encoding is the same. And so on. It is up to the user to make sure that the appended data matches the original.

      • Tim, August 4, 2011 at 2:50 pm

        How exactly would one go about formatting data to match the original? For example:

        Data in csv is from SMO dataFile object containing server, dbname, size, usedSpace, freespace.

        I want to append a TOTAL row for each instance after all of the databases have been appended. I have tried building my own object/string and appending it to the csv, but I can’t seem to get it right. Any help would be much appreciated.

        Here is what I have tried:
        $totalString = "@{Server=$servername; dbname=$total; Size=$totalsize; UsedSpace=$totalUsed; FreeSpace=$totalFree}";
        (above, with and without, @, {}, and quotes)

        $totalString = @"
        Server : $servername
        dbname : $total
        Size : $totalsize
        UsedSpace : $totalUsed
        FreeSpace : $totalFree
        "@

        $totalstring | export-csv c:\test.csv -append;

        Thanks,
        TIm

  2. John Huber, February 4, 2011 at 4:49 pm

    Thanks, I am an admin, who is not a good coder. This is a great help for me and makes my life a lot easier.

    Many Thanks

  3. Alvin Magpayo, April 7, 2011 at 11:07 pm

    Hi,
    I’m quite new to PowerShell, and this post regarding Export-CSV with append is very helpful for completing my automated script reports.

    I just don’t know how to use it. What do you mean with “Add this function to your profile to make Export-CSV cmdlet handle -Append parameter” in the poshcode.org?

    I just need to make it work and use the -Append of Export-CSV and time is killing me due to my deadline.

    Thanks in advance.

  4. Alvin Magpayo, April 9, 2011 at 8:09 am

    Thanks. It worked! I was able to do it and had the -Append on Export-CSV.

    One more question though, What if I want to make sure that this profile will be used all the time? My script will run as a scheduled job and placed on different server.

    Is there a way that I can call the same profile just to make sure that the code will be executed, or is it possible that I can include that at the top of the code as one of the validation process?

    Thanks for the help. :D

    Alvin.

  5. Alvin Magpayo, April 9, 2011 at 12:20 pm

    ei,
    I got it working. Thanks. Appreciate the help. :)

    Alvin

  6. Jamie, May 5, 2011 at 9:33 pm

    I am having trouble using this script. When I use the command, I get an error saying “The variable ‘$UseCulture’ cannot be retrieved because it has not been set”. The error references line 96 and char 21. Please help.

    Thanks,
    Jamie

    • Dmitry Sotnikov, May 5, 2011 at 11:02 pm

      Jamie,

      Looks like you have strict mode on, and this somehow does not like the parameter declarations of the function.

      One easy workaround that I see is to simply add Set-StrictMode -Off to the beginning of the script… If this is unacceptable, you’ll need to figure out how to make the PowerShell parser consider the parameters OK to use in the script – let me know if you manage to do that.

      Dmitry

  7. Chad Miller, May 21, 2011 at 2:10 am

    Dmitry,

    I’m wondering if you can help me understand this piece of code from your script (modified slightly to illustrate an example):

    get-psdrive | foreach-object {$start=$true} {if ($start) {$start=$false} else {$_}}

    I understand the point–skip first row and it works perfectly. I’ve seen scriptblocks before, but this one looks odd.

    The first block is the foreach-object block and the second block, well I don’t understand how the pipeline is able to continue to pipe objects.

  8. Chad Miller, May 21, 2011 at 8:11 pm

    My previous question was answered via Twitter by Bartek

    “Looks like -begin and -process blocks. So the {$start = $true} is run only once, before obj arrive”

    Sure enough get-help foreach-object mentions these params, but the documentation incorrectly states they are named params.
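    A minimal demonstration of that positional binding (safe to paste into any PowerShell 2.0+ session; the strings are illustrative):

    # Two positional scriptblocks: the first binds to -Begin, the second to -Process
    1..3 | ForEach-Object { 'begin: runs once' } { "process: $_" }
    # Output: one "begin" line, then one "process" line per pipeline object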

  9. JCochran, December 8, 2011 at 9:12 pm

    Am I missing something? While I greatly appreciate this code, and I could not code myself out of a brown paper bag, isn’t this an excessive amount of code to simply append a file? I mean that with all due respect and I am more than likely missing something, but this seems counter to the simplicity of the intended nature of PowerShell. I am one person that is finding PowerShell to be powerful, but not all that simple. I rarely have time to code and read and it seems like a significant amount of time is required to learn PowerShell to this degree. Maybe I’m dumb? I dunno… :p

  10. Daniel Rymer, January 25, 2012 at 2:27 pm

    Code works great. I tried writing my own using 2 additional temp files, which was far shorter, but kept ending up with blank lines I couldn’t seem to remove. I used this code and the file came out perfect! Thanks!

PowerShell script in a .bat file


How do you put PowerShell code into a batch/cmd file without having to also have a separate .ps1 file? (If you can have an external .ps1 file, you just invoke powershell.exe and supply the path to the script file as a parameter.)

I got this question recently from one of our field guys and thought I would blog about the trick publicly.

The problem is that PowerShell syntax can have elements that .bat files cannot handle, and that you cannot pass a multiline script as a powershell.exe parameter.

There are actually a couple of ways to do so:

1. Encode the script:

As Bruce points out here, PowerShell has the -EncodedCommand parameter, which lets you pass any PowerShell code as a base-64-encoded string.

So if you have some sort of script, like this:

#iterate numbers 1 through 10
1..10 | foreach-object {
# just output them
"Current output:"
$_
}

You simply (in the PowerShell command line or a script) put it in curly brackets and assign it (as a scriptblock) to a variable:

$code = {
    #iterate numbers 1 through 10
    1..10 | foreach-object {
    # just output them
    "Current output:"
    $_
    }
}

Then use this PowerShell command to get the encoded version:

[convert]::ToBase64String([Text.Encoding]::Unicode.GetBytes($code))

Then copy/paste the output of the command to your batch file:

powershell.exe -EncodedCommand DQAKAAkAIwBpAHQAZQByAGEAdABlACAAbgB1AG0AYgBlAHIAcwAgADEAIAB0AGgAcgBvAHUAZwBoACAAMQAwAA0ACgAJADEALgAuADEAMAAgAHwAIABmAG8AcgBlAGEAYwBoAC0AbwBiAGoAZQBjAHQAIAB7AA0ACgAJACMAIABqAHUAcwB0ACAAbwB1AHQAcAB1AHQAIAB0AGgAZQBtAA0ACgAJACIAQwB1AHIAcgBlAG4AdAAgAG8AdQB0AHAAdQB0ADoAIgANAAoACQAkAF8ADQAKAAkAfQANAAoA
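If you do this often, a small helper can produce the whole line in one step (a sketch; ConvertTo-BatchLine is a hypothetical name, not a built-in cmdlet):

# Turn a scriptblock into a ready-to-paste batch file line
function ConvertTo-BatchLine {
    param([scriptblock]$Code)
    $encoded = [Convert]::ToBase64String(
        [Text.Encoding]::Unicode.GetBytes($Code.ToString()))
    "powershell.exe -NoProfile -EncodedCommand $encoded"
}

ConvertTo-BatchLine { 1..10 | ForEach-Object { "Current output:"; $_ } }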

2. Keep the code as PowerShell but turn it to a string:

If the first approach for whatever reason does not work for you (e.g. you care about readability), you can try to flatten the script and pass it as a string:

  1. Take the PowerShell script.
  2. Remove all the comments ( everything that starts with #).
  3. Put ; at the end of each line.
  4. Remove all line breaks.
  5. Supply the string you get as the -command parameter for powershell.exe.

The reason for all of this is that powershell.exe (the executable which runs any PowerShell code) allows you either to start an external .ps1 script file (which often creates the additional complexity of having to maintain and ship two files) or to execute a single line of PowerShell code as the -command parameter. Hence the requirement to flatten the script and turn something like this:

#iterate numbers 1 through 10
1..10 | foreach-object {
# just output them
"Current output:"
$_
}

into:

powershell.exe -command "1..10 | foreach-object { 'Current output:'; $_; }"

 

See also this blog post by MoW on making PowerShell pass its exit code to command files.

How to Schedule a PowerShell Script

 

How to Schedule a PowerShell Script

..assuming that you are running PowerShell 2.0..

1. Get your script ready

Surprising as it might sound, your script might actually not be ready to run in a scheduled task as is. This happens if it uses cmdlets from a particular PowerShell module or snapin, and it worked for you interactively because you used a specialized shell (e.g. Exchange Management Shell) or a tool like PowerGUI Script Editor which loads the modules for you.

If you indeed are using any non-default cmdlets, simply add Add-PSSnapin or Import-Module to the beginning of the script. For example:

Add-PSSnapin Quest.ActiveRoles.ADManagement

2. Schedule the task

To schedule a task, simply start Windows Task Scheduler and schedule the powershell.exe executable, passing the script execution command as a parameter. The -File parameter is the default one, so simply specifying the script path as the argument works in a lot of cases.

You can find powershell.exe in your system32\WindowsPowerShell\v1.0 folder.
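If you prefer the command line to the Task Scheduler UI, schtasks.exe can create the same task (the task name, schedule, and paths here are illustrative):

schtasks /Create /TN "Run hello.ps1" /SC DAILY /ST 06:00 /TR "C:\Windows\System32\WindowsPowerShell\v1.0\powershell.exe -NoProfile -File c:\scripts\hello.ps1"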

3. Report task success or failure

If you want your script to report success or failure (or some sort of other numerical result) simply use the exit keyword in the script to pass the value, e.g.:

exit 4

Then your Windows Task Scheduler will show the value in the Last Run Result column (you might need to hit F5 to refresh the column in the Task Scheduler).

4. Passing parameters

If you need to pass parameters things get a little trickier. Say, you have a script which adds two numbers:

param($a=2, $b=2)
"Advanced calculations ahead"
exit $a + $b

To pass the numbers as parameters, you would want to use powershell.exe -Command instead of powershell.exe -File. This -Command argument will then have the script invocation operator &, path to the script, and the parameters. E.g.:

Program: C:\Windows\System32\WindowsPowerShell\v1.0\powershell.exe
Add argument (optional): -Command "& c:\scripts\hello.ps1 -a 2 -b 3"

If you want to also get your exit code from the script, you would need to re-transmit that by adding exit $LASTEXITCODE to the command (I learnt this tip from MoW). E.g.

Program: C:\Windows\System32\WindowsPowerShell\v1.0\powershell.exe
Add argument (optional): -Command "& c:\scripts\hello.ps1 -a 2 -b 3; exit $LASTEXITCODE"

5. Run x86 PowerShell on x64 Windows

On 64-bit versions of Windows you actually have both 64-bit and 32-bit versions of PowerShell. In most cases you don’t care but in some cases (e.g. specific COM objects being used) you might need specifically a 32-bit version. To get that to run, simply pick the proper executable when you schedule the task:

Regular PowerShell (64-bit version on 64-bit Windows): %SystemRoot%\system32\WindowsPowerShell\v1.0\powershell.exe

32-bit PowerShell (x86): %SystemRoot%\syswow64\WindowsPowerShell\v1.0\powershell.exe

6. Other options

To learn about all the parameters the PowerShell executable has, simply run it with the /? option (from either cmd.exe or a PowerShell session).

I normally use -noprofile to make sure that nothing in the PowerShell profile interferes with the task.

Also, if your Execution Policy does not allow running scripts, the -ExecutionPolicy parameter comes in handy, allowing you to make an exception just for this task. Note that with -File everything after the script path is passed to the script as arguments, so -ExecutionPolicy has to come before it. E.g.:

c:\Windows\System32\WindowsPowerShell\v1.0\powershell.exe -NoProfile -ExecutionPolicy RemoteSigned -File c:\scripts\hello.ps1

Here are some other parameters provided by PowerShell:

-PSConsoleFile
    Loads the specified Windows PowerShell console file. To create a console
    file, use Export-Console in Windows PowerShell.

I guess you could use that if you want the exact environment you have in the predefined shell from Exchange, AD, or SQL. E.g.: PowerShell -PSConsoleFile SqlSnapIn.Psc1

-Version
    Starts the specified version of Windows PowerShell.

I don’t think this one actually works.

-NoLogo
    Hides the copyright banner at startup.

Not really relevant for scheduled tasks, imho…

-NoExit
    Does not exit after running startup commands.

Might be useful for troubleshooting.

-Sta
    Start the shell using a single-threaded apartment.

If your script needs STA mode (if you don’t know what this is – most likely you don’t need this. ;) )

-NonInteractive
    Does not present an interactive prompt to the user.

Not really relevant for scheduled tasks, imho…

-InputFormat
    Describes the format of data sent to Windows PowerShell. Valid values are
    "Text" (text strings) or "XML" (serialized CLIXML format).

-OutputFormat
    Determines how output from Windows PowerShell is formatted. Valid values
    are "Text" (text strings) or "XML" (serialized CLIXML format).

Not sure how I would use those… Here’s one example of how -InputFormat None can help fix issues with PowerShell becoming unresponsive when waiting for input.

-WindowStyle
    Sets the window style to Normal, Minimized, Maximized or Hidden.

Unfortunately, I could not make this work. I tried to use -WindowStyle Hidden to avoid the PowerShell console window popping up during task execution but with no luck.

-EncodedCommand
    Accepts a base-64-encoded string version of a command. Use this parameter
    to submit commands to Windows PowerShell that require complex quotation
    marks or curly braces.

    # To use the -EncodedCommand parameter:
    $command = 'dir "c:\program files" '
    $bytes = [System.Text.Encoding]::Unicode.GetBytes($command)
    $encodedCommand = [Convert]::ToBase64String($bytes)
    powershell.exe -encodedCommand $encodedCommand

Can be useful when having to pass advanced expressions and getting issues with parser.

 

 

 

Here are a few more articles:

http://www.searchmarked.com/windows/how-to-schedule-a-windows-powershell-script.php

http://exchangeshare.wordpress.com/2008/12/08/how-to-schedule-powershell-script-for-an-exchange-task/

http://www.vistax64.com/powershell/23314-how-schedule-powershell-script.html

http://www.windowsitpro.com/blogs/PowerShellwithaPurpose/tabid/2248/entryid/72376/How-to-Schedule-PowerShell-Scripts.aspx