PowerShell, Web APIs and the New-WebServiceProxy (Part 1)

There are plenty of articles out there on the basics of using APIs with PowerShell’s great New-WebServiceProxy cmdlet, but far fewer posts that explore how to get past the simplest of queries. Once I’d gotten past the basics of connecting and requesting some data, I found the information dried up pretty quickly – typically just short of what I actually wanted to do with the APIs. So, after a lot of banging my head against the wall, I figured I’d post some useful findings.

Two Different Authentication Scenarios

Authentication will take on a number of forms depending on your target server. If you’re lucky, you’ll be able to use the native functionality:

$MyCredentials = Get-Credential
$myURL = "https://myapi.myapiserver.com"
$myAPI = New-WebServiceProxy -Uri $MyURL -Credential $MyCredentials
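
One more variation worth knowing: if the target server accepts your logged-on Windows account (integrated authentication), you may be able to skip the prompt entirely with the -UseDefaultCredential switch – a minimal sketch, assuming the same example server as above:

# Pass the credentials of the current Windows session (no Get-Credential prompt)
$myURL = "https://myapi.myapiserver.com"
$myAPI = New-WebServiceProxy -Uri $myURL -UseDefaultCredential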

However, what about authentication that accepts the username and password as parameters? That’s pretty simple too – we can just pass our username and password to the API method that does the login – e.g.

$myURL = "https://myapi.myapiserver.com"
$myAPI = New-WebServiceProxy -Uri $MyURL
$token = $MyAPI.Login($credsUserName,$credsPassword)

and store the resulting API token for future authentication – simple!

OK, you think, so I can just do:

$myURL = "https://myapi.myapiserver.com"
$MyCredentials = Get-Credential
$myAPI = New-WebServiceProxy -Uri $MyURL
$token = $MyAPI.Login($MyCredentials.Username,$MyCredentials.Password)

Well, maybe not – you’ll probably get a login error suggesting you’ve simply got bad credentials. Not exactly – it’s because PowerShell is passing the password as a SecureString. You’ll need to pass it as a plain value the API can process:

$myURL = "https://myapi.myapiserver.com"
$MyCredentials = Get-Credential
$myAPI = New-WebServiceProxy -Uri $MyURL
$token = $MyAPI.Login($MyCredentials.Username,$MyCredentials.GetNetworkCredential().password)

So now you’ve got two different authentication methods up your sleeve. Let’s move on to something more interesting…

Sending data with AutogeneratedTypes

OK, the next common scenario I’ve seen people get caught out by is trying to submit data to the API. If you do a get call on most APIs you’ll get back a nice array of objects of an autogenerated type based on the web service proxy. PowerShell is trying to be helpful with these types, but it’s not immediately obvious how to leverage them.

For the rest of this example I’m going to use an API I’ve been working with a lot recently – the Tripwire Configuration Compliance Manager (CCM) API. I’m hoping the examples here make sense, but feel free to add a comment if not!

So I’ve connected to my API OK and I’ve got a token – how do I get some data out of it? Well, if I want to request the Network Profiles I can run the GetNetworkProfiles call. But how do I know this exists in the API, or which calls are useful? I could browse the API documentation, for one, or visit the web service API URI, which lists the calls. But my preferred method (since I’m already in PowerShell) is to use auto-complete to show me the API methods available – the PowerShell ISE will nicely list any methods available to you. So, if I have

# ... 
# Set up API WebService Proxy
$MyCredentials = Get-Credential
$CCMservice = New-WebServiceProxy -Uri $uri
# Get a token to use going forward...
$token = $CCMservice.Login($MyCredentials.UserName,$MyCredentials.GetNetworkCredential().Password)
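
As an aside, if you’re not in the ISE (or just want the whole list in one go), Get-Member will dump every method the proxy exposes:

# List all the methods the proxy object exposes, with their signatures
$CCMservice | Get-Member -MemberType Method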

and then enter $CCMservice. I can tab-complete my way through the methods. That means I can do:

# ... 
# Set up API WebService Proxy
$MyCredentials = Get-Credential
$CCMservice = New-WebServiceProxy -Uri $uri
# Get a token to use going forward...
$token = $CCMservice.Login($MyCredentials.UserName,$MyCredentials.GetNetworkCredential().Password)
# Get a list of network profiles from CCM
$CCMprofiles = $CCMservice.GetNetworkProfiles($token)

to store a list of profiles in $CCMprofiles – here’s the neat list of Network Profiles:

PROMPT# $CCMprofiles = $CCMservice.GetNetworkProfiles($token)
PROMPT# $CCMprofiles

ScanState : Stopped
InsertActive : True
InsertPassive : False
DefaultPingIntervalInMinutes : -1
IsPendingDelete : False
ProfileName : NetworkProfileA
ScanEngineID : 7
VulnScanProfileID : -1
IsOpenProfile : False
IsPassiveOnly : False
HasActiveTask : True
NetworkProfileGroupId : 1
ID : 6
IsPersisted : True
IsDirty : False

ScanState : Stopped
InsertActive : True
InsertPassive : False
DefaultPingIntervalInMinutes : -1
IsPendingDelete : False
ProfileName : NetworkProfileB
ScanEngineID : 5
VulnScanProfileID : -1
IsOpenProfile : False
IsPassiveOnly : False
HasActiveTask : True
NetworkProfileGroupId : 1
ID : 7
IsPersisted : True
IsDirty : False

Good start. I can already do useful things with this, like $CCMprofiles.Count to see how many network profiles I retrieved, list all the profile names with $CCMprofiles.ProfileName, or use a Where-Object filter to find the profiles where scanning is stopped: $CCMprofiles | Where-Object {$_.ScanState -eq "Stopped"}
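
Spelled out, those one-liners look like this (using the $CCMprofiles we stored above):

# How many profiles came back?
$CCMprofiles.Count
# Just the profile names
$CCMprofiles.ProfileName
# Only the profiles where scanning has stopped
$CCMprofiles | Where-Object {$_.ScanState -eq "Stopped"}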

I can also find out a bit more about what PowerShell has created – if I do

$CCMprofiles.GetType()

I see that I’ve got a custom object type of NetworkProfile[]

So I can be pretty certain that if I want to create a new network profile, I’ll need to send a NetworkProfile object to the API – how do I create one of those? Well, let’s start by exploring what PowerShell created when we connected to the API. I can run:

$CCMservice.GetType().Assembly.GetExportedTypes()

to get a list of all the autogenerated types from the API – this handily includes the NetworkProfile type. Let’s filter that big list of types down to just the one named after the object we retrieved earlier:

 $CCMservice.GetType().Assembly.GetExportedTypes() | where {$_.name -eq 'NetworkProfile'} | select -first 1

What we’re getting back here is the type the API knows as a network profile. Now we know the object type exists, we can create a new object based on this template, right?

$NewProfile = New-Object "NetworkProfile"

No… because we need to use the full name – NetworkProfile is just the nice, friendly short name we got back. But we haven’t seen a FullName anywhere, right? Well, that’s because we were only looking at a few attributes of the exported types earlier – if we re-run the GetExportedTypes command, this time with a Format-List, we’ll see a whole heap more:

PROMPT# $CCMservice.GetType().Assembly.GetExportedTypes() | where {$_.name -eq 'NetworkProfile'} | select -first 1 | fl

Module : mpc1vvfi.dll
Assembly : mpc1vvfi, Version=0.0.0.0, Culture=neutral, 
 PublicKeyToken=null
TypeHandle : System.RuntimeTypeHandle
DeclaringMethod : 
BaseType : Microsoft.PowerShell.Commands.NewWebserviceProxy.A
 utogeneratedTypes.WebServiceProxy1rcleWeb_WebClien
 tAPI_asmx_wsdl.NetworkProfileAuto
UnderlyingSystemType : Microsoft.PowerShell.Commands.NewWebserviceProxy.A
 utogeneratedTypes.WebServiceProxy1rcleWeb_WebClien
 tAPI_asmx_wsdl.NetworkProfile
FullName : Microsoft.PowerShell.Commands.NewWebserviceProxy.A
 utogeneratedTypes.WebServiceProxy1rcleWeb_WebClien
 tAPI_asmx_wsdl.NetworkProfile
AssemblyQualifiedName : Microsoft.PowerShell.Commands.NewWebserviceProxy.A
 utogeneratedTypes.WebServiceProxy1rcleWeb_WebClien
 tAPI_asmx_wsdl.NetworkProfile, mpc1vvfi, 
 Version=0.0.0.0, Culture=neutral, 
 PublicKeyToken=null
Namespace : Microsoft.PowerShell.Commands.NewWebserviceProxy.A
 utogeneratedTypes.WebServiceProxy1rcleWeb_WebClien
 tAPI_asmx_wsdl
GUID : 00416640-ec3e-3330-9344-3b9547e2c1d3
IsEnum : False
GenericParameterAttributes : 
IsSecurityCritical : True
IsSecuritySafeCritical : False
IsSecurityTransparent : False
IsGenericTypeDefinition : False
IsGenericParameter : False
GenericParameterPosition : 
IsGenericType : False
IsConstructedGenericType : False
ContainsGenericParameters : False
StructLayoutAttribute : System.Runtime.InteropServices.StructLayoutAttribu
 te
Name : NetworkProfile
MemberType : TypeInfo
DeclaringType : 
ReflectedType : 
MetadataToken : 33554520
GenericTypeParameters : {}
DeclaredConstructors : {Void .ctor()}
DeclaredEvents : {}
DeclaredFields : {scanStateField, insertActiveField, 
 insertPassiveField, 
 defaultPingIntervalInMinutesField}
DeclaredMembers : {Microsoft.PowerShell.Commands.NewWebserviceProxy.
 AutogeneratedTypes.WebServiceProxy1rcleWeb_WebClie
 ntAPI_asmx_wsdl.ScanningState get_ScanState(), 
 Void set_ScanState(Microsoft.PowerShell.Commands.N
 ewWebserviceProxy.AutogeneratedTypes.WebServicePro
 xy1rcleWeb_WebClientAPI_asmx_wsdl.ScanningState), 
 Boolean get_InsertActive(), Void 
 set_InsertActive(Boolean)...}
DeclaredMethods : {Microsoft.PowerShell.Commands.NewWebserviceProxy.
 AutogeneratedTypes.WebServiceProxy1rcleWeb_WebClie
 ntAPI_asmx_wsdl.ScanningState get_ScanState(), 
 Void set_ScanState(Microsoft.PowerShell.Commands.N
 ewWebserviceProxy.AutogeneratedTypes.WebServicePro
 xy1rcleWeb_WebClientAPI_asmx_wsdl.ScanningState), 
 Boolean get_InsertActive(), Void 
 set_InsertActive(Boolean)...}
DeclaredNestedTypes : {}
DeclaredProperties : {Microsoft.PowerShell.Commands.NewWebserviceProxy.
 AutogeneratedTypes.WebServiceProxy1rcleWeb_WebClie
 ntAPI_asmx_wsdl.ScanningState ScanState, Boolean 
 InsertActive, Boolean InsertPassive, Int32 
 DefaultPingIntervalInMinutes}
ImplementedInterfaces : {}
TypeInitializer :

There’s actually more – but for now let’s focus on what we need to do to create an object of the type we want – to do this we’ll need the FULL NAME. Great – that’s simply (!):

Microsoft.PowerShell.Commands.NewWebserviceProxy.AutogeneratedTypes.WebServiceProxy1rcleWeb_WebClientAPI_asmx_wsdl.NetworkProfile

Catchy? Well, you can make your life easier with the commands we noted earlier:

$myNewNetworkProfile = $CCMservice.GetType().Assembly.GetExportedTypes() | where {$_.name -eq 'NetworkProfile'} | select -first 1
$myNewNetworkProfileObj = $myNewNetworkProfile.FullName

The first line retrieves the exported type matched by our where clause and stores it in $myNewNetworkProfile. Since we only need the full name for our purposes, the second line stores just the FullName string in $myNewNetworkProfileObj (I could equally keep addressing it as $myNewNetworkProfile.FullName, but I’m lazy, and a plain string is easier to print later with Write-Host etc.!).

Cool – so let’s create the new network profile we want to add:

PROMPT# $ProfileToAdd = New-Object $myNewNetworkProfileObj

PROMPT# $ProfileToAdd

ScanState : Running
InsertActive : True
InsertPassive : False
DefaultPingIntervalInMinutes : 0
IsPendingDelete : False
ProfileName : 
ScanEngineID : 0
VulnScanProfileID : 0
IsOpenProfile : False
IsPassiveOnly : False
HasActiveTask : False
NetworkProfileGroupId : 0
ID : 0
IsPersisted : False
IsDirty : False

Cool – I can now set things like the profile name by simply doing

$ProfileToAdd.ProfileName = "My New Network Profile!"

A good start, but we’re not quite ready to send this to the API. Next time we’ll explore why not, and how to actually post the data – but that’s all for now!

Bonus SQL Pivot (% totals)

Probably something you’ll want to do is summarise the pivot tables I noted earlier. That’s not too hard – but you’ll need an additional column to do your summing on (I reused IP, since my table had no IDs to count):

SELECT r2.IP,     
[Passed] AS Pass,     
[Failed] as Fail, 
ROUND(([Passed] / CAST(([Passed] + [Failed]) AS REAL)) * 100, 0) AS PercentPassed, 
   ROUND(([Failed] / CAST(([Passed] + [Failed]) AS REAL)) * 100, 0) AS PercentFailed 
FROM    (SELECT IP As ID, IP, [TestResult] FROM #temptable1) r1
PIVOT (Count(ID) FOR [TestResult] IN ([Passed], [Failed])) AS r2
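
One caveat with the query above: if an IP somehow has zero Passed and zero Failed results, the division will throw a divide-by-zero error. A cheap guard is wrapping the denominator in NULLIF, so those percentages come back as NULL instead of erroring – same query, just with the guard added:

SELECT r2.IP,
[Passed] AS Pass,
[Failed] AS Fail,
ROUND(([Passed] / CAST(NULLIF([Passed] + [Failed], 0) AS REAL)) * 100, 0) AS PercentPassed,
ROUND(([Failed] / CAST(NULLIF([Passed] + [Failed], 0) AS REAL)) * 100, 0) AS PercentFailed
FROM    (SELECT IP As ID, IP, [TestResult] FROM #temptable1) r1
PIVOT (Count(ID) FOR [TestResult] IN ([Passed], [Failed])) AS r2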

Just want the percent? Just drop the pass and fail columns:

SELECT r2.IP,
ROUND(([Passed] / CAST(([Passed] + [Failed]) AS REAL)) * 100, 0) AS PercentPassed,    
ROUND(([Failed] / CAST(([Passed] + [Failed]) AS REAL)) * 100, 0) AS PercentFailed 
FROM    (SELECT IP As ID, IP, [TestResult] FROM #temptable1) r1
PIVOT (Count(ID) FOR [TestResult] IN ([Passed], [Failed])) AS r2

Finally, maybe you want to include Inconclusive results as failures for the purposes of summarising- in which case you’ll end up with:

SELECT a2.IP,     
[Passed] AS Pass,     
[Failed] as Fail, [Inconclusive] as Inconclusive,    
ROUND(([Passed] / CAST(([Passed] + ([Failed]+[Inconclusive])) AS REAL)) * 100, 0) AS PercentPassed,    
ROUND((([Failed] + [Inconclusive]) / CAST(([Passed] + ([Failed]+[Inconclusive])) AS REAL)) * 100, 0) AS PercentFailed 
FROM    (SELECT IP As ID, IP, [TestResult] FROM #temptable1) a1 -- We need to have a unique ID value to count, but we'll just use IP again!
PIVOT (Count(ID) FOR [TestResult] IN ([Passed], [Failed], [Inconclusive])) AS a2

SQL Pivots for the rest of us

Every time I have to build a SQL pivot I can’t remember how to do it. I end up searching online, struggling to work out what the heck people are talking about in their examples, and spending ages on “sales” data examples that are overly complicated for my simple pivot purposes – so I’m posting this as an aide-memoire…

Do I need to PIVOT – or can I summarise in some other way?

First of all, double-check whether you really need a pivot table! SQL has nice summary functions available in the form of GROUP BY. Let’s assume you have some data that looks like this (in a table we’ll call #temptable1):

IP TestResult
10.1.1.1 Pass
10.1.1.1 Pass
10.1.1.1 Fail
10.1.1.1 Inconclusive
10.1.1.2 Pass
10.1.1.2 Fail
10.1.1.2 Pass

Etc.

If I want a simple summary table I can do:

 select COUNT(IP) as Devices, [TestResult]
 from #temptable1
 Group by [TestResult]

to give me:

Devices TestResult
4 Pass
2 Fail
1 Inconclusive

This gives us a good high level summary by aggregating (counting/summing the device count) – but not a lot of use if I want to get details for a specific IP address.

We could do:

 select ip, [TestResult], COUNT(IP) as Total
 from #temptable1
 Group by ip, [TestResult]

To give us

IP TestResult Total
10.1.1.1 Fail 1
10.1.1.1 Inconclusive 1
10.1.1.1 Pass 2
10.1.1.2 Fail 1
10.1.1.2 Pass 2

…and we can even filter by IP:

 select ip, [TestResult], COUNT(IP) as Total
 from #temptable1
 where ip = '10.1.1.1'
 Group by ip, [TestResult]

To show a single host:

IP TestResult Total
10.1.1.1 Fail 1
10.1.1.1 Inconclusive 1
10.1.1.1 Pass 2

But there are some limits here – for example, we might want to show an Inconclusive count of “0”, but in this summary, if an IP has no Inconclusive results, that row simply won’t show.

…A nasty way to brute force a solution…

OK, so we know the shape of the data we want – we could run a series of selects to get the results:

select Count(#temptable1.[TestResult]) As Pass, Ip from #temptable1 
where [TestResult] = 'Pass'
and ip = '10.1.1.1'
group by IP

select Count(#temptable1.[TestResult]) As Inconclusive, Ip from #temptable1 
where [TestResult] = 'Inconclusive'
and ip = '10.1.1.1'
group by IP

select Count(#temptable1.[TestResult]) As Fail, Ip from #temptable1 
where [TestResult] = 'Fail'
and ip = '10.1.1.1'
group by IP

select Count(#temptable1.[TestResult]) as TotalTests, Ip from #temptable1 
where ip = '10.1.1.1'
group by IP;

That’ll give us a series of result tables showing the Pass, Fail, and Inconclusive counts – but then we have a whole bunch of separate tables, rather than just one! In certain scenarios this will do just fine (and is perhaps easier to read than a pivot), but how can we aggregate these?

Well, we could brute force this now we have the select statements above:

Select (
select Count(#temptable1.[TestResult]) As Pass from #temptable1 
where [TestResult] = 'Pass'
and ip = '10.1.1.1'
group by IP ) as Passed,
(
select Count(#temptable1.[TestResult]) As Inconclusive from #temptable1 
where [TestResult] = 'Inconclusive'
and ip = '10.1.1.1'
group by IP) As Inconclusive,
(
select Count(#temptable1.[TestResult]) As Fail from #temptable1 
where [TestResult] = 'Fail'
and ip = '10.1.1.1'
group by IP ) As Failed,
(
select Count(#temptable1.[TestResult]) as TotalTests from #temptable1 
where ip = '10.1.1.1'
group by IP) As Total

giving us

Passed Inconclusive Failed Total
2 1 1 4

But that seems like a messy approach to our problem, and not very scalable!

The real pivot…

So let’s define how we want to summarise the data. In this case, let’s say the table below is the result we’d like:

IP Pass Fail Inconclusive
10.1.1.1 2 1 1
10.1.1.2 2 1 0

So we’ve PIVOTed the TestResult column and grouped by the IP address.

So we can start with a simple select statement laying out our table data: the IP column, the TestResult, and a new column we can do our count on later (we’ll call this ResultSUM).

select *
from
(
  select IP, [TestResult], [TestResult] as ResultSUM -- The columns we want, plus a result we do our sum on
  from #temptable1
) src

But now we need to tell SQL what values we want and what the possible headers for our PIVOT will be and populate the values:

Count(ResultSUM) -- We have to do a count here
for [TestResult] in ([Pass], [Fail], [Inconclusive]) -- The possible PIVOT Headers

OK – that should be it. We just need to tell SQL it’s a PIVOT too – so the final result should look something like this:

select *
from
(
  select IP, [TestResult], [TestResult] as ResultSUM -- The columns we want, plus a result we do our count on
  from #temptable1
) src
pivot
(
  Count(ResultSUM) -- We have to do a count here
  for [TestResult] in ([Pass], [Fail], [Inconclusive]) -- As the possible PIVOT Headers
) piv;

And that’s it!
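
One last aside: if you ever find yourself on a database engine that doesn’t support PIVOT, the same result table can be produced with plain GROUP BY and conditional aggregation – arguably easier to remember, too (shown against the same #temptable1 sample data):

select IP,
SUM(CASE WHEN [TestResult] = 'Pass' THEN 1 ELSE 0 END) AS Pass,
SUM(CASE WHEN [TestResult] = 'Fail' THEN 1 ELSE 0 END) AS Fail,
SUM(CASE WHEN [TestResult] = 'Inconclusive' THEN 1 ELSE 0 END) AS Inconclusive
from #temptable1
group by IP

Each CASE emits 1 when the row matches that column’s value and 0 otherwise, so the SUM per IP gives the same counts the PIVOT produced.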

Weekend Reads

I’ve been watching The Handmaid’s Tale with rapt attention – and the Guardian drops some hints about what it would like to see next season and what made this season work so well. It’s spot on with many of its observations, but doesn’t give anywhere near enough credit to the cast (and, in particular, its leading lady) for making this the most depressing but gripping tale I’ve seen on screen for some time. Game of Thrones is the only other show I watch that tries to explore “everyday horror” (although its era and fantasy elements may make the horrors of war seem like fantastical fiction, most of the horror of the show is, in fact, human decisions and the impact of war) – the rest of my TV I deliberately keep to far happier, lighter stuff (Glow, You, Me, Her and Fargo are all on my playlist at the moment and are all excellent watches).

If you want something longer to watch, the answer is almost certainly not the Emoji movie. The good news is that even though the movie itself sounds appalling, the reviews are immensely entertaining. It sounds like The Big Sick, instead, is the movie to watch. Unfortunately I’m travelling, so I’ll be away from the big screen for some time – hopefully I’ll catch it when it comes out on DVD.

For gaming, Sonic the Hedgehog is compared to… Lorde’s Green Light – and the comparison is spot on. Sonic is a classic game idea that sounds like it just shouldn’t work and yet, in a weird way, it just does. With reviews of Pyre coming in, I wonder if this article hints at the bigger problem with songs and games like this – how to subvert expectations and “the science” of good design and still rope players in. I’m intrigued to try Pyre all the more now.

On the geeky side, Secplicity has a breakdown of some Python code for managing your Watchguard firewalls. They touch on the topic of serverless I’ve mentioned before – there isn’t much on it in the piece, although the link to this intro to the topic is great.

Tracking Jeff Bezos’ wealth graphically is interesting, though it’s just as interesting to see the trajectory of Zuckerberg – it’ll be fascinating to see whose legacy lasts the longest and has the greatest positive impact on the world in the long run. My Echo doesn’t get a lot of use these days except as a talking clock and light switch, but the idea and potential of these devices remains intriguing to me (I’ve started writing an Echo app and suspect (whilst it certainly won’t be my app!) there is potential for a killer app in this market space that will drive already-OK sales into the stratosphere).

Talking of audio gadgets, people are lamenting the demise of the traditional iPod – I always coveted a hard-drive-based iPod with Click Wheel but never got one, although I did succumb to the “smart watch like” model and a Touch (won in a competition). Neither of those models was great, and both were crushed by the modern smartphone – I’m none too surprised these products are gone now, but it’s interesting to think how they helped open up the market for smartphones.

The card techniques demonstrated in this video of Franco Pascali are awesome – take four minutes out to watch cards move in ways you wouldn’t expect and listen to an emotional walkthrough of what “passion” and learning sound like. It may feel like an advert at times (and the production methods certainly lean into this) but it’s still a neat video.

On the politics side, David Pell (as you’ll see, a big source of my daily reads) is spot on about the things we can learn in difficult times – namely, who we are and what we care about.


Tuesday reads

I found this piece on a gun crime victim’s family depressing but enlightening – it reminded me how fortunate I am to live in a country where guns are exceedingly rare. Contrasting this with the struggles of a “Boy who loves too much” makes you realise it’s a complex world with so many moving parts that understanding the human mind only ever looks less possible over time – which, to my mind, feels strange.

On the subject of the mind, the BBC has found a way to put me off doing DIY (even more) – studies reveal that outsourcing jobs you don’t enjoy brings more reward for your money than simply buying something nice. I think, deep down, I knew this already, but I choose to do things I don’t enjoy because I see them as character building (hardship builds us up; outsourcing allows skills to lapse).

I wouldn’t normally comment on a “review” style piece, but the Verge has compared the MacBook and the entry-level MacBook Pro and succeeded in saying what pretty much everyone thinks about laptops these days: as long as you don’t go cheap, you can probably get an OK laptop – but look for a GREAT laptop and it’s hard to find something that balances a power user’s needs (I know I struggled hunting for a laptop and accepted a lot of compromises in the model I eventually went with).

Finally, what can I say about the VR whiskey experience – except I’d gladly give that a go…

Monday reads

Game developers talk about bugs over at EuroGamer – a decent piece, though I’m sure it’s a field with a lot more potential and insight (I’d hope most of the remarks about code being a point of pride are common sense, but I guess they may not be to everyone). I wonder if in 100 years’ time we’ll see students taught about the great code and software revolution like we teach literature today, exploring the kludgy fixes and beautiful examples just as we do the works of Shakespeare.

The BBC’s news item on Bitcoin is solid but perhaps overly dramatic in identifying the “fall” of Bitcoin – I imagine debates held in bank vaults in the early days of modern finance were often much fiercer than this one. I hope to look back at this link in a few years’ time and wonder what all the fuss was about.

The Verge’s deep dive into SoundCloud’s current dark era is also a great read. I’m not a massive music fan, but SoundCloud previously stood out to me as a source of great music, especially mixes, and now it seems like a music graveyard – not too surprising given the direction they’ve gone. From the piece it’s really hard to see what any of the swings they took were aimed at, or how they blew their early start quite so badly.

Weekend reads

My favourite read of the weekend is the Verge’s item on biohacking – when I first read about people adding magnets and chips to their bodies I was fascinated but totally unconvinced this was the future (since external application of such functions would probably work just as well), and it turns out I was probably right: obsolescence and lack of traction make this tech less and less appealing. With that said, if someone could offer me magnetic rings, non-permanent patches, and clothing which accepted tech and offered extensions to the feedback systems around me (I remember, and would love to try, a vibrating strip on the back that nudged you in the right direction for navigation) I would love to give it a go – otherwise tech moves too quickly for such systems to be my cup of tea.

On the gaming side, a few people have linked to the Dirty Gamification article that lays out the “horrors” of a perfect feedback loop in games. I still hold out hope that such loops can be used for good, but I’ll be honest and say I bounce right off games like this, so I wonder if I’ve got some immunity (no doubt from playing other games earlier in my life!) – I’m not sure that, even used for good, it would truly benefit me.

Away from reads (and decorating!), I tried out Gigantic and found it… generic. The concept feels very much like Battleborn before it – a hybrid of DOTA and a traditional shooter – with some nice moments in the back and forth between the guardians (effectively all your attacks are a build-up for a big boss fight), but there really wasn’t much to hold my attention, and after a match or two I already found myself asking why I’d play this over any other game. Perhaps with friends it could be a fun romp, but on its own it’s hard to pick out any reason why this would succeed where others have failed.

Friday Reads

It feels like only earlier in the week (it was) that I lamented the lack of nonviolent verbs, and then along comes a game with no violence at all – and it gets a good review over at RPS – Yonder. Equally, it’s nice to see people exploring sports games in Kickmen – a field that I feel is still under-explored in gaming (imagine tweaking all the physics and individual rules for traditional sports on the fly, just like kids often find themselves doing). If nothing else, the recent explosion of indie games and the ease of entry into game creation has opened up the chances of titles like these getting some interest.

Perhaps ripe for traditional game play, Have Gun Will Travel takes an interesting look at (in my mind at the very least) some extreme “prepping” – only a matter of time before the videogame adaptation hits your stores…

I’m a big fan of Cory Doctorow books, so the Verge’s interview made me particularly happy (since I didn’t realise he had a new book out – that’ll teach me to sign up to newsletters for such things!). I always enjoy his universes and, whilst his writing style is perhaps a bit light, his concepts are always rock solid so I’m looking forward to his latest.

Tuesday Reads

For years the specialist press has talked about security issues, but now we’re really seeing the impact, thanks to Petya. The lessons learned from this are hopefully ones everyone already knows (patching!), but I still see people who are apathetic about the topic – it’s not rocket science, people: accept that patches are a fact of life and plan for them, rather than hoping you don’t get caught out.

Rock Paper Shotgun totally stole one of my ideas (I’ll let them off) with Curated Stores: I think there’s a great opportunity for custom front ends with specialisms in Steam, but I also think there are challenges with any such approach (discoverability, and making them sustainable (e.g. paying the bills)).

Monday reads

Dead Cells’ weapon conversation over at Gamasutra is a good piece, even if the type of game is a big turn-off for me (difficulty levels are not my friend unless they have various shades of easy!). So many weapon choices (my FPS childhood taught me that one weapon per number key is probably optimal), and yet… The number of words gamers (and game designers) have available for combat is truly fascinating, and juxtaposes nicely with words for… the countryside over at the Guardian (old, but good). It makes me wonder what a game that used these words – and not the countless words the human world has built to describe pointy sticks and ammunition-throwing things – would look like (if it is, indeed, possible). It’s not a dig at Dead Cells (which looks beautiful and, by all accounts, is a great game with a lot of love behind it), but I find it interesting nonetheless to think of the effort put into the gaming dictionary, and why the word list from gaming manuals would probably make for an awful corpus of text (should game manuals be the only text left after the end of the world, or found by aliens trying to understand our culture).

On the technology front, I was intrigued by the idea of a “Smell Detector” (concern that the summer sun is getting the best of me is a regular feeling!) – and speaking of smelly, the news of Snapchat’s fight with Facebook continues, reminding me that Facebook’s behaviour is still pretty poor for a company that pushes its moral agenda so heavily (I expected more from a social network than from traditional technology companies, but that’s probably just me being naive). Via LinkedIn, I came across a good piece exploring the challenges of the changing industry and the need for us to start thinking “post-capitalism” – I can’t disagree that we need to build better foundations for the future rather than just carrying on relentlessly forward (there’s no need to stop progress, but it should be done with awareness). At least one tech news piece I read this morning had some positive progress – there are some really nice features in the latest test editions of Windows Server.

Finally, the “how did this get past the editors” award goes to PC Gamer’s article on DNS servers – it’s worrying when the comments seem better structured and more technically accurate than the article itself.