Saturday, March 28, 2020

Fuchsia Malaise Playtest Sessions


Last week, I ran three different sessions in my Cha'alt campaign.  This post will highlight the good, the bad, and the ugly.

Session One


This was on Roll20, and it was kind of a disaster.  Every player seemed to be on a different page or possibly different book.

They arrived at the golden gates of A'agrybah and learned a bit about the city and taxes.  Apparently, paying taxes is a fate worse than death, because several PCs (one in particular) were preoccupied by imminent tax collection for the entire scenario.  That one was always looking for somewhere to hide his meager wealth so the tax man couldn't take a cut.

Another PC just stalked the guy afraid of taxes like a creepy ex-girlfriend.  It was weird.

A third PC wanted to go off completely on his own to the palace (the rest of the party was in the tavern making friends).  And when he got there, he wanted to speak directly to the King.  That didn't happen.  Instead, the PC talked to a servant and when that wasn't good enough, he attacked a royal guard.  The PC almost killed the guard, but reinforcements were called.  And when multiple guards eventually killed the PC, the player complained about poor dice rolls.  Really, dude?  You chalk that fate up to bad luck?  Wow, ok...

A fourth PC decided he would take down the biggest guy in the bar who had two companions sitting next to him.  That lasted a whole 2 rounds until the PC was skewered dead on the end of the barbarian's obsidian blade.

The fifth PC tried to make the most of his adventuring time.  I felt bad for him, since his party just collapsed under the weight of their own distraction or incompetence.


Session Two


Same set-up as before (also a Roll20 game): the PCs enter A'agrybah.  This time they enter the marketplace, find a guide, and get a tour... until a couple of thieves make off with their coin purses.

They track the thieves to a blind alley and combat takes place.  It's quite the battle, ending with the most murderhobo PC ripping the head off a thief as he tried to run away.

It was fun, and everyone had a great time!

Each of those games lasted about 70 minutes, so there wasn't much time for anything else.


Session Three


This was face-to-face at my FLGS, and a blissful four hours long.  I'd been waiting weeks for this game.  Crummy weather (ice and snow) almost ruined things, but luckily there was a window.

BTW, all three sessions used my Crimson Dragon Slayer D20 hack of both OSR and 5e.  FYI, the cleric isn't broken!  Not only does he allow the party to keep fighting past the 2nd encounter without having to rest for the day, he also allows the wizard to keep casting spells (both during and between combats).

Patrick played a cleric moon-elf, Pat played a fighter blood-elf, Michael played another fighter blood-elf, and Steve played a sky-elf wizard (who became his own familiar).

Three of the four players ran through The Black Pyramid a couple months ago.  Instead of starting at 1st level, I suggested they make 3rd level characters and roll on that d100 past-event random table from How To Game Master Like A Fucking Boss.

I introduced some Fuchsia Malaise backstory - the PCs' settlement had been destroyed by off-worlder invaders intent on draining Cha'alt of its most precious resource - zoth.

In order to successfully raid the off-worlders' base, Elysium, they'd need either high-tech weapons, magic items, or both.

Rumors of relics and artifacts within The Black Pyramid abound, so off they went.  As per usual, just the tip of the pyramid was visible, shiny black, the majority submerged beneath irradiated sand.

They met a demon attempting to open up a gateway to some Demon Lord, and decided to help him.  Each PC was bestowed with an infernal blessing, and the gate-opening demon became an NPC hanger-on.

The PCs spent a little time with The Community, but nothing substantial happened there.  Then, they wandered into a Tiki bar and chatted with a demonic vacuum salesman (vacuum sales-demon?) and the negotiations began.  The vacuum demonstration included sucking up some NPC into another dimension (ok, maybe it was demonic).  Two vacuums were purchased.

After that transaction, the PCs felt this would be the perfect initiation for their new demon friend, Qa'atz.  His rite of passage would be to kill the salesman and loot his body as the other PCs watched.

Qa'atz got advantage on a surprise stab to the stomach.  Sadly, I rolled badly and Qa'atz missed horribly.  The salesman backed away and disintegrated him in one shot (he rolled badly on the saving throw, too).

Disavowing anything to do with Qa'atz, the PCs made their way to a room where a female demon (wow, lots of demons in this session... even for me!) was gifted a magic sword by an infernal council.  The PCs agreed to help her by being her champion and killing the titan Za'argon so she could have a ridiculously large ruby.

The magic sword had an unbreakable glass pommel and a variable plus to hit and damage, so one player suggested the current "+" number would be visible inside the little glass sphere as an indicator.  A brilliant idea and the kind of thing that could only happen in an RPG.

A hive of reptilian insects was in another room, and they became fodder for a fireball spell, as well as the magic blade.  A decent amount of treasure was found searching the honeycombs.

A wandering humanoid offered to sell his own magic sword.  It had a strange name, Kenyur-Trova'ak.  Not having a translation handy, I looked for the closest thing in the Viridian glossary at the back of How To Game Master Like A Fucking Boss and came up with "passionate oblivion"... a better translation would be "the strength of nothingness."  They ended up trading a vacuum and a turquoise slab for the sword.

The last room before Za'argon was full of his devoted worshipers, who occasionally offered themselves to the minor god when he was feeling especially peckish.  The PCs didn't think much of the worshipers, who passed the time playing strategy games and plucking their zita'ars.

The PCs had a whiz-bang idea of removing the chartreuse sphere from an adjacent triangle-shaped room so the worshipers could use it as a new age music room with excellent "triangular acoustics".

Unfortunately, Za'argon needed to roll a 1 to fail his save.  I rolled a 3, which was damn close.  So, he didn't immediately die.  But the sky-elf wizard did lob a fireball at him.  This little table was rolled on, and the wizard's player rolled a 6.  Ouch!  The fireball knocked off half of Za'argon's hit points, and the rest of the party dealt damage like true adventurers.

The cleric dropped down onto the titan's head so he could dish out a holy invocation to Lovecraftian abominations.  The fighters (they both had magic swords, but especially the one with the variable +) wailed on him.

Za'argon slapped all of them around with a couple rounds of tentacles, practically killing the cleric.  The wizard asked if it was possible for him to cast a spell to save his companion's life.  I deemed that it was.  The wizard could try preventing the cleric's soul from leaving his body.  The wizard cast his spell, and the cleric needed to make a saving throw.  Luckily, he made it.

Finally, the killing blow cut the titan in twain.  The PCs looted his chamber, the demon sorceress took her giant ruby, and her champion did not return the sword even though she was done with him.  She vanished into thin air before the PCs could turn on her.

Za'argon's horn bestowed enough power to make the party's spell-caster a Very Powerful Wizard (at long last).  He used that temporary power to destroy the enemies within Elysium.  Also, the PCs became 4th level.

Surviving The Black Pyramid was no easy feat.  For years and years, they will be able to tell their children and grandchildren of their bravery, cunning, and unbelievable fortune!

VS

p.s. One thing I like to do is look back at my sessions and see if there's anything I could have done differently or improved upon.  The demon sorceress should have been sexier and a potential love-interest for one or more PCs.

Star Wars Jedi Fallen Order Free Download

[Image: Star Wars Jedi: Fallen Order PC cover]

A galaxy-spanning adventure awaits in Star Wars Jedi: Fallen Order, a new third-person action-adventure title from Respawn Entertainment. This narratively driven, single-player game puts you in the role of a Jedi Padawan who narrowly escaped the purge of Order 66 following the events of Episode 3: Revenge of the Sith. On a quest to rebuild the Jedi Order, you must pick up the pieces of your shattered past to complete your training, develop new powerful Force abilities and master the art of the iconic lightsaber – all while staying one step ahead of the Empire and its deadly Inquisitors.

While mastering their abilities, players will engage in cinematically charged lightsaber and Force combat designed to deliver the kind of intense Star Wars lightsaber battles seen in the films. Players will need to approach enemies strategically, sizing up strengths and weaknesses while cleverly utilizing their Jedi training to overcome opponents and solve the mysteries that lie in their path.

Star Wars fans will recognize iconic locations, weapons, gear and enemies while also meeting a roster of fresh characters, locations, creatures, droids and adversaries new to Star Wars. As part of this authentic Star Wars story, fans will delve into a galaxy recently seized by the Empire. As a Jedi hero-turned-fugitive, players will need to fight for survival while exploring the mysteries of a long-extinct civilization all in an effort to rebuild the remnants of the Jedi Order as the Empire seeks to erase the Jedi completely.

GAMEPLAY AND SCREENSHOTS:

[Screenshots: Star Wars Jedi: Fallen Order PC gameplay]

DOWNLOAD GAME:

♢ Click or choose only one button below to download this game.
♢ View detailed instructions for downloading and installing the game here.
♢ Use 7-Zip to extract RAR, ZIP and ISO files. Install PowerISO to mount ISO files.

Star Wars Jedi Fallen Order Free Download
http://pasted.co/af29b5ae

INSTRUCTIONS FOR THIS GAME
➤ Download the game by clicking the button link provided above.
➤ Download the game from the host site and turn off your antivirus or Windows Defender to avoid errors.
➤ Once the download is complete, locate the downloaded file.
➤ To open the .iso file, use PowerISO, run the setup as admin, then install the game on your PC.
➤ Once the installation is complete, run the game's exe as admin and you can now play the game.
➤ Congratulations! You can now play this game for free on your PC.
➤ Note: If you like this video game, please buy it and support the developers of this game.

SYSTEM REQUIREMENTS:
(Your PC must have equivalent or higher specs in order to run this game.)


Minimum:
• Requires a 64-bit processor and operating system
• OS: 64-bit Windows 7/8.1/10
• Processor: AMD FX-6100/Intel i3-3220 or Equivalent
• Memory: 8 GB RAM
• Graphics: AMD Radeon HD 7750, NVIDIA GeForce GTX 650 or Equivalent
• DirectX: Version 11
• Storage: 55 GB available space

Recommended:
• Requires a 64-bit processor and operating system
• OS: 64-bit Windows 7/8.1/10
• Processor: AMD Ryzen 7 1700/Intel i7-6700K or Equivalent
• Memory: 16 GB RAM
• Graphics: AMD RX Vega 56, Nvidia GTX 1070/GTX1660Ti or Equivalent
• DirectX: Version 11
• Storage: 55 GB available space
Supported Languages: English, French, Italian, German, and Spanish.
If you have any questions or encountered broken links, please do not hesitate to comment below. :D

Music From The Lost Realm

Music – the thing we listen to while we drive the car, exercise, walk around, meet with friends, live. Music makes us dance, it keeps us on our toes, it makes us cry and sing along. It enhances and sometimes manipulates our feelings until we feel part of the story being told, or until we really feel we're there, side by side with our favorite heroes.

Video games abide by this rule too – players need to know that the fate of the world depends on what they're going to do next. How we underline their actions, especially with music, has big repercussions on the feelings a game can evoke.

In November we met with Kalle Ylitalo, composer of Oceanhorn 2: Knights of the Lost Realm, to discuss what fans can expect from the new soundtrack. Kalle had already worked on the Oceanhorn: Monster of the Uncharted Seas original soundtrack, alongside Japanese legends such as Nobuo Uematsu and Kenji Ito. When we met, Kalle was taking a small break with Arttu Jauhiainen, flutist, one of the six talented musicians attached to the project.

"One aspect that makes composing for Oceanhorn so pleasant, is that I can concentrate on creating beautiful melodies which often have a hint of Finnish folk music in them," says Ylitalo. "This is something that comes very intuitively for me, so composing music for both the new title and the previous has felt very natural. These melodies have their roots in my early childhood when my mom used to sing a lot of Finnish folk songs to me and my brother."



His involvement in the project has roots in the past: "I've been a friend of Heikki (Repo, Cornfox & Brothers Creative Director and Co-founder) for a long time. Back then, as a teenager, he was already developing games. I don't think I helped him out with the music at that time, but I am now!"



Kalle is currently composing the Oceanhorn 2: Knights of the Lost Realm soundtrack in a new direction. "I feel that the first game was more of a classic adventure, and the soundtrack reflected that. Oceanhorn 2 has more diverse elements, and I've been trying to create musical traditions for each of the cultures in the game." The Pirta theme is a good example of that. There's a shakuhachi flute there, and steam-pipe sounds. "I'm really happy with those tracks because they sound like no existing music culture that I know of," adds Ylitalo. "The role of the real instruments and musicians here is to really make the score come alive. Everyone can hear the difference it makes when a talented professional interprets a melody, compared to a MIDI instrument."



Arttu has worked on many different projects, but this is the first game he's worked on. "I haven't had a chance to play the game yet, but can't wait to try it when the music is implemented. It can't be anything other than great!"



Along with Arttu (Flute, Piccolo, Alto Flute), the game's score features Lauri Sallinen (Clarinet, Bass clarinet), Sanna Niemikunnas (Oboe, English horn), Rista Tuura (Violin), Anna Grundström (Cello), and József Hárs (Horn).

"In the first game, there was this flute theme played by the protagonist's father", said Kalle when we asked about how's the main team shaping up. "It was an excellent melody, so I decided to base the main theme on it. That is the only melody in Oceanhorn 2 we have used in multiple tracks. I can't say anything more, or it would be a spoiler!"

--

Want to receive these updates before anyone else? Subscribe to the Oceanhorn newsletter on https://www.oceanhorn.com


Monday, March 23, 2020

So Far Behind...

So many gaming things have happened in my life over the last few weeks that I haven't talked about.

First, I went to DesotoCon, in Kansas, back at the end of July. I started a blog post about it and will finish it, I promise. It's even going to be backdated so it will appear before this one. Not many (any?) photos from it though. Well, a few, I think.

The next week I went to Indianapolis for GenCon. Met a lot of great people, hung out with some friends from Thread Raiders, Saving Throw Show, and Dragons and Things (best Pathfinder liveplay stream, Fridays at 6:00 Pacific on Twitch). Bought a bunch of stuff. Again, it deserves its own post and I will work on that. A few more photos there.

I've also released the first product from Goblyn Head Press on DriveThruRPG. It's a supplement designed for D&D 5e called Sacred Sites. It was written by Eli Arndt, who you guys have seen me mention before around here. Nine different places you can encounter the sacred or profane. It has sold a few copies already and it's only been up about a week. Very excited about that. Probably deserves its own post, too.


  And we've gotten a few more sessions of Starfinder in. Kicked one guy out of our group, got another new player. Still sitting at three players so if anyone wants to join us in Santa Fe, TX (in Galveston County, on the mainland)...

   And I painted a few minis. Not much. I really need to get to work on the Pledge or I am screwed.

   Oh, two new display cases came in and I got one put together. Detolf from IKEA.


   And I have been drawing more maps on my Wacom tablet. So that's getting me closer to done with another Goblyn Head project.

   All in all, I guess I have been busy. Just not very good at reporting. I'll try to get caught up on all of that the next few days.

Friday, March 20, 2020

Exploring Monster Taming Mechanics In Final Fantasy XIII-2: Data Validation And Database Import

Continuing on with this miniseries of exploring the monster taming mechanics of Final Fantasy XIII-2, it's time to start building the database and populating it with the data that we collected from the short script that we wrote in the last article. The database will be part of a Ruby on Rails project, so we'll use the default SQLite3 development database. Before we can populate the database and start building the website around it, we need to make sure the data we parsed out of the FAQ is all okay with no typos or other corruption, meaning we need to validate our data. Once we do that, we can export it to a .csv file, start a new Rails project, and import the data into the database.


Validating a Collection of Data

Considering that we parsed out 164 monsters with dozens of properties each from the FAQ, we don't want to manually check all of that data to make sure every property that should be a number is a number and all property names are correctly spelled. That exercise would be way too tedious and error prone. This problem of validating the data sounds like it needs an extension to our script. Since we have the data in a list of hash tables, it should be fairly straightforward to create another hash table that can be used to validate each table in the list. The idea with this hash table is to have a set of valid properties as the keys in the table, and the values are regexes that should match each property value that they represent. These regexes will be more specific to each property, since those properties have already matched on the more general regexes that were used to collect the data in the first place. Additionally, every key in each monster hash should be in this template hash, and every template hash key should be in each monster hash. We could get even more detailed with our checks, but this validation should be enough to give us confidence in the data.

To get started, we'll build up the first couple entries in the template hash and write the validation loop. Once it's working, we can fill out the rest of the entries more easily. Here are the name and minimum base HP entries along with the validation loop:
PROPER_NAME_REGEX = /^\w[\w\s]*\w$/
NUMBER_REGEX = /^\d+(?:,\d{3})?$/

VALID_MONSTER = {
  "Name" => PROPER_NAME_REGEX,
  "Minimum Base HP" => NUMBER_REGEX
}

data.each do |monster|
  # Every property in the template must be present and match its regex.
  VALID_MONSTER.each do |key, regex|
    if monster.key?(key)
      unless monster[key] =~ regex
        puts "Monster #{monster["Name"]} has invalid property #{key}: #{monster[key]}."
      end
    else
      puts "Monster #{monster["Name"]} has missing property #{key}."
    end
  end

  # Every property on the monster must also appear in the template.
  monster.each do |key, value|
    unless VALID_MONSTER.key?(key)
      puts "Monster #{monster["Name"]} has extra property #{key}: #{value}."
    end
  end
end
This is a fair amount of code, so let's take it in parts. First, we define two regexes for a proper name and a number. The proper name regex is the same as part of our previous property value regex in that it matches on multiple words separated by whitespace, but it has two extra symbols at the beginning and end. The '^' at the beginning means that the next character in the pattern has to appear at the start of the string, and the '$' at the end means that the last character that matches has to be at the end of the string. Together, these symbols mean that the entire string needs to match the regex pattern.

The number regex is similar to the proper name regex, except that it matches on numbers instead of words. The (?:,\d{3}) group matches on a comma followed by three digits because the {3} pattern means that the previous character type, in this case a digit, must be repeated three times (the ?: at the start simply makes the group non-capturing). This group is optional, so the regex will match on 1234 as well as 1,234. The number regex is also wrapped in a '^' and a '$' so that the entire string must match the pattern.
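A quick check in irb (with the two constants above loaded) shows what the anchors buy us:

"1,234" =~ NUMBER_REGEX     # => 0 (match found, starting at index 0)
"1234" =~ NUMBER_REGEX      # => 0
" 1234" =~ NUMBER_REGEX     # => nil ('^' rejects the leading space)
"1234 gil" =~ NUMBER_REGEX  # => nil ('$' rejects the trailing text)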

The next constant is simply the start of our monster template hash with "Name" and "Minimum Base HP" entries. What follows is the validation loop, and it is laid out just as described above. First, we iterate through each monster in the data list that we have already populated with the monsters from the FAQ. Within each monster we iterate through every entry of the valid monster template. If the monster has the property we're looking at, we check if the property value matches the regex for that property. If it doesn't, we print out an error. If the property doesn't exist, we print out a different error. Then we iterate through every property of the monster, and if a property doesn't exist in the template, we print out another error.

If we run this script now, we end up with a ton of errors for extra properties because we haven't added those properties to the template, yet. However, from looking at the first few monsters' output, it appears that the other checks are working, so we can start filling out the rest of our template. We can quickly add in the obvious properties, checking the script periodically to make sure we haven't gone astray. The mostly finished template looks like this:
PROPER_NAME_REGEX = /^\w.*[\w)!%]$/
NUMBER_REGEX = /^\d+(?:,\d{3})?$/
SMALL_NUMBER_REGEX = /^(\d\d?\d?|N\/A)$/
PERCENTAGE_REGEX = /^(\d\d?\d?%|N\/A)$/
LIST_REGEX = /^((?:All )?\w+(?:, (?:All )?\w+)*|N\/A)$/
FREE_TEXT_REGEX = /^\S+(?:\s\S+)*$/
TIME_REGEX = /^\d\d?:\d\d$/

VALID_MONSTER = {
  "Name" => PROPER_NAME_REGEX,
  "Role" => PROPER_NAME_REGEX,
  "Location" => PROPER_NAME_REGEX,
  "Location2" => PROPER_NAME_REGEX,
  "Location3" => PROPER_NAME_REGEX,
  "Max Level" => SMALL_NUMBER_REGEX,
  "Speed" => SMALL_NUMBER_REGEX,
  "Tame Rate" => PERCENTAGE_REGEX,
  "Minimum Base HP" => NUMBER_REGEX,
  "Maximum Base HP" => NUMBER_REGEX,
  "Minimum Base Strength" => SMALL_NUMBER_REGEX,
  "Maximum Base Strength" => SMALL_NUMBER_REGEX,
  "Minimum Base Magic" => SMALL_NUMBER_REGEX,
  "Maximum Base Magic" => SMALL_NUMBER_REGEX,
  "Growth" => PROPER_NAME_REGEX,
  "Immune" => LIST_REGEX,
  "Resistant" => LIST_REGEX,
  "Halved" => LIST_REGEX,
  "Weak" => LIST_REGEX,
  "Constellation" => PROPER_NAME_REGEX,
  "Feral Link" => PROPER_NAME_REGEX,
  "Description" => FREE_TEXT_REGEX,
  "Type" => PROPER_NAME_REGEX,
  "Effect" => FREE_TEXT_REGEX,
  "Damage Modifier" => FREE_TEXT_REGEX,
  "Charge Time" => TIME_REGEX,
  "PS3 Combo" => FREE_TEXT_REGEX,
  "Xbox 360 Combo" => FREE_TEXT_REGEX,
  "Default Passive" => PROPER_NAME_REGEX,
  "Default Skill" => PROPER_NAME_REGEX,
  "Special Notes" => FREE_TEXT_REGEX,
}
Notice that the PROPER_NAME_REGEX pattern had to be relaxed to match on almost anything, as long as it starts with a letter and ends with a letter, ')', '!', or '%'. This compromise had to be made for skill names like "Strength +10%" or constellation names like "Flan (L)" or feral link names like "Items Please!" While these idiosyncrasies are annoying, the alternative is to make much more specific and complicated regexes. In most cases going to that extreme isn't worth it because the names that are being checked will be compared against names in other tables that we don't have, yet. Those data validation checks can be done later during data import when we have the other tables to check against. Waiting and comparing against other data reduces the risk that we introduce more errors from making the more complicated regexes, and we save time and effort as well.
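It's easy to convince yourself that those awkward names pass the relaxed pattern with another quick irb check:

"Strength +10%" =~ PROPER_NAME_REGEX  # => 0
"Flan (L)" =~ PROPER_NAME_REGEX       # => 0
"Items Please!" =~ PROPER_NAME_REGEX  # => 0
"+10% Strength" =~ PROPER_NAME_REGEX  # => nil (must start with a word character)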

The location property has an odd feature that makes it a bit difficult to handle. Some monsters appear in up to three different areas in the game, but it's only a handful of monsters that do this. Having multiple locations combined in the same property is less than ideal because we'll likely want to look up monsters by location in the database, and we'll want to index that field so each location value should be a unique name, not a list. Additionally, the FAQ puts each location on a separate line, but not prefixed with the "Location-----:" property name. This format causes problems for our script. To solve both problems at once, we can add "Location2" and "Location3" properties anywhere that a monster has a second or third location by directly editing the FAQ.
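As an illustration only (a hypothetical entry; the real FAQ lines differ), the hand-edit turns something like this:

Location-----: Bresha Ruins
               Yaschas Massif

into this:

Location-----: Bresha Ruins
Location2----: Yaschas Massif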

This template covers nearly all of the monster properties, except for the level skill and passive properties. We'll get to those properties in a second, but first we have another problem to fix. It turns out that the two location properties we added and the last three properties in the template don't always occur, so we have to modify our check on those properties slightly:
# ...
    elsif !["Location2", "Location3", "Default Passive", "Default Skill", "Special Notes"].include? key
      puts "Monster #{monster["Name"]} has missing property #{key}."
    end
# ...
We simply change the else branch of the loop that checks that all template properties are in the monster data into an elsif branch that only executes if the key is not one of those optional keys.

Now we're ready to tackle the level properties. What we don't want to do here is list every single level from 1 to 99 for both skill and passive properties. There has to be a better way! The easiest thing to do is add a check for if the key matches the pattern of "Lv. XX (Skill|Passive)" in the loop that checks if each monster property exists in the template, and accept it if the key matches and the value matches the PROPER_NAME_REGEX. This fix is shown in the following code:
LEVEL_PROP_REGEX = /^Lv\. \d\d (Skill|Passive)$/
# ...
  monster.each do |key, value|
    unless VALID_MONSTER.key?(key)
      if key =~ LEVEL_PROP_REGEX
        unless value =~ PROPER_NAME_REGEX
          puts "Monster #{monster["Name"]} has invalid level property #{key}: #{value}."
        end
      else
        puts "Monster #{monster["Name"]} has extra property #{key}: #{value}."
      end
    end
  end
# ...
I tried to make the conditional logic as simple and self-explanatory as possible. I find that simpler is better when it comes to logic because it's easy to make mistakes and let erroneous edge cases through. If this logic was any more complicated, I would break it out into named functions to make the intent clearer still.

With this addition to the data validation checks, we've significantly reduced the list of errors from the script output, and we can actually see some real typos that were in the FAQ. The most common typo was using "Lvl." instead of "Lv." and there are other assorted typos to deal with. We don't want to change the regexes to accept these typos because then they'll appear in the database, and we don't want to add code to the script to fix various random typos because that's just tedious nonsense. It's best to fix the typos in the FAQ and rerun the script. It's not too bad a task for these few mistakes.
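For the recurring "Lvl." mistake, a Ruby one-liner can patch the FAQ in place (the filename here is just a stand-in for wherever your copy of the FAQ lives):

# Rewrite every "Lvl." to "Lv." and keep a .bak backup of the original file.
ruby -pi.bak -e 'gsub(/\bLvl\./, "Lv.")' ffxiii2_faq.txt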

Exporting Monsters to a CSV File

Now that we have this nice data set of all of the monster properties we could ever want, we need to write it out to a .csv file so that we can then import it into the database. This is going to be some super complicated code. Are you ready? Here it goes:
require 'csv'

opts = { headers: data.reduce(&:merge).keys, write_headers: true }
CSV.open("monsters.csv", "wb", opts) do |csv|
  data.each { |hash| csv << hash }
end
Honestly, Ruby is one of my favorite languages. Things that you would think are complicated can be accomplished with ease. Because we already structured our data in a csv-friendly way as an array of hashes, all we have to do is run through each hash and write it out through the CSV::Writer with the '<<' operator.

We need to take care to enumerate all of the header names that we want in the .csv file, and that happens in the options that are passed to CSV.open. Specifically, headers: data.reduce(&:merge).keys tells the CSV::Writer what the list of header names is, and the writer is smart enough to put blank entries in wherever a particular header name is missing in the hash that it is currently writing out to the file. The way that code works to generate a list of header names is pretty slick, too. We simply tell the data array to use the Hash#merge function to combine all of the hashes into one hash that contains all of the keys. Since we don't care about the values that got merged in the process, we simply grab the keys from this merged hash, and voila, we have our headers.
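Here's a toy example (made-up values, not the real monster data) of how that reduction collects the union of keys:

monsters = [
  { "Name" => "Cait Sith", "Speed" => "38" },
  { "Name" => "Zwerg Scandroid", "Weak" => "Lightning" }
]
monsters.reduce(&:merge).keys  # => ["Name", "Speed", "Weak"]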

The .csv file that's generated from this script is a real beast, with 204 unique columns for our 164 monsters. Most of those columns are the sparsely populated level-specific skills and passive abilities. We'll have to find ways to deal with this sparsely populated matrix when using the database, but it should be much better than dealing with one or two fields of long lists of abilities. At least, that's what I've read in books on database design. I'm learning here, so we'll see how this goes in practice.

Importing Monsters Into a Database

This part isn't going to be quite as easy as exporting because we'll need to write a database schema, but it shouldn't be too bad. Before we get to that, we need to create a new Ruby on Rails project. I'll assume Ruby 2.5.0 or higher and Rails 6.0 are installed. If not, see the start of this Rails Getting Started guide to get that set up. We start a new Rails project by going to the directory where we want to create it and using this Rails command:
$ rails new ffxiii2_monster_taming
Rails generates the new project and a bunch of directories and files. Next, we descend into the new project and create a new model for monsters:
$ cd ffxiii2_monster_taming
$ rails generate model Monster name:string
In Rails model names are singular, hence "Monster" instead of "Monsters." We also include the first database attribute that will be a part of the migration that is generated with this command. We could list out all 204 attributes in the command along with their data types, but that would be terribly tedious. There's an easier way to get them into the migration, which starts out with this code to create the Monster table:
class CreateMonsters < ActiveRecord::Migration[6.0]
  def change
    create_table :monsters do |t|
      t.string :name

      t.timestamps
    end
  end
end
All we have to do is add the other 203 attributes along with their data types and we'll have a complete table ready to generate, but how do we do this efficiently? Conveniently, we already have a list of the attribute names as the header line in the monsters.csv file. We just have to copy that line into another file and do some search-and-replace operations on it to get the list into a form that can be used as the code in this migration file.

First, we'll want to make a couple changes in place so that the .csv header has the same names as the database attributes. This will make life easier when we import. All spaces should be replaced with underscores, and the periods in the "Lv." names should be removed. Finally, the whole line should be converted to lowercase to adhere to Rails conventions for attribute names. Once that's done, we can copy the header line to a new file, replace every comma with a newline character, and prefix each line with "      t.string " to add in the attribute types. They are almost all going to be strings, and it's simple to go back and change the few that are not to integers, floats, and times. I did this all in Vim, but any decent text editor should be up to the task (there's also a script sketch after the migration file below). Now we have a complete migration file:
class CreateMonsters < ActiveRecord::Migration[6.0]
  def change
    create_table :monsters do |t|
      t.string :name
      t.string :role
      t.string :location
      t.string :location2
      t.string :location3
      t.integer :max_level
      t.integer :speed
      t.string :tame_rate
      t.string :growth
      t.string :immune
      t.string :resistant
      t.string :halved
      t.string :weak
      t.string :constellation
      t.integer :minimum_base_hp
      t.integer :maximum_base_hp
      t.integer :minimum_base_strength
      t.integer :maximum_base_strength
      t.integer :minimum_base_magic
      t.integer :maximum_base_magic
      t.string :feral_link
      t.string :description
      t.string :monster_type
      t.string :effect
      t.float :damage_modifier
      t.time :charge_time
      t.string :ps3_combo
      t.string :xbox_360_combo
      t.string :default_passive
      t.string :default_skill
      t.string :special_notes
      t.string :lv_02_passive
      t.string :lv_02_skill
      # ...
      # over a hundred more lv_xx attributes
      # ...
      t.string :lv_99_passive
      t.string :lv_99_skill

      t.timestamps
    end
  end
end
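As promised, here's a throwaway script along the lines of that editor work (a sketch, assuming the header row of monsters.csv has already been renamed as described above); it prints the attribute lines instead, and the few integer, float, and time columns still need adjusting by hand:

require 'csv'

# Read just the header row of monsters.csv and emit one migration line per column.
CSV.open("monsters.csv", &:shift).each do |column|
  puts "      t.string :#{column}"
end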
Now, we can run this migration with the command:
$ rails db:migrate
And we have the beginnings of a monster table. We just need to populate it with our monsters. Rails 6.0 makes this task quite simple with a custom rake task, and since the database attributes have the same names as the .csv column headers, the import code is dead simple. In the lib/tasks/ directory, we can make a file called seed_monsters.rake with the following code:
require 'csv'

namespace :csv do

  desc "Import Monster CSV Data"
  task :import_monsters => :environment do

    csv_file_path = 'db/monsters.csv'

    CSV.foreach(csv_file_path, headers: true) do |row|
      # The CSV headers match the Monster attribute names, so each row
      # can be passed straight to create!.
      Monster.create!(row.to_hash)
      puts "#{row['name']} added!"
    end
  end
end
When we run this task, the code is going to loop through each line of the .csv file (that we make sure to put in db/monsters.csv), and create a monster in the database for each row in the file. We also print out the monster names so we can see it working. Then it's a simple matter of running this command:
$ rails csv:import_monsters
And we see all of the monster names printed out to the terminal, and the database is seeded with our 164 monsters.

We've accomplished a lot in this post with running some validation checks on the monster data, exporting it to a .csv file, creating a database table, and importing the monsters.csv file into that table. We still have plenty to do, creating and importing the other tables and relating the data between tables. That will be the goal for next time.

Thursday, March 19, 2020

Tech Book Face Off: Effective Python Vs. Data Science From Scratch

I must confess, I've used Python for quite some time without really learning most of the language. It's my go-to language for modeling embedded systems problems and doing data analysis, but I've picked up the language mostly through googling what I need and reading the abbreviated introductions of Python data science books. It was time to remedy that situation with the first book in this face-off: Effective Python: 59 Specific Ways to Write Better Python by Brett Slatkin. I didn't want a straight learn-a-programming-language book for this exercise because I already knew the basics and just wanted more depth. For the second book, I wanted to explore how machine learning libraries are actually implemented, so I picked up Data Science from Scratch: First Principles with Python by Joel Grus. These books don't seem directly related other than that they both use Python, but they are both books that look into how to use Python to write programs in an idiomatic way. Effective Python focuses more on the idiomatic part, and Data Science from Scratch focuses more on the writing programs part.

[Cover images: Effective Python vs. Data Science from Scratch]

Effective Python

I thought I had learned a decent amount of Python already, but this book shows that Python is much more than list comprehensions and remembering self everywhere inside classes. My prior knowledge on the subjects in the first couple chapters was fifty-fifty at best, and it went down from there. Slatkin packed this book with useful information and advice on how to use Python to its fullest potential, and it is worthwhile for anyone with only basic knowledge of the language to read through it.

The book is split into eight chapters with the title's 59 Python tips grouped into logical topics. The first chapter covers the basic syntax and library functions that anyone who has used the language for more than a few weeks will know, but the advice on how to best use these building blocks is where the book is most helpful. Things like avoiding using start, end, and stride all at once in slices or using enumerate instead of range are good recommendations that will make your Python code much cleaner and more understandable.

Sometimes the advice gets a bit far-fetched, though. For example, when recommending that you spell out the process of setting default function arguments, Slatkin proposed this method:

def get_first_int(values, key, default=0):
    found = values.get(key, [''])
    if found[0]:
        found = int(found[0])
    else:
        found = default
    return found
Over this alternative that uses the or operator's short-circuit behavior:
def get_first_int(values, key, default=0):
    found = values.get(key, [''])[0]
    return int(found or default)
He claimed that the first was more understandable, but I just found it more verbose. I actually prefer the second version. This example was the exception, though. I agreed and was impressed with nearly all of the rest of his advice.

The second chapter covered all things functions, including how to write generators and enforce keyword-only arguments. The next chapter, logically, moved into classes and inheritance, followed by metaclasses and attributes in the fourth chapter. What I liked about the items in these chapters was that Slatkin assumes the reader already knows the basic syntax so he spends his time describing how to use the more advanced features of Python most effectively. His advice is clear and direct so it's easy to follow and put to use.

Next up is chapter 5 on concurrency and parallelism. This chapter was great for understanding when to use threads, processes, and the other concurrency features of Python. It turns out that threads and processes have unique behavior (beyond processes just being heavier weight threads) because of the global interpreter lock (GIL):
The GIL has an important negative side effect. With programs written in languages like C++ or Java, having multiple threads of execution means your program could utilize multiple CPU cores at the same time. Although Python supports multiple threads of execution, the GIL causes only one of them to make forward progress at a time. This means that when you reach for threads to do parallel computation and speed up your Python programs, you will be sorely disappointed.
If you want to get true parallelism out of Python, you have to use processes or futures. Good to know. Even though this chapter was fairly short, it was full of useful advice like this, and it was possibly the most interesting part of the book.

The next chapter covered built-in modules, and specifically how to use some of the more complex parts of the standard library, like how to define decorators with functools.wraps, how to make some sense of datetime and time zones, and how to get precision right with decimal. Maybe these aren't the most interesting of topics, but they're necessary to get right.

Chapter 7 covers how to structure and document Python modules properly when you're collaborating with the rest of the community. These things probably aren't useful to everyone, but for those programmers working on open source libraries it's helpful to adhere to common conventions. The last chapter wraps up with advice for developing, debugging, and testing production level code. Since Python is a dynamic language with no static type checking, it's imperative to test any code you write. Slatkin relates a story about how one programmer he knew swore off ever using Python again because of a SyntaxError exception that was raised in a running production program, and he had this to say about it:
But I have to wonder, why wasn't the code tested before the program was deployed to production? Type safety isn't everything. You should always test your code, regardless of what language it's written in. However, I'll admit that the big difference between Python and many other languages is that the only way to have any confidence in a Python program is by writing tests. There is no veil of static type checking to make you feel safe.
I would have to agree. Every program needs to be tested because syntax errors should definitely be caught before releasing to production, and type errors are a small subset of all runtime errors that can occur in a program. If I was depending on the compiler to catch all of the bugs in my programs, I would have a heckuva lot more bugs causing problems in production. Not having a compiler to catch certain classes of errors shouldn't be a reason to give up the big productivity benefits of working in a dynamic language like Python.

I thoroughly enjoyed learning how to write better Python programs through the collection of pro tips in this book. Each tip was focused, relevant, and clear, and they all add up to a great advanced level book on Python. Even better, the next time I need to remember how to do concurrency or parallelism or how to write a proper function with keyword arguments, I'll know exactly where to look. If you want to learn how to write Python code the Pythonic way, I'd highly recommend reading through this book.

Data Science from Scratch

I didn't expect to enjoy this book quite as much as I did. I went into it expecting to learn about how to implement the fundamental tools of the trade for data science, and that was indeed what I got out of the book. But I also got a lighthearted, entertaining, and surprisingly easy-to-read tour of the basics of machine learning using Python. Joel Grus has a matter-of-fact writing style and a dry wit that I immediately took to and thoroughly enjoyed. These qualities made a potentially complex and confusing topic much easier to understand, and humorous to boot, like having an excellent tour guide in a museum who can explain medieval culture in detail while cracking jokes about how toilet paper wasn't invented until the 1850s.

Of course, like so many programming books, this book starts off with a primer on the Python language. I skipped this chapter and the next on drawing graphs, since I've had just about enough of language primers by now, especially for languages that I kind of already know. The real "from scratch" parts of the book start with chapter 4 on linear algebra, where Grus establishes the basic functions necessary for doing computations on vectors and matrices. The functions and classes shown throughout the book are well worth typing out in your own Python notebook or project folder and running through an interpreter, since they are constantly being used to build up tooling in later chapters from the more fundamental tools developed in earlier chapters. The progression of development from this chapter on linear algebra all the way to the end was excellent, and it flowed smoothly and logically over the course of the book.

The next few chapters were on statistics, probability, and their use with hypothesis testing and inference. Sometimes Grus glossed over important points here, like when explaining standard deviations he failed to mention that this metric only applies to (or at least applies best to) normal distributions. Distributions that deviate too much from the normal curve will not have meaningful standard deviations. I'm willing to cut him some slack, though, because he is covering things quickly and makes it clear that his goal is to show roughly what all of this stuff looks like in simple Python code, not to make everything rigorous and perfect. For instance, here's his gentle reminder on method in the probability chapter:
One could, were one so inclined, get really deep into the philosophy of what probability theory means. (This is best done over beers.) We won't be doing that.
He finishes up the introductory groundwork with a chapter on gradient descent, which is used extensively in the later machine learning algorithms. Then there are a couple chapters on gathering, cleaning, and munging data. He has some opinions about some API authors' choice of data format:
Sometimes an API provider hates you and only provides responses in XML.
And he has some good expectation setting for the beginner data scientist:
After you've identified the questions you're trying to answer and have gotten your hands on some data, you might be tempted to dive in and immediately start building models and getting answers. But you should resist this urge. Your first step should be to explore your data.
Data is never exactly in the form that you need to do what you want to do with it, so while the gathering and the munging is tedious, it's a necessary skill that separates the great data scientist from the merely mediocre. Once we're done learning how to whip our data into shape, it's off to the races, which is great because we're now halfway through this book.

The chapters on machine learning models, starting with chapter 12, are excellent. While Grus does not go into intricate detail on how to make the fastest, most efficient MLMs (machine learning models, not multi-level marketing), that is not the point. His objective is to show as clearly as possible what each of these algorithms looks like and that it is possible to understand how they work when shown in their essence. The models include k-nearest neighbors, naive Bayes, linear regression, multiple regression, logistic regression, decision trees, neural networks, and clustering. Each of these models is actually conceptually simple, and the models can be described in dozens of lines of code or less. These implementations may be doggedly slow for large data sets, but they're great for understanding the underlying ideas of each algorithm.

Threaded through each of these chapters are examples of how to use each of the statistical and machine learning tools that are being developed. These examples are presented within the context of the tasks given to a new data scientist who is an employee of a budding social media startup for…well…data scientists. I just have to say that it is truly amazing how many VPs a young startup can support, and I feel awfully sorry for this stalwart data scientist fulfilling all of their requests. This silliness definitely keeps the book moving along.

The next few chapters delve a bit deeper into some interesting problems in data science: natural language processing, network analysis (or graph algorithms), and recommender systems. These chapters were just as great as the others, and by now we've built up our data science tooling pretty well from the original basics of linear algebra and statistics. The one thing we haven't really talked about, yet, is databases. That's the topic of the 23rd chapter, where we implement some of the basic operations of SQL in Python in the most naive way possible. Once again it's surprising to see how little code is needed to implement things like SELECT or INNER JOIN as long as we don't give a flying hoot about performance.

Grus wraps things up with an explanation of the great and all-powerful MapReduce, and shows the basics of how it would be implemented with mapper and reducer functions and the plumbing to string it together. He does not get into how to distribute this implementation to a compute cluster, but that's the topic of other more complicated books. This one's done from scratch so like everything else, it's just the basics. That was all fine with me because the basics are really important, and knowing the basics well can lead you to a much deeper understanding of the more complex concepts much faster than if you were to try to dive into the deep end without knowing the basic strokes. This book provides that foundation, and it does it with flair. I highly recommend giving it a read.


Both Effective Python and Data Science from Scratch were excellent books, and together they could give a programmer a solid foundation in Python and data science as long as they already have some experience in the language. With that being said, Data Science from Scratch will not provide the knowledge on how to use the powerful data analysis and machine learning libraries like numpy, pandas, scikit-learn, and tensorflow. For that, you'll have to look elsewhere, but the advanced, idiomatic Python and fundamental data science principles are well covered between these two books.

Download Arizona Sunshine For PS4



DarKmooN | CUSA07980 | Update v1.03 | VR | HACKED

  • Release Date: Out Now
  • Genre: Action / Horror / First Person Shooter
  • Publisher: Vertigo Games
  • Developer: Vertigo Games / Jaywalkers Interactive







Arizona Sunshine puts you in the midst of a zombie apocalypse, exclusively in VR. Handle weapons with real-life movements, freely explore a post-apocalyptic world, and put your survival skills to the test with PlayStation®VR - putting the undead back to rest is more thrilling than ever before.


 DOWNLOAD LINKS

  DOWNLOAD DARKMOON VERSION

 DOWNLOAD PART 1
 DOWNLOAD PART 2
 DOWNLOAD UPDATE 1.03:

 CLICK TO DOWNLOAD 


GAME SIZE : 3.8 GB
Password: After 10$ payment is done