LaSalle Software News #7: Inbound Email, Wrap Up V1.2.0, Dive Into Automated Software Testing, Finally!

Monday May 2nd, 2016

Episode Summary

Welcome to my seventh LaSalle Software News podcast.

This is Bob Bloom, from Toronto Canada. 

Today is Monday, May 2nd, 2016. 

I publish LaSalle Software News monthly, at the top of the month, to update you on my LaSalle Software. 



Everything else seemed to stop in April in the pursuit of my first LaSalle Software paid gig. The web app itself is very modest, very targeted - focusing on inbound email processing. But what a journey it turned out to be. In the end, four new packages came out of it, three of which are the newest additions to my free open source LaSalle Software:

  • standard inbound email handling package;
  • MailGun inbound email parsing package;
  • token-based login package.

Handling inbound email is a whole different thing from sending outbound email. You send an email to an email API and whoosh! It's out. What about an inbound email?

It gets routed to a third party email API (MailGun, in my case). The email API parses the email fields -- subject, body, headers, signature, file attachments, etc. -- and then sends it all along to your web app as a POST request. 
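To make that flow concrete, here is a minimal sketch -- in Python purely for illustration, since the actual app is Laravel/PHP -- of what a web app does when MailGun forwards a parsed inbound email as a POST request. The field names (`sender`, `subject`, `body-plain`) and the signature scheme (HMAC-SHA256 over timestamp + token, keyed with your API key) follow MailGun's documented webhook format; the `handle_inbound` helper itself is a hypothetical name, not part of any package.

```python
import hashlib
import hmac


def verify_signature(api_key: str, timestamp: str, token: str, signature: str) -> bool:
    """MailGun signs every webhook POST so you can confirm it really came
    from them: HMAC-SHA256 over timestamp + token, keyed with your API key,
    hex-encoded."""
    expected = hmac.new(
        key=api_key.encode(),
        msg=(timestamp + token).encode(),
        digestmod=hashlib.sha256,
    ).hexdigest()
    # Constant-time comparison to avoid timing attacks.
    return hmac.compare_digest(expected, signature)


def handle_inbound(api_key: str, post_fields: dict) -> dict:
    """Hypothetical handler: reject forgeries, then pick out the
    already-parsed email fields MailGun includes in the POST body."""
    if not verify_signature(api_key,
                            post_fields["timestamp"],
                            post_fields["token"],
                            post_fields["signature"]):
        raise ValueError("signature mismatch: not a genuine MailGun POST")
    return {
        "from": post_fields["sender"],
        "subject": post_fields["subject"],
        "body": post_fields["body-plain"],
    }
```

The point of the sketch is that, unlike outbound mail, your app is on the receiving end of someone else's HTTP request, so it has to authenticate the request before trusting any of the parsed fields.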

Well, I figured it out, in what became one single bloated package, which I then extracted and refactored into the three new FOSS packages.



Developing these new features for my first paid LaSalle Software web app is wonderful. And, by the way, this web app is in a full test live deployment. That is, the deployment is completely live, but it's deployed on a test domain using all my in-house accounts. As wonderful as this is going, I have not otherwise progressed with my LaSalle Software's version 1.2.0 release. 

I'm in the middle of creating customer records initiated by events, rather than by forms. I need to finish this, which includes some refactoring.

There's a whole bunch of little things that I want to do. In no particular order: perhaps take Envoyer out for a spin; try Cloudflare; custom exception views; pushing to multiple repository sites; use the latest Spatie backup package, and take Akeeba Solo out for a spin; finish my Service Times package. A lot of mish-mash has been accumulating on my desk that I want to complete. Not necessarily features, but some dev-ops too. 

I feel like things need to settle down. There's too much mish-mash to go through. I'm in the middle of working on a couple of packages. I want to close out my four software sites and, in their place, do a new web app for my new domain.

I want to add to my burdens and get my Granular Permissions package done. Except that I've concluded that this package is two packages in one: a "Roles and Permissions" package and a "Workflow" package. The workflow aspect is something I'm going to defer. Instead, I'm going to focus strictly on the "Roles and Permissions" stuff, which I *should* be able to do relatively quickly. 

I want to further add to my burdens by building a WordPress-ish "Media Manager" package, although I don't want this particular package to hold up the entire train. So it's deferrable. 



I have a meeting coming up to go over my Events package. What I want to do is lay the groundwork for a really good database model -- what I guess is traditionally known as an Entity Relationship Diagram. And then build the back-end exclusively using my Admin Form Automation. This is for my proof-of-concept volunteer Synagogue site, but I want a nice solid generalized Events package with which to grow scope since, in part, it has ecommerce tie-ins. Building the back-end should be a matter of devoting a few hours. I've learned that it really pays to figure out the database first before embarking on the back-end stuff. 



Once I have version 1.2.0 done, I'm going to carve out time to focus on learning automated testing. This is a very broad thing, from unit testing to Continuous Integration. But I'm months beyond when I thought I'd deep dive into it, and v1.2.0 is my "make it happen" time. 

There is so much to learn, and there is always so much coming on stream to learn and incorporate into my day-to-day programming. But I think, at this point, the most profound thing to learn is automated testing. I used to think that it was API development, but I've concluded that it's near impossible to do API work without automated testing. 



A month from now, I want LaSalle Software v1.2.0 released. I want my own two web apps freshly deployed with it. Hopefully, my second LaSalle Software paid gig will come to fruition shortly, and so I should be well on my way with that. My journey into automated testing should be well on its way. 

A month from now, I want to be in good shape to start my LaSalleMart API. Lingering things that I wanted to wrap up first will be wrapped up, and so I can focus on my ecommerce. 

My overall goal is to end this year with a really good suite of non-ecommerce software, and to be well on my way with LaSalleMart. This overall goal is looking very good. 



You know, when I do these podcasts, I ask myself "should I say this?" and "should I say that?". What usually happens is that if I'm asking that question, I just leave it out. Better to play it safe. Unfortunately, what ends up happening is that important context is left out. 

I just listened to the latest "Laracast Snippet" (link below). It inspires me to open up -- a bit. 

This month I've been hard on myself for not already having acquired the automated software testing skills that I know I should have. Skills that I planned on working on by basically stopping everything for a couple of weeks and immersing myself in it. I know that the way I develop software will change profoundly, a change I welcome. But I need to get over the hump of setting out on this new course, which I figure a couple of weeks will solve. The learning will never stop; in fact, getting over the hump will be the start of learning automated testing. The signal that I'm well on my way with automated software testing will be the moment when I say to myself, "What took me so long?". 

This business of being hard on myself is not productive. I hear podcasts that say that if I do not test, I'm not a real developer. I just listened to all the back episodes of a podcast devoted exclusively to software testing, and throughout I have been kicking myself for not diving in. Which has put me in some type of conflict, where I continue programming as I have in order to keep my development moving along, but fill my head with internal talk that borders on interfering with my programming. 

I am pretty sure that if I talk about why I'm not yet doing automated software testing, it will sound like I'm making excuses. See how this goes? So, I avoid the topic, even though it's a juicy relevant topic. And, of course, I want to talk about it, so it keeps percolating. 

So, now that I'm inspired... I'm going to dive in. I bought Jeffrey Way's "Testing Laravel" book, I studied his podcasts. I fired up phpunit. And, you know something, every time I dove in, it blew up on me. Frankly, it was frustrating -- really frustrating. Literally, days after I hit the issue of middleware interfering with phpunit, I saw this issue addressed in Laravel. Which confirmed that I was on the right track. But it confirmed something else to me: that this was an intensive topic that had a unique chicken-and-egg problem. Automated software testing is going to change how I program. For the better, of course. So it's not something I learn and just add to my inventory of Things Learned. It's going to change how I program features. Unfortunately, learning automated software testing is going to take some time. Plus, it's a moving target because I use the Laravel Framework, and there were impediments. Impediments which I am sure do not exist now -- in fact, I am sure there are additions now to Laravel that make the kind of testing I need to do much more do-able. 

The way I see it, automated software testing is a long-term skill. It's not something that I'm going to learn, and then magically absorb, and then forever after I can do automated software testing. BTW, I am testing my code, just not automating the tests. It's a classic scenario, I know. Automated software testing is something in which I'll be improving and enlarging my competency over months and months and years and years. 

I need to learn automated software testing, which will change the way I program. I need to change my programming in order to test my code in an automated fashion. So, how much automated software testing do I need to learn before my programming can change to support it? 

My conclusion is: not really a whole lot. What I really need to do is get over the initial hurdle. Then, from there, make learning automated testing a professional way of life. I have to learn automated testing in real time and, while I am learning the finer art and science of it, adjust my code to what I am learning so that my code supports automated testing. 

So, there is no doubt whatsoever in my mind that once I am over the initial hurdle my coding is going to slow down, from its already not-exactly-blistering-fast pace. 

There is also no doubt in my mind that I need to stop all my programming in order to get over that initial hurdle. My sense tells me that I need two weeks, maybe three. And perhaps what I really need is a week or so of intense struggling to catch on; followed by a week of intense struggling doing real programming that supports automated testing; followed by months of hitting all the real-life issues as they come up and dealing with 'em all properly. 

I've read the books. I've seen the podcasts. I might as well buy the t-shirt. But the issues that I see I cannot really understand until I am dealing with them. How do I test sending an email? It's pretty logical in the podcast. But when I am staring at my own code, I am lost. 

It really hasn't helped that my real life programming is a bit more complicated than the introductory books and podcasts. I have dozens of inter-related packages, which is not a scenario typically dealt with in the textbooks. 

Looking back, I should have, perhaps, carved out some time every day or so to just learn what I needed to learn to get going with automated testing. Just carve out time for the journey, without it impacting my real programming whatsoever. I've put pressure on myself to feed my learning back into my code as soon as possible, so my code supports automated software testing. I think, perhaps, looking back, that I should have separated out the initial learning from real-life usage. 

I know that I am spending too much time on my ad-hoc testing. How do I know that I am spending "too much" time on it? What is meant by "too much" time? Well, I have a sense of it, in part because I am repeating ad-hoc tests. I am saying to myself whilst programming, more than I care to admit, "I wish I could just run an automated test" instead of re-setting up the ad hoc test. However, I am literally frightened at the prospect of taking a shot at setting up that automated test, because I just know -- know! -- that it is really a rabbit hole, and I really would rather get The Job Done than get into Yet Another Journey of Discovery (YAJoD-TM). 

I quite enjoyed the recent Freakonomics podcasts about becoming good at things (links below). The gist is if you want to be good at something, you have to keep at it. And, you have to be very purposeful too, including very purposefully breaking out of your comfort zone in order to get through the struggling and failing that need to happen to achieve real progress. 

I was thinking while listening to these podcasts that I would not even be at this point if I had not made a huge investment in learning, and gone through the Journey of Discovery (JoD-TM) with Laravel, composer, git, (PSR-4) namespaces, (PSR-2) coding standards, removing switch statements from my regular programming, removing else statements whenever I can, moving closer to SOLID coding principles, etc. For me, this all had to happen first. Otherwise, I'd not have the base with which to structure my code in a way that supports automated software testing. So a quick pat on the back!



Enjoy a profitable month!


You have been listening to a SouthLaSalleMEDIA dot com production. Opinions expressed are not necessarily those of SouthLaSalleMEDIA dot com, nor of the organizations represented. Links and materials discussed on air are available in the Show Notes for this show. Information contained herein has been obtained from sources believed to be reliable, but is not guaranteed. Podcasts are released under a Creative Commons licence. Some rights are reserved. Email correspondence to the attention of Bob Bloom at info at SouthLaSalleMedia dot com.

Monthly report on the good, the bad, and the ugly of my ongoing LaSalle Software development. Produced by Bob Bloom, founder and developer of LaSalle Software.
