Tech News

180 Drupal 8's Allowed Formats module

Drupal 8's text format system provides a way for developers to manage user-generated content, regardless of whether the user is a trusted staff member or an anonymous commenter. With intelligent configuration of the various text formats for the various roles on the site, both the security and the usefulness of a site can be maximized.

Much like Drupal 7's Better Formats module, Drupal 8's Allowed Formats module allows a developer to limit the text formats that are available at the field level. Normally, every format the user has access to is available on every formatted field; this module allows developers to force a particular format to be applied to a particular field.

The configuration is available on the "edit" page of every formatted field. For example, consider the configuration of a "short summary" field.

In this example, only the "Minimal HTML" format will be available for this field, regardless of whether the current user has permission to use any of the other formats. For this application, the "short summary" field may only contain links, strong text, and emphasized text; by limiting authors to the "Minimal HTML" text format for this field, these constraints can be easily enforced.
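
Under the hood, this kind of restriction is stored as a third-party setting on the field's configuration. The excerpt below is only a rough sketch - the field name and format machine names are hypothetical, and the exact keys may differ from the module's actual schema:

field.field.node.article.field_short_summary.yml (hypothetical excerpt):

third_party_settings:
  allowed_formats:
    minimal_html: minimal_html
    basic_html: '0'
    full_html: '0'

Only the format left enabled here (minimal_html) remains selectable on the field, which matches the behavior described above.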

185 DrupalEasy Podcast 214 - Travis Carden - Drupal Spec Tool

Direct .mp3 file download.

Travis Carden (traviscarden), Senior Software Engineer at Acquia, joins Mike Anello to talk about the spreadsheet-based Drupal Spec Tool, a very cool tool that allows teams to specify different parts of a Drupal site and then generates diagrams and Behat tests.

Sponsors:
  • Drupal Aid - Drupal support and maintenance services. Get unlimited support, monthly maintenance, and unlimited small jobs starting at $99/mo.
  • WebEnabled.com - devPanel.

Follow us on Twitter.

Subscribe to our podcast on iTunes, Google Play or Miro. Listen to our podcast on Stitcher.

If you'd like to leave us a voicemail, call 321-396-2340. Please keep in mind that we might play your voicemail during one of our future podcasts. Feel free to call in with suggestions, rants, questions, or corrections. If you'd rather just send us an email, please use our contact page.

187 DrupalEasy Podcast 215 - Kaleem Clarkson - Drupal Event Organizers Working Group

Direct .mp3 file download.

Kaleem Clarkson, Operations Manager and Front-End Drupal Developer at Kennesaw State University and Drupal developer at blend me inc., joins Mike Anello. Listen in as they discuss the newly formed Drupal Event Organizers Working Group, and Mike breaks some bad news about the Marvel Universe to Kaleem.

Follow us on Twitter.

Subscribe to our podcast on iTunes, Google Play or Miro. Listen to our podcast on Stitcher.

If you'd like to leave us a voicemail, call 321-396-2340. Please keep in mind that we might play your voicemail during one of our future podcasts. Feel free to call in with suggestions, rants, questions, or corrections. If you'd rather just send us an email, please use our contact page.

189 Importing large databases to a Pantheon environment

Code flows up, data flows down.

I repeat this phrase in just about every workshop I teach; it is one of the basic principles of being a professional web developer. The idea is that we should work locally, then push our changes (using Git) up to the project's dev, then QA, then live environments. The project's data (the database and files directory, for Drupal sites) flows in the opposite direction: we should only move data "down" - from live to QA, live to dev, or live to local.

There are, of course, exceptions to every rule, and this case is no different.

One exception is when the project is just getting started. Consider the case where you've started a new project locally, reached the first milestone of development, and are ready to move everything to a shared development environment where the client can catch their first glimpse of the project. In this case, you'll likely be moving everything "up" - code, database, and files.

I ran into this exact scenario recently: I was migrating a rather large site to Drupal, had the initial migration looking good, and was in the process of getting it up-and-running on Pantheon. I successfully pushed the code and SFTPed the 1.6GB files directory to the Pantheon dev environment. The database, however, was larger than the 100MB maximum Pantheon allows to be uploaded through the browser, so I was using their "URL" method.

My plan was to put the database dump in a public Dropbox folder, then copy/paste the URL of the file into the Pantheon Dashboard interface. Unfortunately, it didn't work. I tried both .sql and .sql.gz formats, and I tried doing the database import using Terminus (Pantheon's command-line interface) - each time I was given either no error message or one that wasn't very helpful.

The solution? It turns out to be a bit of a Dropbox issue, albeit one that is pretty easy to fix.

When you copy the URL for a public file on Dropbox, the URL ends in dl=0 - it turns out this prevents Pantheon from importing the file. Simply change it to dl=1 and the problem is solved (this works in both the Dashboard and Terminus)!
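
To make the difference concrete (the file path below is made up - use your own shared link), the only change is the final query parameter:

https://www.dropbox.com/s/abc123/mysite-db.sql.gz?dl=0   (as copied from Dropbox; the import fails)
https://www.dropbox.com/s/abc123/mysite-db.sql.gz?dl=1   (forces a direct download; the import works)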

181 New book: Local Web Development with DDEV Explained

It's no secret that I'm a fan of Drud Technology's DDEV-Local web development tool. I selected it as my local development tool of choice for both my clients and my Drupal Career Online students after an exhaustive search. I've been teaching monthly 2-hour online workshops getting folks up-and-running with DDEV, and I've taught numerous full day "Getting started with DDEV" workshops at various Drupal events around the United States.

Since I've been writing, testing, and refining curriculum related to DDEV for well over a year now, it made sense to take everything I've learned and put it in a format that makes it available to even more folks looking to easily adopt a professional local development environment. I'm super-happy to announce that the book is now available for purchase on amazon.com at a price designed to get it into as many hands as possible - just $5.99 for a digital copy and $9.99 for the dead tree edition.

This first book, Local Web Development with DDEV Explained, is the result of a partnership with Steve Burge and the rest of the fine folks at OSTraining, which is the publisher. They've allowed me to retain full control of the book while tapping into OSTraining's extensive experience in publishing and marketing books about open source content management systems.

The book covers the full range of topics related to local web development and DDEV. Topics covered include:

  • Why a professional local development environment is important.
  • What a professional local development workflow looks like.
  • Installing DDEV on Mac OS X, Windows 10, and Ubuntu.
  • Step-by-step example of starting a new Drupal 8 project with Composer and DDEV.
  • Step-by-step example of getting an existing Drupal project up-and-running with DDEV.
  • Adding a Solr container.
  • Common workflows using DDEV.
  • Extending DDEV with hooks.
  • Using Xdebug with DDEV and PhpStorm.

The bulk of the book's content is straight from my training curriculum, so you can be sure that it is tried-and-true, and, as always, reflects only best practices. My goal is always to teach the right way to accomplish a task - no hacks or shortcuts.

My goal is to update the book several times per year, with a list of topics for the first revision already growing. I'll be starting on it in the next few days! By purchasing a digital copy, you'll automatically get updates to the book as they're released. 

190 Drupal 8 Migrate - importing clean data via .csv file

One of the pillars of the consulting side of the work we do here at DrupalEasy is data migration into Drupal sites. Over the past few years, we've been focused on migrating data into Drupal 8 using the most excellent core migrate modules along with contrib modules like Migrate Tools, Migrate Plus, and Migrate Source CSV.

If there's one thing I've learned over the years, it's that data to be migrated is never, ever, ever, ever, never as clean as it is claimed to be. There's always something that has to be massaged during the migration process in order to get things working.

One of the most common things I see when migrating data from a .csv is strings with trailing spaces. If you take a cursory look at the data in a spreadsheet, you might see something like "Bread", but if you look at the same data in a text .csv file, you'll see that the string is actually "Bread " (trailing space). If you're migrating this field to a vocabulary using the entity_lookup process plugin, that trailing space will cause the term to not be found, and therefore not migrated.

The solution? Well, you could clean up the data, but there's actually a much easier approach that I use by default on almost all string data being migrated: I use the "callback" process plugin to call PHP's trim() function on incoming strings in the "process" section of the migration configuration. Here's an example:

field_topics:
  - plugin: callback
    callable: trim
    source: Topic
  - plugin: entity_lookup
    entity_type: taxonomy_term
    bundle: topics
    bundle_key: vid
    value_key: name
    ignore_case: true

Using this method allows for the incoming data to be a little dirty without affecting the migration.

182 DrupalEasy Podcast 212 - Commerce Guys: decoupling and roadmap with Bojan Zivanovic and Matt Glaman

Direct .mp3 file download.

Matt Glaman (mglaman) and Bojan Zivanovic (bojanz) join Mike live from Disney World to talk about decoupling Drupal Commerce, as well as the roadmap for Drupal Commerce as a project. We take a quick side trip into some blog posts Matt recently wrote about running all of Drupal core's automated tests in DDEV-Local.

Sponsors:
  • Drupal Aid - Drupal support and maintenance services. Get unlimited support, monthly maintenance, and unlimited small jobs starting at $99/mo.
  • WebEnabled.com - devPanel.

Follow us on Twitter.

Subscribe to our podcast on iTunes, Google Play or Miro. Listen to our podcast on Stitcher.

If you'd like to leave us a voicemail, call 321-396-2340. Please keep in mind that we might play your voicemail during one of our future podcasts. Feel free to call in with suggestions, rants, questions, or corrections. If you'd rather just send us an email, please use our contact page.

191 Discovering the most excellent format_date Drupal 8 migrate process plugin

I often work on migrations involving dates that need to be migrated into Drupal 8. While the dates are often entity created and updated dates, and therefore in Unix timestamp format, sometimes (when migrating events, for example) I'll need to migrate a date in some other format into Drupal's date field.

In the past, I've ended up writing a custom process plugin or callback to convert the date into the proper format for Drupal core's date field, but I recently discovered the format_date process plugin (I'm pretty sure I'm late to the party on this) and realized I was doing more work than I had to. 

The short version is this - the format_date plugin takes a date in any format (as long as it can be described using the standard PHP Date patterns) and converts it to the format Drupal requires. Oh, and it also has timezone support!

Here's an example of taking a datetime in the format "11/04/18 08:30:00" (Eastern time) and converting it to the format that Drupal's core date field requires - "2018-11-04T13:30:00" once the value is shifted to UTC.

field_eventdate/value:
  plugin: format_date
  from_format: 'm/d/y H:i:s'
  to_format: 'Y-m-d\TH:i:s'
  from_timezone: 'America/New_York'
  to_timezone: 'UTC'

It's pretty simple! Less custom code usually means fewer opportunities for mistakes and less code to maintain in the future!

183 Automatic removal of .git directories from Composer dependencies

If you've adopted a Composer-based Drupal 8 workflow (hopefully using the Drupal Composer/Drupal Project template) where you're keeping dependencies in your project's repository, then you've no doubt experienced the annoyance of a rogue .git directory ending up in one of your project's dependencies. This will always happen when you're using a -dev version of a Drupal module.

For example, as of the authoring of this Quicktip, the Field Redirection module does not yet have a stable release for Drupal 8. When added to a project using Composer, the results look like this:

Michaels-MacBook-Pro:dcoweek5 michael$ ddev composer require drupal/field_redirection
Executing [composer require drupal/field_redirection] at the project root (/var/www/html in the container, /Users/michael/sites/dcoweek5 on the host)
Using version 2.x-dev for drupal/field_redirection
./composer.json has been updated
> DrupalProject\composer\ScriptHandler::checkComposerVersion
Loading composer repositories with package information
Updating dependencies (including require-dev)
Package operations: 1 install, 0 updates, 0 removals
  - Installing drupal/field_redirection (dev-2.x e1c30f2): Cloning e1c30f24f9 from cache
Writing lock file

Notice on the "Installing drupal/field_redirection..." line, it indicates that the project is cloned, not downloaded. This means that a .git directory has been created in the Field Redirection directory.

Note that I'm calling Composer as "ddev composer ..." - this is because I use DDEV as my local development environment and am utilizing its built-in Composer command.

If this goes unnoticed, and you attempt to do a normal "git add/commit" workflow for the new module, you'll end up with a somewhat-friendly Git message indicating that you now have a Git submodule.

Unfortunately, Git submodules aren't normally necessary or wanted when you are committing dependencies to the project repository. So, the typical solution is to delete the dependency's .git directory prior to performing the "git add/commit".

Luckily, there's an easier way! Travis Neilans recently pointed me in the direction of the Composer Cleanup VCS Directories project. By adding this as a dependency of your project, any .git directories that result from adding project dependencies will be automatically removed! First, install the Composer Cleanup VCS Directories project using:

composer require topfloor/composer-cleanup-vcs-dirs

Then, anytime you use "composer require" to add a project dependency that contains a .git directory, you'll see a message indicating that it has been automatically removed:

Deleting .git directory from /var/www/html/web/modules/contrib/field_redirection/.git

195 Some of my favorite (newer) DDEV things

Local development environments are in the midst of a bit of a renaissance, driven mainly by the maturation and adoption of Docker-based solutions.

I've been using (and recommending) DDEV for a while now, and one of the things I really like about it is the consistent pace of development. Since early February, there have been three minor releases of DDEV (1.6, 1.7, and 1.8), and each minor release brings new, often very useful features. Here are just a few of my recent favorites:

NFS Mounting

One of the few disadvantages of a Docker-based solution compared to a native local development solution is often performance (depending on your operating system and hardware). The DDEV 1.6 release introduced NFS mounting - a method of mounting your project files into the DDEV containers using NFS instead of the default Docker mount - resulting in significant performance gains. While using NFS mounting involves a one-time system setup, the results are well worth it.
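
Enabling it is a per-project setting; as a rough sketch (double-check the key name and the one-time host setup for your operating system in the DDEV docs), the flag lives in the project's DDEV configuration:

.ddev/config.yaml (excerpt):

nfs_mount_enabled: true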

Windows Chocolatey support

For Windows users, Chocolatey is similar to Homebrew for Mac OS X. With DDEV 1.6, you can now install DDEV using Chocolatey from the command line.
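
If memory serves, the package is simply named ddev, so installation is a one-liner, typically from an administrative shell (verify the package name on chocolatey.org before relying on this):

choco install ddev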

DrupalEasy has live, online, monthly 2-hour DDEV workshops - the next one is June 12, 2019.

Local DDEV config files

If you're working in a team environment, then having a local DDEV config file is a huge advantage. Prior to DDEV 1.7, if you wanted to utilize a DDEV post-start hook, it had to be configured in .ddev/config.yaml. In a team environment, this file is shared among all developers, so everyone would share the same post-start hook (even if they didn't want it). Starting in DDEV 1.7, you can have your own .ddev/config.local.yaml with only your additions or modifications to .ddev/config.yaml. For example, if you want to add a post-start hook and not share it with the rest of your team, just create a .ddev/config.local.yaml file and add it there.
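
As a sketch (the drush command is just an illustrative example of something one developer might want and another might not), a developer-specific post-start hook could look like this:

.ddev/config.local.yaml:

hooks:
  post-start:
    - exec: drush cr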

Easy local https by default!

It is pretty much standard practice these days to make your production environment available only via https, and it only makes sense for your local development environments to behave in the same manner. In DDEV 1.8, support for the most-excellent mkcert project was added, so with a one-time, super-easy mkcert installation on your host operating system, DDEV will automatically default to providing you with an https connection to your local development environment.
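
For reference, the one-time host setup is essentially installing mkcert (via your platform's package manager) and then creating and trusting its local certificate authority:

mkcert -install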

There are a lot of great reasons to use DDEV (check out more of them in my DDEV book!), and it is exciting to know that every six weeks or so, we'll be getting new ones with each DDEV release.
