Parser upgrades alt -- redo 0.9.0 (#51)
* Revert "Add dynamic config & allow vars for non-config builds (#49)"

This reverts commit 006f16f.

We want to reapply these changes and reconcile the conflicts without overwriting anything.

* Merge --cvar into --var

* bugfix: handle adding variables to emptyset

* Bump version to 0.9.1
briandominick authored Aug 29, 2018
1 parent e5787b1 commit f593bf2
Showing 3 changed files with 164 additions and 37 deletions.
105 changes: 92 additions & 13 deletions README.adoc
@@ -1,4 +1,6 @@
= LiquiDoc
:toc: preamble

// tag::overview[]
LiquiDoc is a documentation build utility for true single-sourcing of technical content and data.
It is especially suited for documentation projects with various required output formats, but it is intended for any project with complex, versioned input data for use in docs, user interfaces, and even back-end code.
@@ -876,24 +878,92 @@ See <<per-build-properties-files>>.

=== Deploy Operations

It's not clear how deeply we will delve into deploy operations, since other build systems (such as rake) would seem far more suitable.
For testing purposes, however, spinning up a local webserver with the same stroke that you build a site is pretty rewarding and time saving.
Mainstream deployment platforms are probably better suited to tying all your operations together, but we plan to bake in a few common operations to help you get started.
For true build-and-deployment control, consider build tools such as Make, Rake, and Gradle, or deployment tools like Travis CI, CircleCI, and Jenkins.

==== Jekyll Serve

For testing purposes, however, spinning up a local webserver with the same stroke that you build a site is pretty rewarding and time saving, so we'll start there.

For now, this functionality is limited to adding a `--deploy` flag to your `liquidoc` command.
This will attempt to serve files from the *destination* set for the associated Jekyll build.
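
For instance, a build-and-serve invocation might look like the following sketch (the config filename is illustrative):

.Example build with local Jekyll serve
[source,shell]
----
bundle exec liquidoc -c _configs/build-docs.yml --deploy
----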

[WARNING]
Deployment of Jekyll sites is both limited and untested under nonstandard conditions.

==== Algolia Search Indexing for Jekyll

If you're using Jekyll to build sites, LiquiDoc makes indexing your files with the Algolia cloud search service a matter of configuration.
The heavy lifting is performed by the jekyll-algolia plugin, but LiquiDoc can handle indexing even a complex site by using the same configuration that built your HTML content (which is what Algolia actually indexes).

[NOTE]
You will need a free community (or premium) link:https://www.algolia.com/users/sign_up/hacker[Algolia account] to take advantage of Algolia's indexing service and REST API.
Simply create a named index, then visit the API Keys to collect the rest of the info you'll need to get going.

Two hard-coding steps are required to prep your source to handle Algolia index pushes.

. Add a block to your main Jekyll configuration file.
+
.Example Jekyll Algolia configuration
[source,yaml]
----
algolia:
  application_id: 'your-application-id' # <1>
  search_only_api_key: 'your-search-only-api-key' # <2>
  extensions_to_index: [adoc] # <3>
----
+
<1> From the top bar of your Algolia interface.
<2> From the API Keys screen of your Algolia interface.
<3> List as many extensions as apply, separated by commas.

. Add a block to your build config.
+
[source,yaml]
----
- action: render
  data: globals.yml
  builds:
    - backend: jekyll
      properties:
        files:
          - _configs/jekyll-global.yml
          - _configs/jekyll-portal-1.yml
        arguments:
          destination: build/site/user-basic
      attributes:
        portal_term: Guide
      search:
        index: 'portal-1'
----
+
The `index:` parameter is for the name of the index you are pushing to.
(An Algolia “app” can have multiple “indices”.)
If you have

Now you can call your same LiquiDoc build command with the `--search-index-push` or `--search-index-dry` flags along with the `--search-api-key='your-admin-api-key-here'` argument in order to invoke the indexing operation.
The `--search-index-dry` flag merely tests content packaging, whereas `--search-index-push` connects to the Algolia REST API and attempts to push your content for indexing and storage.

.Example Jekyll Algolia deployment
[source,shell]
----
bundle exec liquidoc -c _configs/build-docs.yml --search-index-push --search-api-key='90f556qaa456abh6j3w7e8c10t48c2i57'
----

This operation performs a complete build, including each render operation, before the Algolia plugin processes content and pushes each build to the indexing service, in turn.
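
To verify what would be packaged for indexing without contacting Algolia, the same command can be run with the dry-run flag instead; this sketch reuses the illustrative config path and API key from above:

.Example Jekyll Algolia dry run
[source,shell]
----
bundle exec liquidoc -c _configs/build-docs.yml --search-index-dry --search-api-key='90f556qaa456abh6j3w7e8c10t48c2i57'
----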

[TIP]
To add modern site search for your users, add link:https://community.algolia.com/instantsearch.js/[Algolia's InstantSearch functionality] to your front end!

== Configuring a LiquiDoc Build

In order to seriously explore and demonstrate this tool's ability to single-source a product's entire docs-integrated codebase, I have set out on a project for LiquiDoc to eat its own proverbial dog food.
That is, I'm using LiquiDoc to document LiquiDoc in a separate repository -- one which treats the LiquiDoc gem repository (_this_ repo) as a Git submodule.
That is, for purposes of coding, _all_ of the *liquidoc-gem* repo's code is accessible by way of an alias path, `products/liquidoc-gem`, from the base of the *liquidoc-docs* project.

[NOTE]
The <<config-settings-matrix,*Config Settings Matrix*>> has moved to the <<reference,Reference Section>>.

=== Gem Repo as Submodule

I am on record as preferring to _keep docs source in the same codebase as that of the product they reference_.
@@ -988,8 +1058,8 @@ That is, your config files can contain variables, conditionals, and iterative lo
All you have to do is (1) add Liquid tags to your YAML configuration file and (2) either (a) pass at least one _config variable_ to it when running your `liquidoc` command or (b) pass it the `--parse-config` flag.

Let's explore that second requirement.
If the Liquid markup in your config file expects variables, pass those variables on the `liquidoc` CLI using `--configvar key=value`.
Otherwise, if you are not passing variables to your config, instruct LiquiDoc to parse the config file using the `parseconfig` CLI option.
If the Liquid markup in your config file expects variables, pass those variables on the `liquidoc` CLI using `--var key=value`.
Otherwise, if you are not passing variables to your config, instruct LiquiDoc to parse the config file using the `--parse-config` CLI flag.
For example, this might be the case if your config merely contains some simple looping functionality to process lots of files.
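
A minimal sketch of that second case, assuming a variable-free Liquid-templated config (the filename is illustrative):

.Example parsing a dynamic config that expects no variables
[source,shell]
----
bundle exec liquidoc -c _configs/build-docs.yml --parse-config
----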

[[config-variables]]
@@ -1011,16 +1081,19 @@ Let's first take a look at a sample dynamic configuration to see if we can under

This config file wants to build a product datasheet for a specific product, which it expects to be indicated by a config variable called `product_slug`.

Config variables are passed using the `--configvar varname='var val'` format, where `varname` is any key that exists as a Liquid variable in your config file, and `'var val'` is its value, wrapped in single quotes.
Config variables are passed using the `--var varname='var val'` format, where `varname` is any key that exists as a Liquid variable in your config file, and `'var val'` is its value, wrapped in single quotes.
Let's say in this case, we want to generate the datasheet for the Windows Enterprise edition of our product.

[source,shell]
----
bundle exec liquidoc -c _configs/build-config.yml --configvar product_slug=win-ent
bundle exec liquidoc -c _configs/build-config.yml -v product_slug=win-ent
----

This will cause our dynamic configuration to look for a data block formatted like so: `data/products.yml:product_win-ent`.
So long as our `products.yml` file contains a top-level data structure called `product_win-ent`, we're off to the races.
[NOTE]
The `-v` option is an alias for `--var`.

This will cause our dynamic configuration to look for a data block formatted like so: `data/products.yml:win-ent`.
So long as our `products.yml` file contains a top-level data structure called `win-ent`, we're off to the races.
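
The full sample config appears earlier in this section; as a rough sketch of the idea (the file names and template paths here are placeholders, not the original sample), the Liquid-templated data reference might look like this:

.Sketch of a dynamic config keyed on product_slug
[source,yaml]
----
- action: parse
  data: data/products.yml:{{ vars.product_slug }}
  builds:
    - template: _templates/datasheet.asciidoc
      output: _build/datasheet-{{ vars.product_slug }}.adoc
----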

==== Eliminating Config Variables

@@ -1080,7 +1153,7 @@ LiquiDoc {{ vars.product.edition }}
</ul>
----

To set the values of `vars.edition` and `vars.env` in the config file, add for instance `--configvar edition=basic --configvar env=staging`
To set the values of `vars.edition` and `vars.env` in the config file, add, for instance, `--var edition=basic --var env=staging`.
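
In full, such an invocation might look like this sketch (the config filename is illustrative):

.Example passing multiple config variables
[source,shell]
----
bundle exec liquidoc -c _configs/build-config.yml --var edition=basic --var env=staging
----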

==== Constraining Build Options with Dynamic Configuration

@@ -1113,7 +1186,7 @@ For instance,
{% endif %}
----

With a build config like this, optionally invoking `--configvar recipe=nopdf`, for instance, will suppress the PDF substep during the build routine.
With a build config like this, optionally invoking `--var recipe=nopdf`, for instance, will suppress the PDF substep during the build routine.

==== Generating Starter Files with Dynamic Configs

@@ -1125,7 +1198,7 @@ Once you're comfortable with the concept of <<dynamic-config,dynamic LiquiDoc co
.Example liquidoc execution with a dynamic config file
[source,shell]
----
bundle exec liquidoc -c _configs/init_topic.yml --configvar slug=some_c_slug-string --configvar title='Some Topic Title for Publication'
bundle exec liquidoc -c _configs/init_topic.yml --var slug=some_c_slug-string --var title='Some Topic Title for Publication'
----

The example above commands an extraordinary LiquiDoc build routine.
@@ -1149,7 +1222,7 @@ The configuration file, `init_topic.yml`, creates topic-file stubs and schema-fi
----

As you can see, since this file has Liquid variables embedded in it, we must pass those variables during CLI execution in order for the build to work at all.
This config file is parsed just like any standard parse action, though it uses the `--configvar` option to ingest environment variables, scoped as `vars.` in the template (remember, in this case _the config file itself is the template_).
This config file is parsed just like any standard parse action, though it uses the `--var` option to ingest environment variables, scoped as `vars.` in the template (remember, in this case _the config file itself is the template_).
Parsing the configuration will essentially be the first act of the configured build routine, which will then run the parsed file, step by step.

.Example _parsed_ config file for topic-file stub generation
@@ -1303,6 +1376,12 @@ s| properties
| N/A
| Optional
|

s| search
| N/A
| N/A
| Optional
|
|===

pass:[*]The `output` setting is considered optional for render operations because static site generation targets a directory set in the SSG's config file.
94 changes: 71 additions & 23 deletions lib/liquidoc.rb
@@ -1,7 +1,7 @@
require 'liquidoc'
require 'optparse'
require 'yaml'
require 'json'
require 'optparse'
require 'liquid'
require 'asciidoctor'
require 'asciidoctor-pdf'
@@ -51,6 +51,8 @@
@verbose = false
@quiet = false
@explicit = false
@search_index = false
@search_index_dry = ''

# Instantiate the main Logger object, which is always running
@logger = Logger.new(STDOUT)
@@ -344,6 +346,7 @@ def initialize build, type
build['props'] = build['properties'] if build['properties']
@build = build
@type = type
@build['variables'] = {}
end

def template
@@ -445,6 +448,18 @@ def prop_files_array
# props['files'].force_array if props['files']
# end

def search
props['search']
end

def add_search_prop! prop
begin
self.search.merge!prop
rescue
raise "PropertyInsertionError"
end
end

# NOTE this section repeats in Class.AsciiDocument
def attributes
@build['attributes']
@@ -609,11 +624,7 @@ def get_data datasrc

# Pull in a semi-structured data file, converting contents to a Ruby hash
def ingest_data datasrc
# Must be passed a proper data object (there must be a better way to validate arg datatypes)
unless datasrc.is_a? Object
raise "InvalidDataObject"
end
# This proc should really begin here, once the datasrc object is in order
raise "InvalidDataObject" unless datasrc.is_a? Object
case datasrc.type
when "yml"
begin
@@ -677,7 +688,7 @@ def parse_regex data_file, pattern
end
end
end
output = {"data" => records}
output = records
rescue Exception => ex
@logger.error "Something went wrong trying to parse the free-form file. #{ex.class} thrown. #{ex.message}"
raise "Freeform parse error"
@@ -911,8 +922,8 @@ def generate_site doc, build
when "jekyll"
attrs = doc.attributes
build.add_config_file("_config.yml") unless build.prop_files_array
jekyll_config = YAML.load_file(build.prop_files_array[0]) # load the first Jekyll config file locally
attrs.merge! ({"base_dir" => jekyll_config['source']}) # Sets default Asciidoctor base_dir to == Jekyll root
jekyll = load_jekyll_data(build) # load the first Jekyll config file locally
attrs.merge! ({"base_dir" => jekyll['source']}) # Sets default Asciidoctor base_dir to == Jekyll root
# write all AsciiDoc attributes to a config file for Jekyll to ingest
attrs.merge!(build.attributes) if build.attributes
attrs = {"asciidoctor" => {"attributes" => attrs} }
@@ -925,14 +936,31 @@ def generate_site doc, build
if build.props['arguments']
opts_args = build.props['arguments'].to_opts_args
end
command = "bundle exec jekyll build --config #{config_list} #{opts_args} #{quiet}"
base_args = "--config #{config_list} #{opts_args}"
command = "bundle exec jekyll build #{base_args} #{quiet}"
if @search_index
# TODO enable config-based admin api key ingest once config is dynamic
command = algolia_index_cmd(build, @search_api_key, base_args)
@logger.warn "Search indexing failed." unless command
end
end
if command
@logger.info "Running #{command}"
@logger.debug "AsciiDoc attributes: #{doc.attributes.to_yaml} "
system command
end
@logger.info "Running #{command}"
@logger.debug "AsciiDoc attributes: #{doc.attributes.to_yaml} "
system command
jekyll_serve(build) if @jekyll_serve
end

def load_jekyll_data build
data = {}
build.prop_files_array.each do |file|
settings = YAML.load_file(file)
data.merge!settings if settings
end
return data
end

# ===
# DEPLOY procs
# ===
@@ -948,6 +976,20 @@ def jekyll_serve build
system command
end

def algolia_index_cmd build, apikey=nil, args
unless build.search and build.search['index']
@logger.warn "No index configuration found for build; jekyll-algolia operation skipped for this build."
return false
else
unless apikey
@logger.warn "No Algolia admin API key passed; skipping jekyll-algolia operation for this build."
return false
else
return "ALGOLIA_INDEX_NAME='#{build.search['index']}' ALGOLIA_API_KEY='#{apikey}' bundle exec jekyll algolia #{@search_index_dry} #{args} "
end
end
end

# ===
# Text manipulation Classes, Modules, procs, etc
# ===
@@ -1115,7 +1157,7 @@ def regexreplace input, regex, replacement=''
@quiet = true
end

opts.on("--explicit", "Log explicit step descriptions to console as build progresses. (Otherwise writes to file at #{@build_dir}/pre/config-explainer.adoc .)") do |n|
opts.on("--explain", "Log explicit step descriptions to console as build progresses. (Otherwise writes to file at #{@build_dir}/pre/config-explainer.adoc .)") do |n|
explainer_init("STDOUT")
@explainer.level = Logger::INFO
@logger.level = Logger::WARN # Suppress all those INFO-level messages
@@ -1130,21 +1172,27 @@ def regexreplace input, regex, replacement=''
@jekyll_serve = true
end

opts.on("--var KEY=VALUE", "For passing variables directly to the 'vars.' scope template via command line, for non-config builds only.") do |n|
pair = {}
k,v = n.split('=')
pair[k] = v
@passed_vars.merge!pair
opts.on("--search-index-push", "Runs any search indexing configured in the build step and pushes to Algolia.") do
@search_index = true
end

opts.on("--search-index-dry", "Runs any search indexing configured in the build step but does NOT push to Algolia.") do
@search_index = true
@search_index_dry = "--dry-run"
end

opts.on("--search-api-key=STRING", "Passes Algolia Admin API key (which you should keep out of Git).") do |n|
@search_api_key = n
end

opts.on("-x", "--cvar KEY=VALUE", "For sending variables to the 'vars.' scope of the config file and triggering Liquid parsing of config.") do |n|
opts.on("-v", "--var KEY=VALUE", "For passing variables directly to the 'vars.' scope of a template; for dynamic configs, too.") do |n|
pair = {}
k,v = n.split('=')
pair[k] = v
@passed_configvars.merge!pair
@passed_vars.merge!pair
end

opts.on("--parse-config", "Preprocess the designated configuration file as a Liquid template. Superfluous when passing -x/--cvar arguments.") do
opts.on("--parse-config", "Preprocess the designated configuration file as a Liquid template. Superfluous when passing -v/--var arguments.") do
@parseconfig = true
end

@@ -1176,5 +1224,5 @@
end
else
@logger.debug "Executing... config_build"
config_build(@config_file, @passed_configvars, @parseconfig)
config_build(@config_file, @passed_vars, @parseconfig)
end
2 changes: 1 addition & 1 deletion lib/liquidoc/version.rb
@@ -1,3 +1,3 @@
module Liquidoc
VERSION = "0.9.0"
VERSION = "0.9.1"
end
