
Module: Sprockets::Loader

Relationships & Source Files
Defined in: lib/sprockets/loader.rb

Overview

The loader phase takes an asset URI and returns a constructed Asset object.

Constant Summary

ProcessorUtils - Included

VALID_METADATA_COMPOUND_TYPES, VALID_METADATA_COMPOUND_TYPES_HASH, VALID_METADATA_TYPES, VALID_METADATA_VALUE_TYPES, VALID_METADATA_VALUE_TYPES_HASH

PathUtils - Included

SEPARATOR_PATTERN

DigestUtils - Included

ADD_VALUE_TO_DIGEST, DIGEST_SIZES, HASH_ALGORITHMS

Utils - Included

MODULE_INCLUDE_MUTEX, WHITESPACE_ORDINALS

Instance Method Summary

Mime - Included

#mime_exts

Internal: Mapping of MIME extension Strings to MIME type Strings.

#mime_type_charset_detecter

Internal: Get the detector function for a MIME type.

#mime_types

Public: Mapping of MIME type Strings to properties Hash.

#read_file

Public: Read file on disk with MIME type specific encoding.

#register_mime_type

Public: Register a new mime type.

Processing - Included

#bundle_processors

Bundle Processors are run on concatenated assets rather than individual files.

#pipelines,
#postprocessors

Postprocessors are run after Preprocessors and Engine processors.

#preprocessors

Preprocessors are run before Postprocessors and Engine processors.

#processors
#register_bundle_metadata_reducer

Public: Register bundle metadata reducer function.

#register_bundle_processor

Registers a new Bundle Processor klass for mime_type.

#register_pipeline

Registers a pipeline that will be called by call_processor method.

#register_postprocessor

Registers a new Postprocessor klass for mime_type.

#register_preprocessor

Registers a new Preprocessor klass for mime_type.

#register_processor
#unregister_bundle_processor

Remove Bundle Processor klass for mime_type.

#unregister_postprocessor

Remove Postprocessor klass for mime_type.

#unregister_preprocessor

Remove Preprocessor klass for mime_type.

#unregister_processor
#register_config_processor, #unregister_config_processor

Resolve - Included

#resolve

Public: Find the Asset URI for a given logical path by searching the environment’s load paths.

#resolve!

Public: Same as resolve() but raises a FileNotFound exception instead of returning nil if no assets are found.

PathDependencyUtils - Included

#entries_with_dependencies

Internal: List directory entries and return a set of dependencies that would invalidate the cached result.

#stat_directory_with_dependencies

Internal: List directory filenames and associated Stats under a directory.

#stat_sorted_tree_with_dependencies

Internal: List directory filenames and associated Stats under an entire directory tree.

Transformers - Included

#compose_transformers

Internal: Compose multiple transformer steps into a single processor function.

#expand_transform_accepts

Internal: Expand accept type list to include possible transformed types.

#register_transformer

Public: Register a transformer from and to a mime type.

#register_transformer_suffix

Internal: Register transformer for existing type adding a suffix.

#resolve_transform_type

Internal: Resolve target mime type that the source type should be transformed to.

#transformers

Public: Two level mapping of a source mime type to a target mime type.

#compose_transformer_list, #compute_transformers!

HTTPUtils - Included

#find_best_mime_type_match

Internal: Find the best qvalue match from an Array of available mime type options.

#find_best_q_match

Internal: Find the best qvalue match from an Array of available options.

#find_mime_type_matches

Internal: Find all qvalue matches from an Array of available mime type options.

#find_q_matches

Internal: Find all qvalue matches from an Array of available options.

#match_mime_type?

Public: Test mime type against mime range.

#match_mime_type_keys

Public: Return values from Hash where the key matches the mime type.

#parse_q_values

Internal: Parse Accept header quality values.
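The quality-value parsing described for this module can be sketched in plain Ruby. This is a simplified stand-in for `parse_q_values`, not Sprockets' actual implementation; the real method may handle more Accept-header edge cases:

```ruby
# Sketch: parse an HTTP Accept-style header into [value, quality] pairs.
# A hypothetical simplification of HTTPUtils#parse_q_values.
def parse_q_values(header)
  header.to_s.split(/\s*,\s*/).map do |part|
    value, params = part.split(/\s*;\s*/, 2)
    q = 1.0
    q = $1.to_f if params && params =~ /q=([\d.]+)/ # default quality is 1.0
    [value, q]
  end
end

pairs = parse_q_values("application/javascript, text/css;q=0.8")
# pairs == [["application/javascript", 1.0], ["text/css", 0.8]]
```

Helpers such as `find_best_q_match` then pick the available option with the highest matching quality value.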

Utils - Included

#concat_javascript_sources

Internal: Accumulate asset source to buffer and append a trailing semicolon if necessary.

#dfs

Internal: Post-order Depth-First search algorithm.

#dfs_paths

Internal: Post-order Depth-First search algorithm that gathers all paths along the way.
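A post-order depth-first search means children are visited before their parents, which is the order a bundler needs (dependencies first). The sketch below follows the spirit of `Utils#dfs` under assumed simplifications (an Array result, a block that returns each node's children); it is not the exact Sprockets code:

```ruby
require "set"

# Sketch of a post-order depth-first search: start from the initial
# node(s), ask the block for each node's children, and return visited
# nodes with children listed before their parents.
def dfs(initial)
  nodes = []
  seen  = Set.new
  stack = Array(initial).reverse

  while node = stack.pop
    if seen.include?(node)
      # Second visit: all children are done, so emit the node.
      nodes << node unless nodes.include?(node)
    else
      seen.add(node)
      stack.push(node)                      # revisit after children
      stack.concat(Array(yield node).reverse)
    end
  end
  nodes
end

graph = { "a" => ["b", "c"], "b" => ["c"], "c" => [] }
order = dfs("a") { |n| graph[n] }
# order == ["c", "b", "a"]
```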

#duplicable?

Internal: Check if object can safely be .dup’d.

#hash_reassoc

Internal: Duplicate and store key/value on new frozen hash.

#hash_reassoc1

Internal: Duplicate and store key/value on new frozen hash.

#module_include

Internal: Inject into target module for the duration of the block.

#string_end_with_semicolon?

Internal: Check if string has a trailing semicolon.
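The interplay of the last two helpers can be illustrated with a simplified sketch: when concatenating JavaScript sources, a missing trailing semicolon could merge two statements, so one is appended when absent. This is an assumed simplification; the real `concat_javascript_sources` scans bytes for speed:

```ruby
# Sketch of the trailing-semicolon logic behind
# #concat_javascript_sources and #string_end_with_semicolon?.
def string_end_with_semicolon?(str)
  # Trailing whitespace/newlines after the semicolon are allowed.
  !!(str =~ /;\s*\z/)
end

def concat_javascript_sources(buf, source)
  return buf if source.empty?
  buf << source
  buf << ";\n" unless string_end_with_semicolon?(buf)
  buf
end

bundle = +""
concat_javascript_sources(bundle, "var a = 1")
concat_javascript_sources(bundle, "var b = 2;\n")
# bundle == "var a = 1;\nvar b = 2;\n"
```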

DigestUtils - Included

#already_digested?

Internal: Checks an asset name for a valid digest.

#detect_digest_class

Internal: Detect digest class hash algorithm for digest bytes.

#digest

Internal: Generate a hexdigest for a nested JSON serializable object.

#digest_class

Internal: Default digest class.

#hexdigest

Internal: Generate a hexdigest for a nested JSON serializable object.

#hexdigest_integrity_uri

Public: Generate hash for use in the integrity attribute of an asset tag as per the subresource integrity specification.

#integrity_uri

Public: Generate hash for use in the integrity attribute of an asset tag as per the subresource integrity specification.

#pack_base64digest

Internal: Pack a binary digest to a base64 encoded string.

#pack_hexdigest

Internal: Pack a binary digest to a hex encoded string.

#pack_urlsafe_base64digest

Internal: Pack a binary digest to a urlsafe base64 encoded string.

#unpack_hexdigest

Internal: Unpack a hex encoded digest string into binary bytes.
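The hex pack/unpack pair maps binary digest bytes to their printable form and back. A minimal sketch using Ruby's built-in `pack`/`unpack1` (the real methods may differ in validation):

```ruby
# Sketch of #pack_hexdigest / #unpack_hexdigest using Array#pack and
# String#unpack1 with the "H*" (high-nibble-first hex) directive.
def pack_hexdigest(bin)
  bin.unpack1("H*")
end

def unpack_hexdigest(hex)
  [hex].pack("H*")
end

hex = pack_hexdigest("\x01\xab".b)
# hex == "01ab"
unpack_hexdigest(hex) # round-trips back to the original bytes
```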

#build_digest

PathUtils - Included

#absolute_path?

On Windows, ALT_SEPARATOR is \. Delegates to Pathname since the logic gets complex.

#atomic_write

Public: Write to a file atomically.

#directory?

Public: Like File.directory?.

#entries

Public: A version of Dir.entries that filters out . files and ~ swap files.

#file?

Public: Like File.file?.

#find_matching_path_for_extensions

Internal: Match paths in a directory against available extensions.

#find_upwards

Internal: Find target basename checking upwards from path.

#join

Public: Joins path to base path.

#match_path_extname

Internal: Match path extnames against available extensions.

#path_extnames

Internal: Get path’s extensions.

#path_parents

Internal: Returns all parents for path.
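Walking a path upward yields the chain of parent directories, optionally stopping at a root. The following is a hypothetical POSIX-only sketch of `path_parents` (the real method also handles Windows separators):

```ruby
# Sketch of #path_parents: collect each parent directory of path,
# stopping at the filesystem root or at an optional `root` argument.
def path_parents(path, root = nil)
  parents = []
  loop do
    parent = File.dirname(path)
    break if parent == path || path == root
    parents << parent
    path = parent
  end
  parents
end

path_parents("/app/assets/js", "/app")
# => ["/app/assets", "/app"]
```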

#paths_split

Internal: Detect root path and base for file in a set of paths.

#relative_path?

Public: Check if path is explicitly relative.

#relative_path_from

Public: Get relative path from start to dest.

#set_pipeline

Public: Sets pipeline for path.

#split_subpath

Internal: Get relative path for root path and subpath.

#stat

Public: Like File.stat.

#stat_directory

Public: Stat all the files under a directory.

#stat_sorted_tree

Public: Recursively stat all the files under a directory in alphabetical order.

#stat_tree

Public: Recursively stat all the files under a directory.

ProcessorUtils - Included

#call_processor

Public: Invoke processor.

#call_processors

Public: Invoke list of processors in right to left order.

#compose_processors

Public: Compose processors in right to left order.

#processor_cache_key

Internal: Get processor defined cached key.

#processors_cache_keys

Internal: Get combined cache keys for set of processors.

#validate_processor_result!

Internal: Validate returned result of calling a processor pipeline and raise a friendly user error message.
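The right-to-left composition these methods describe can be sketched with plain lambdas. This is an assumed simplification: real Sprockets processors receive a richer input hash (:data, :metadata, :environment, …) and merge metadata, while here each processor just returns a hash of updates or nil:

```ruby
# Sketch of the pipeline behind #call_processors: processors run right
# to left; a nil return leaves the input unchanged, a hash is merged in.
def call_processors(processors, input)
  processors.reverse_each do |processor|
    result = processor.call(input)
    input = input.merge(result) if result
  end
  input
end

upcase  = ->(input) { { data: input[:data].upcase } }
exclaim = ->(input) { { data: input[:data] + "!" } }

# Right-to-left: exclaim runs first, then upcase.
call_processors([upcase, exclaim], data: "hello")
# => { data: "HELLO!" }
```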

URIUtils - Included

#build_asset_uri

Internal: Build Asset URI.

#build_file_digest_uri

Internal: Build file-digest dependency URI.

#encode_uri_query_params

Internal: Serialize hash of params into query string.

#join_file_uri

Internal: Join file: URI component parts into String.

#join_uri

Internal: Join URI component parts into String.

#parse_asset_uri

Internal: Parse Asset URI.

#parse_file_digest_uri

Internal: Parse file-digest dependency URI.

#parse_uri_query_params

Internal: Parse query string into hash of params.

#split_file_uri

Internal: Parse file: URI into component parts.

#split_uri

Internal: Parse URI into component parts.

#valid_asset_uri?

Internal: Check if String is a valid Asset URI.
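The query-param helpers round-trip a params hash through a query string such as the `?type=application/javascript` seen in asset URIs. A minimal sketch under assumed simplifications (the real methods also support valueless boolean params like `?id` and custom escaping):

```ruby
require "uri"

# Sketch of #encode_uri_query_params / #parse_uri_query_params using
# Ruby's www-form component escaping.
def encode_uri_query_params(params)
  params.map { |k, v| "#{k}=#{URI.encode_www_form_component(v.to_s)}" }.join("&")
end

def parse_uri_query_params(query)
  query.split("&").each_with_object({}) do |pair, h|
    k, v = pair.split("=", 2)
    h[k.to_sym] = URI.decode_www_form_component(v)
  end
end

q = encode_uri_query_params(type: "application/javascript", pipeline: "self")
# q == "type=application%2Fjavascript&pipeline=self"
parse_uri_query_params(q)[:type]
# => "application/javascript"
```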

Instance Method Details

#asset_from_cache(key) (private)

Internal: Load an asset hash from the cache.

key - A String containing lookup information for an asset.

This method converts all “compressed” paths back to absolute paths. Returns a hash of values representing an asset.

# File 'lib/sprockets/loader.rb', line 111

def asset_from_cache(key)
  asset = cache.get(key, true)
  if asset
    asset[:uri]       = expand_from_root(asset[:uri])
    asset[:load_path] = expand_from_root(asset[:load_path])
    asset[:filename]  = expand_from_root(asset[:filename])
    expand_key_from_hash(asset[:metadata], :included)
    expand_key_from_hash(asset[:metadata], :links)
    expand_key_from_hash(asset[:metadata], :stubbed)
    expand_key_from_hash(asset[:metadata], :required)
    expand_key_from_hash(asset[:metadata], :to_load)
    expand_key_from_hash(asset[:metadata], :to_link)
    expand_key_from_hash(asset[:metadata], :dependencies) { |uri| uri.start_with?("file-digest://") }

    asset[:metadata].each_key do |k|
      next unless k.match?(/_dependencies\z/) # rubocop:disable Performance/EndWith
      expand_key_from_hash(asset[:metadata], k)
    end
  end
  asset
end

#compress_key_from_hash(hash, key) (private)

# File 'lib/sprockets/loader.rb', line 67

def compress_key_from_hash(hash, key)
  return unless hash.key?(key)
  value = hash[key].dup
  return if !value

  if block_given?
    value.map! do |x|
      if yield x
       compress_from_root(x)
      else
       x
      end
    end
  else
    value.map! { |x| compress_from_root(x) }
  end
  hash[key] = value
end

#expand_key_from_hash(hash, key) (private)

# File 'lib/sprockets/loader.rb', line 87

def expand_key_from_hash(hash, key)
  return unless hash.key?(key)
  value = hash[key].dup
  return if !value
  if block_given?
    value.map! do |x|
      if yield x
        expand_from_root(x)
      else
        x
      end
    end
  else
    value.map! { |x| expand_from_root(x) }
  end
  hash[key] = value
end

#fetch_asset_from_dependency_cache(unloaded, limit = 3) (private)

Internal: Retrieves an asset based on its digest

unloaded - An UnloadedAsset
limit    - An Integer which sets the maximum number of versions of “histories”
           stored in the cache

This method attempts to retrieve the last limit number of “histories” of an asset from the cache. A “history” is an array of unresolved “dependencies” that the asset needs to compile. A dependency can refer either to another asset, e.g. index.js may rely on jquery.js (so jquery.js is a dependency), or to other factors that may affect compilation, such as the VERSION of ::Sprockets (i.e. the environment) and which “processors” are used.

For example, a history array may look something like this:

[["environment-version", "environment-paths", "processors:type=text/css&file_type=text/css",
  "file-digest:///Full/path/app/assets/stylesheets/application.css",
  "processors:type=text/css&file_type=text/css&pipeline=self",
  "file-digest:///Full/path/app/assets/stylesheets"]]

Where the first entry is a Set of dependencies for the last generated version of that asset. Multiple versions are stored since ::Sprockets keeps the last limit number of assets generated present in the system.

If a “history” of dependencies is present in the cache, each version of “history” will be yielded to the passed block which is responsible for loading the asset. If found, the existing history will be saved with the dependency that found a valid asset moved to the front.

If no history is present, or if none of the histories could be resolved to a valid asset, then the block is yielded to and expected to return a valid asset. When this happens the dependencies for the returned asset are added to the “history”, and older entries are removed if the “history” is above limit.
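The lookup-and-rotate behavior described above can be sketched with the cache as a plain Hash and the yielded block as a checker. This is a hypothetical simplification of `fetch_asset_from_dependency_cache`, not the real method (which also compresses and expands file-digest paths):

```ruby
# Sketch: try each cached dependency history; on a hit, move that
# history to the front. On a miss, load fresh (deps == nil) and push
# the new dependency set, trimming to `limit` entries.
def fetch_with_history(cache, key, limit = 3)
  history = cache[key] || []
  history.each_with_index do |deps, index|
    if asset = yield(deps)
      cache[key] = history.rotate!(index) if index > 0
      return asset
    end
  end

  asset = yield(nil)                                   # "load from disk"
  cache[key] = history.unshift(asset[:dependencies]).take(limit)
  asset
end

cache = { "app.js" => [["old-deps"], ["current-deps"]] }
asset = fetch_with_history(cache, "app.js") do |deps|
  if deps == ["current-deps"]
    { dependencies: deps }          # second history entry resolves
  elsif deps.nil?
    { dependencies: ["fresh-deps"] }
  end
end
# The matching history entry is moved to the front:
# cache["app.js"] == [["current-deps"], ["old-deps"]]
```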

# File 'lib/sprockets/loader.rb', line 325

def fetch_asset_from_dependency_cache(unloaded, limit = 3)
  key = unloaded.dependency_history_key

  history = cache.get(key) || []
  history.each_with_index do |deps, index|
    expanded_deps = deps.map do |path|
      path.start_with?("file-digest://") ? expand_from_root(path) : path
    end
    if asset = yield(expanded_deps)
      cache.set(key, history.rotate!(index)) if index > 0
      return asset
    end
  end

  asset = yield
  deps  = asset[:metadata][:dependencies].dup.map! do |uri|
    uri.start_with?("file-digest://") ? compress_from_root(uri) : uri
  end
  cache.set(key, history.unshift(deps).take(limit))
  asset
end

#load(uri)

Public: Load Asset by Asset URI.

uri - A String containing the complete URI to a file, including scheme
      and full path, such as:
      "file:///Path/app/assets/js/app.js?type=application/javascript"

Returns Asset.

# File 'lib/sprockets/loader.rb', line 31

def load(uri)
  unloaded = UnloadedAsset.new(uri, self)
  if unloaded.params.key?(:id)
    unless asset = asset_from_cache(unloaded.asset_key)
      id = unloaded.params.delete(:id)
      uri_without_id = build_asset_uri(unloaded.filename, unloaded.params)
      asset = load_from_unloaded(UnloadedAsset.new(uri_without_id, self))
      if asset[:id] != id
        @logger.warn "Sprockets load error: Tried to find #{uri}, but latest was id #{asset[:id]}"
      end
    end
  else
    asset = fetch_asset_from_dependency_cache(unloaded) do |paths|
      # When asset is previously generated, its "dependencies" are stored in the cache.
      # The presence of `paths` indicates dependencies were stored.
      # We can check to see if the dependencies have not changed by "resolving" them and
      # generating a digest key from the resolved entries. If this digest key has not
      # changed, the asset will be pulled from cache.
      #
      # If this `paths` is present but the cache returns nothing then `fetch_asset_from_dependency_cache`
      # will confusingly be called again with `paths` set to nil where the asset will be
      # loaded from disk.
      if paths
        digest = DigestUtils.digest(resolve_dependencies(paths))
        if uri_from_cache = cache.get(unloaded.digest_key(digest), true)
          asset_from_cache(UnloadedAsset.new(uri_from_cache, self).asset_key)
        end
      else
        load_from_unloaded(unloaded)
      end
    end
  end
  Asset.new(asset)
end

#load_from_unloaded(unloaded) (private)

Internal: Loads an asset and saves it to the cache.

unloaded - An UnloadedAsset

This method is only called when the given unloaded asset could not be successfully pulled from cache.

# File 'lib/sprockets/loader.rb', line 139

def load_from_unloaded(unloaded)
  unless file?(unloaded.filename)
    raise FileNotFound, "could not find file: #{unloaded.filename}"
  end

  path_to_split =
    if index_alias = unloaded.params[:index_alias]
      expand_from_root index_alias
    else
      unloaded.filename
    end

  load_path, logical_path = paths_split(config[:paths], path_to_split)

  unless load_path
    target = path_to_split
    target += " (index alias of #{unloaded.filename})" if unloaded.params[:index_alias]
    raise FileOutsidePaths, "#{target} is no longer under a load path: #{self.paths.join(', ')}"
  end

  extname, file_type = match_path_extname(logical_path, mime_exts)
  logical_path = logical_path.chomp(extname)
  name = logical_path

  if pipeline = unloaded.params[:pipeline]
    logical_path += ".#{pipeline}"
  end

  if type = unloaded.params[:type]
    extensions = config[:mime_types][type][:extensions]
    extension = extensions.include?(extname) ? extname : extensions.first
    logical_path += extension
  end

  if type != file_type && !config[:transformers][file_type][type]
    raise ConversionError, "could not convert #{file_type.inspect} to #{type.inspect}"
  end

  processors = processors_for(type, file_type, pipeline)

  processors_dep_uri = build_processors_uri(type, file_type, pipeline)
  dependencies = config[:dependencies] + [processors_dep_uri]

  # Read into memory and process if there's a processor pipeline
  if processors.any?
    result = call_processors(processors, {
      environment: self,
      cache: self.cache,
      uri: unloaded.uri,
      filename: unloaded.filename,
      load_path: load_path,
      name: name,
      content_type: type,
      metadata: {
        dependencies: dependencies
      }
    })
    validate_processor_result!(result)
    source = result.delete(:data)
    metadata = result
    metadata[:charset] = source.encoding.name.downcase unless metadata.key?(:charset)
    metadata[:digest]  = digest(source)
    metadata[:length]  = source.bytesize
    metadata[:environment_version] = version
  else
    dependencies << build_file_digest_uri(unloaded.filename)
    metadata = {
      digest: file_digest(unloaded.filename),
      length: self.stat(unloaded.filename).size,
      dependencies: dependencies,
      environment_version: version,
    }
  end

  asset = {
    uri: unloaded.uri,
    load_path: load_path,
    filename: unloaded.filename,
    name: name,
    logical_path: logical_path,
    content_type: type,
    source: source,
    metadata: metadata,
    dependencies_digest: DigestUtils.digest(resolve_dependencies(metadata[:dependencies]))
  }

  asset[:id]  = hexdigest(asset)
  asset[:uri] = build_asset_uri(unloaded.filename, unloaded.params.merge(id: asset[:id]))

  store_asset(asset, unloaded)
  asset
end

#resolve_dependencies(uris) (private)

Internal: Resolve set of dependency URIs.

uris - An Array of “dependencies” for example:

["environment-version", "environment-paths", "processors:type=text/css&file_type=text/css",
   "file-digest:///Full/path/app/assets/stylesheets/application.css",
   "processors:type=text/css&file_type=text/css&pipeline=self",
   "file-digest:///Full/path/app/assets/stylesheets"]

Returns an array of things that the given URI depends on, for example the environment version. If you’re using a different version of Sprockets, the dependencies will differ; this is used only for generating cache keys. For example, “environment-version” may be resolved to “environment-1.0-3.2.0” for version “3.2.0” of Sprockets.

Any paths that are returned are converted to relative paths

Returns array of resolved dependencies

# File 'lib/sprockets/loader.rb', line 289

def resolve_dependencies(uris)
  uris.map { |uri| resolve_dependency(uri) }
end

#store_asset(asset, unloaded) (private)

Internal: Save a given asset to the cache

asset    - A Hash containing values of the loaded asset
unloaded - The UnloadedAsset used to look up the asset

This method converts all absolute paths to “compressed” paths, which are relative if they’re under the root.

# File 'lib/sprockets/loader.rb', line 239

def store_asset(asset, unloaded)
  # Save the asset in the cache under the new URI
  cached_asset             = asset.dup
  cached_asset[:uri]       = compress_from_root(asset[:uri])
  cached_asset[:filename]  = compress_from_root(asset[:filename])
  cached_asset[:load_path] = compress_from_root(asset[:load_path])

  if cached_asset[:metadata]
    # Deep dup to avoid modifying `asset`
    cached_asset[:metadata] = cached_asset[:metadata].dup
    compress_key_from_hash(cached_asset[:metadata], :included)
    compress_key_from_hash(cached_asset[:metadata], :links)
    compress_key_from_hash(cached_asset[:metadata], :stubbed)
    compress_key_from_hash(cached_asset[:metadata], :required)
    compress_key_from_hash(cached_asset[:metadata], :to_load)
    compress_key_from_hash(cached_asset[:metadata], :to_link)
    compress_key_from_hash(cached_asset[:metadata], :dependencies) { |uri| uri.start_with?("file-digest://") }

    cached_asset[:metadata].each do |key, value|
      next unless key.match?(/_dependencies\z/) # rubocop:disable Performance/EndWith
      compress_key_from_hash(cached_asset[:metadata], key)
    end
  end

  # Unloaded asset and stored_asset now have a different URI
  stored_asset = UnloadedAsset.new(asset[:uri], self)
  cache.set(stored_asset.asset_key, cached_asset, true)

  # Save the new relative path for the digest key of the unloaded asset
  cache.set(unloaded.digest_key(asset[:dependencies_digest]), stored_asset.compressed_path, true)
end