Result:
Found more than 672 distributions; the search was limited to the first 2001 files matching your query.


IO-Compress-Zstd


lib/IO/Compress/Zstd.pm  view on Meta::CPAN


Here are a few examples that show the capabilities of the module.

=head3 Streaming

This very simple command line example demonstrates the streaming capabilities of the module.
The code reads data from STDIN, compresses it, and writes the compressed data to STDOUT.

    $ echo hello world | perl -MIO::Compress::Zstd=zstd -e 'zstd \*STDIN => \*STDOUT' >output.zst

The special filename "-" can be used as a stand-in for both C<\*STDIN> and C<\*STDOUT>,

lib/IO/Compress/Zstd.pm  view on Meta::CPAN


=head2 Examples

=head3 Streaming

This very simple command line example demonstrates the streaming capabilities
of the module. The code reads data from STDIN or all the files given on the
command line, compresses it, and writes the compressed data to STDOUT.

    use strict ;
    use warnings ;

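The excerpt above is cut off by the search snippet. A minimal sketch of what such a filter could look like, assuming only the documented C<zstd>/C<$ZstdError> exports and the C<"-"> stand-in mentioned earlier (the C<@ARGV> handling is illustrative, not taken from the module's own example):

    use strict;
    use warnings;
    use IO::Compress::Zstd qw(zstd $ZstdError);

    # Compress each file named on the command line, or STDIN if none were
    # given, writing all compressed output to STDOUT ("-" stands in for
    # STDIN/STDOUT).
    my @inputs = @ARGV ? @ARGV : ('-');
    for my $in (@inputs) {
        zstd $in => '-'
            or die "zstd failed for '$in': $ZstdError\n";
    }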

IO-FDPass


FDPass.pm  view on Meta::CPAN

   $fd >= 0 or die "recv failed: $!";

=head1 DESCRIPTION

This small low-level module only has one purpose: pass a file descriptor
to another process, using a (streaming) unix domain socket (on POSIX
systems) or any (streaming) socket (on WIN32 systems). The ability to pass
file descriptors on Windows is currently the unique selling point of this
module. Have I mentioned that it is really small, too?
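
As a rough illustration of that one purpose, the send/recv pair from the synopsis fragment above can be wired up over a socketpair roughly like this (the fork layout and the descriptor being passed are illustrative assumptions, not part of the module's documentation):

   use strict;
   use warnings;
   use Socket qw(AF_UNIX SOCK_STREAM PF_UNSPEC);
   use IO::FDPass;

   socketpair my $left, my $right, AF_UNIX, SOCK_STREAM, PF_UNSPEC
      or die "socketpair: $!";

   my $pid = fork // die "fork: $!";
   if ($pid == 0) {
      # Child: pass its STDOUT descriptor across the streaming socket.
      IO::FDPass::send fileno $right, fileno STDOUT
         or die "send failed: $!";
      exit 0;
   }

   # Parent: receive the descriptor and reopen it as a filehandle.
   my $fd = IO::FDPass::recv fileno $left;
   $fd >= 0 or die "recv failed: $!";
   open my $fh, '>&=', $fd or die "fdopen: $!";
   print {$fh} "written via the passed descriptor\n";
   waitpid $pid, 0;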

=head1 FUNCTIONS


IO-Mark


lib/IO/Mark.pm  view on Meta::CPAN

    }

You could buffer the entire image in a file, open the file and pass that
handle to C<get_image_size>. That works but means that we can't compute
the image size until we have the whole image. If instead of an image
file we were dealing with streaming audio the input stream might be
effectively infinite - which would make caching it in a file
inconvenient.

We could rewrite C<get_image_size> to cache whatever data it reads from
the socket. Then we could send that data before sending the remainder of


IO-Socket-Multicast


lib/IO/Socket/Multicast.pm  view on Meta::CPAN


This module requires IO::Interface version 0.94 or higher.

=head2 INTRODUCTION

Multicasting is designed for streaming multimedia applications and for
conferencing systems in which one transmitting machine needs to
distribute data to a large number of clients.

IP addresses in the range 224.0.0.0 to 239.255.255.255 are reserved
for multicasting.  These addresses do not correspond to individual

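As a rough sketch of the group-address model described above (group, port, and payload are arbitrary examples; the constructor arguments and C<mcast_*> method names follow this module's documented interface):

    use strict;
    use warnings;
    use IO::Socket::Multicast;

    # Receiver: bind a local port and join a multicast group.
    my $rx = IO::Socket::Multicast->new(LocalPort => 2000, ReuseAddr => 1)
        or die "receiver socket: $!";
    $rx->mcast_add('226.1.1.2') or die "mcast_add: $!";

    # Sender: transmit one datagram to the same group and port.
    my $tx = IO::Socket::Multicast->new() or die "sender socket: $!";
    $tx->mcast_send("hello, group", '226.1.1.2:2000') or die "mcast_send: $!";

    $rx->recv(my $data, 1024);
    print "received: $data\n";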

IO-Socket-Multicast6


lib/IO/Socket/Multicast6.pm  view on Meta::CPAN

Your operating system must have IPv6 and Multicast support.


=head2 INTRODUCTION

Multicasting is designed for streaming multimedia applications and for
conferencing systems in which one transmitting machine needs to
distribute data to a large number of clients.

IPv4 addresses in the range 224.0.0.0 to 239.255.255.255 are reserved
for multicasting.  IPv6 multicast addresses start with the prefix FF.


ISAL-Crypto


isa-l_crypto/include/aes_gcm.h  view on Meta::CPAN

 * selected by defining the compile time option NT_LDST. The use of this option
 * places the following restriction on the gcm encryption functions:
 *
 * - The plaintext and ciphertext buffers must be aligned on a 16 byte boundary.
 *
 * - When using the streaming API, all partial input buffers must be a multiple
 *   of 16 bytes long except for the last input buffer.
 *
 * - In-place encryption/decryption is not recommended.
 *
 */


Image-ExifTool


lib/Image/ExifTool/Lang/fr.pm  view on Meta::CPAN

        '3GPP Media (.3GP) Release 1 (probably non-existent)' => '3GPP Media (.3GP) Version 1 (probablement inexistante)',
        '3GPP Media (.3GP) Release 2 (probably non-existent)' => '3GPP Media (.3GP) Version 2 (probablement inexistante)',
        '3GPP Media (.3GP) Release 3 (probably non-existent)' => '3GPP Media (.3GP) Version 3 (probablement inexistante)',
        '3GPP Media (.3GP) Release 4' => '3GPP Media (.3GP) Version 4',
        '3GPP Media (.3GP) Release 5' => '3GPP Media (.3GP) Version 5',
        '3GPP Media (.3GP) Release 6 Streaming Servers' => '3GPP Media (.3GP) Version 6 Serveurs de streaming',
        '3GPP Media (.3GP) Release 7 Streaming Servers' => '3GPP Media (.3GP) Version 7 Serveurs de streaming',
        '3GPP Release 6 General Profile' => '3GPP Version 6 Profil général',
        '3GPP2 EZmovie for KDDI 3G cellphones' => '3GPP2 EZmovie pour les téléphones portables 3G de KDDI',
        '3GPP2 Media (.3G2) compliant with 3GPP2 C.S0050-0 V1.0' => '3GPP2 Media (.3G2) conforme à 3GPP2 C.S0050-0 V1.0',
        '3GPP2 Media (.3G2) compliant with 3GPP2 C.S0050-A V1.0.0' => '3GPP2 Media (.3G2) conforme à 3GPP2 C.S0050-A V1.0.0',
        '3GPP2 Media (.3G2) compliant with 3GPP2 C.S0050-B v1.0' => '3GPP2 Media (.3G2) conforme à 3GPP2 C.S0050-B v1.0',


Image-Leptonica


lib/Image/Leptonica/Func/correlscore.pm  view on Meta::CPAN

      pixt = pixCreateTemplate(pix1);
      pixRasterop(pixt, idelx, idely, wt, ht, PIX_SRC, pix2, 0, 0);
      pixRasterop(pixt, 0, 0, wi, hi, PIX_SRC & PIX_DST, pix1, 0, 0);
      pixCountPixels(pixt, &count, tab);
      pixDestroy(&pixt);
  However, here it is done in a streaming fashion, counting as it goes,
  and touching memory exactly once, giving a 3-4x speedup over the
  simple implementation.  This very fast correlation matcher was
  contributed by William Rucklidge.

=head2 pixCorrelationScoreShifted


Image-PNG-Simple


libpng-1.6.17/contrib/gregbook/README  view on Meta::CPAN


Chapters 13, 14 and 15 of "PNG: The Definitive Guide" discuss three free,
cross-platform demo programs that show how to use the libpng reference
library:  rpng, rpng2 and wpng.  rpng and rpng2 are viewers; the first is
a very simple example that shows how a standard file-viewer might use
libpng, while the second is designed to process streaming data and shows
how a web browser might be written.  wpng is a simple command-line program
that reads binary PGM and PPM files (the ``raw'' grayscale and RGB subsets
of PBMPLUS/NetPBM) and converts them to PNG.

The source code for all three demo programs currently compiles under


Interchange-Search-Solr


Changes  view on Meta::CPAN


     All changes done by Stefan Hornburg (Racke).

     [ENHANCEMENTS]

     * Rewrite maintainer_update method to get rid of streaming.
     * Add methods for index handlers (add, delete, commit).
     * Add num_docs method.
     * Add success, as_string and exception_message methods to Response class.

0.14 Fri Sep 21 13:21:50 2018 CEST


Iodef-Pb-Simple


lib/Iodef/Pb.pm  view on Meta::CPAN

               ['NodeRole_category_name', 13],
               ['NodeRole_category_p2p', 14],
               ['NodeRole_category_print', 15],
               ['NodeRole_category_server_internal', 16],
               ['NodeRole_category_server_public', 17],
               ['NodeRole_category_streaming', 18],
               ['NodeRole_category_voice', 19],
               ['NodeRole_category_www', 20],

            ]
        );


JSON-Builder


JSON/Builder.pm  view on Meta::CPAN


=head1 MOTIVATION

Task: to create JSON under memory limitations.

If the JSON contains only one large value, or large values are produced one by one, you can use the streaming generator. Otherwise, you should build a Perl structure in which the large elements are filehandles holding the JSON fragments. When a perl struc...

=head1 DESCRIPTION

=head2 JSON::Builder


JSON-Feed


t/data/daringfireball.json  view on Meta::CPAN

         "url" : "https://daringfireball.net/linked/2019/03/05/nypost-cook-nice",
         "external_url" : "https://nypost.com/2019/03/03/appless-hollywood-venture-marred-by-intrusive-execs/",
         "author" : {
            "name" : "John Gruber"
         },
         "content_html" : "\n<p>Alexandra Steigrad and Nicolas Vega, writing for The New York Post:</p>\n\n<blockquote>\n  <p>Shortly after Apple announced its Hollywood ambitions in 2017,\nTinseltown’s wheeler-dealers were lining up to work with t...
      },
      {
         "title" : "Trump Vows ‘A-Plus Treatment’ for Alabama",
         "date_published" : "2019-03-06T00:09:41Z",
         "date_modified" : "2019-03-06T19:59:09Z",


JSON-Immutable-XS


external/rapidjson/encodings.h  view on Meta::CPAN

    http://tools.ietf.org/html/rfc2781
    \tparam CharType Type for storing 16-bit UTF-16 data. Default is wchar_t. C++11 may use char16_t instead.
    \note implements Encoding concept

    \note For in-memory access, there is no need to worry about endianness. The code units and code points are represented in the CPU's endianness.
    For streaming, use UTF16LE and UTF16BE, which handle endianness.
*/
template<typename CharType = wchar_t>
struct UTF16 {
    typedef CharType Ch;
    RAPIDJSON_STATIC_ASSERT(sizeof(Ch) >= 2);

external/rapidjson/encodings.h  view on Meta::CPAN

/*! http://en.wikipedia.org/wiki/UTF-32
    \tparam CharType Type for storing 32-bit UTF-32 data. Default is unsigned. C++11 may use char32_t instead.
    \note implements Encoding concept

    \note For in-memory access, there is no need to worry about endianness. The code units and code points are represented in the CPU's endianness.
    For streaming, use UTF32LE and UTF32BE, which handle endianness.
*/
template<typename CharType = unsigned>
struct UTF32 {
    typedef CharType Ch;
    RAPIDJSON_STATIC_ASSERT(sizeof(Ch) >= 4);


JSON-SIMD


simdjson.cpp  view on Meta::CPAN

template<size_t STEP_SIZE>
error_code json_structural_indexer::index(const uint8_t *buf, size_t len, dom_parser_implementation &parser, stage1_mode partial) noexcept {
  if (simdjson_unlikely(len > parser.capacity())) { return CAPACITY; }
  // We guard the rest of the code so that we can assume that len > 0 throughout.
  if (len == 0) { return EMPTY; }
  if (is_streaming(partial)) {
    len = trim_partial_utf8(buf, len);
    // If you end up with an empty window after trimming
    // the partial UTF-8 bytes, then chances are good that you
    // have an UTF-8 formatting error.
    if(len == 0) { return UTF8_ERROR; }

simdjson.cpp  view on Meta::CPAN

  // Write out the final iteration's structurals
  indexer.write(uint32_t(idx-64), prev_structurals);
  error_code error = scanner.finish();
  // We deliberately break down the next expression so that it is
  // human readable.
  const bool should_we_exit = is_streaming(partial) ?
    ((error != SUCCESS) && (error != UNCLOSED_STRING)) // when partial we tolerate UNCLOSED_STRING
    : (error != SUCCESS); // if partial is false, we must have SUCCESS
  const bool have_unclosed_string = (error == UNCLOSED_STRING);
  if (simdjson_unlikely(should_we_exit)) { return error; }

simdjson.cpp  view on Meta::CPAN

   * ][[ which is invalid.
   *
   * This is illustrated with the test array_iterate_unclosed_error() on the following input:
   * R"({ "a": [,,)"
   **/
  parser.structural_indexes[parser.n_structural_indexes] = uint32_t(len); // used later in partial == stage1_mode::streaming_final
  parser.structural_indexes[parser.n_structural_indexes + 1] = uint32_t(len);
  parser.structural_indexes[parser.n_structural_indexes + 2] = 0;
  parser.next_structural_index = 0;
  // a valid JSON file cannot have zero structural indexes - we should have found something
  if (simdjson_unlikely(parser.n_structural_indexes == 0u)) {
    return EMPTY;
  }
  if (simdjson_unlikely(parser.structural_indexes[parser.n_structural_indexes - 1] > len)) {
    return UNEXPECTED_ERROR;
  }
  if (partial == stage1_mode::streaming_partial) {
    // If we have an unclosed string, then the last structural
    // will be the quote and we want to make sure to omit it.
    if(have_unclosed_string) {
      parser.n_structural_indexes--;
      // a valid JSON file cannot have zero structural indexes - we should have found something

simdjson.cpp  view on Meta::CPAN

        return EMPTY;
      }
    }

    parser.n_structural_indexes = new_structural_indexes;
  } else if (partial == stage1_mode::streaming_final) {
    if(have_unclosed_string) { parser.n_structural_indexes--; }
    // We truncate the input to the end of the last complete document (or zero).
    // Because partial == stage1_mode::streaming_final, it means that we may
    // silently ignore trailing garbage. Though it sounds bad, we do it
    // deliberately because many people who have streams of JSON documents
    // will truncate them for processing. E.g., imagine that you are uncompressing
    // the data from a sizable file or receiving it in chunks from the network. You
    // may not know where exactly the last document will be. Meanwhile the

simdjson.cpp  view on Meta::CPAN


simdjson_warn_unused error_code implementation::minify(const uint8_t *buf, size_t len, uint8_t *dst, size_t &dst_len) const noexcept {
  return arm64::stage1::json_minifier::minify<64>(buf, len, dst, dst_len);
}

simdjson_warn_unused error_code dom_parser_implementation::stage1(const uint8_t *_buf, size_t _len, stage1_mode streaming) noexcept {
  this->buf = _buf;
  this->len = _len;
  return arm64::stage1::json_structural_indexer::index<64>(buf, len, *this, streaming);
}

simdjson_warn_unused bool implementation::validate_utf8(const char *buf, size_t len) const noexcept {
  return arm64::stage1::generic_validate_utf8(buf,len);
}

simdjson.cpp  view on Meta::CPAN


  // 2-byte
  if ((buf[idx] & 0x20) == 0) {
    // missing continuation
    if (simdjson_unlikely(idx+1 > len || !is_continuation(buf[idx+1]))) {
      if (idx+1 > len && is_streaming(partial)) { idx = len; return; }
      error = UTF8_ERROR;
      idx++;
      return;
    }
    // overlong: 1100000_ 10______

simdjson.cpp  view on Meta::CPAN


  // 3-byte
  if ((buf[idx] & 0x10) == 0) {
    // missing continuation
    if (simdjson_unlikely(idx+2 > len || !is_continuation(buf[idx+1]) || !is_continuation(buf[idx+2]))) {
      if (idx+2 > len && is_streaming(partial)) { idx = len; return; }
      error = UTF8_ERROR;
      idx++;
      return;
    }
    // overlong: 11100000 100_____ ________

simdjson.cpp  view on Meta::CPAN

  }

  // 4-byte
  // missing continuation
  if (simdjson_unlikely(idx+3 > len || !is_continuation(buf[idx+1]) || !is_continuation(buf[idx+2]) || !is_continuation(buf[idx+3]))) {
    if (idx+2 > len && is_streaming(partial)) { idx = len; return; }
    error = UTF8_ERROR;
    idx++;
    return;
  }
  // overlong: 11110000 1000____ ________ ________

simdjson.cpp  view on Meta::CPAN

    }
  }
  // We pad beyond.
  // https://github.com/simdjson/simdjson/issues/906
  // See json_structural_indexer.h for an explanation.
  *next_structural_index = len; // assumed later in partial == stage1_mode::streaming_final
  next_structural_index[1] = len;
  next_structural_index[2] = 0;
  parser.n_structural_indexes = uint32_t(next_structural_index - parser.structural_indexes.get());
  if (simdjson_unlikely(parser.n_structural_indexes == 0)) { return EMPTY; }
  parser.next_structural_index = 0;
  if (partial == stage1_mode::streaming_partial) {
    if(unclosed_string) {
      parser.n_structural_indexes--;
      if (simdjson_unlikely(parser.n_structural_indexes == 0)) { return CAPACITY; }
    }
    // We truncate the input to the end of the last complete document (or zero).

simdjson.cpp  view on Meta::CPAN

        parser.n_structural_indexes = 0;
        return EMPTY;
      }
    }
    parser.n_structural_indexes = new_structural_indexes;
  } else if(partial == stage1_mode::streaming_final) {
    if(unclosed_string) { parser.n_structural_indexes--; }
    // We truncate the input to the end of the last complete document (or zero).
    // Because partial == stage1_mode::streaming_final, it means that we may
    // silently ignore trailing garbage. Though it sounds bad, we do it
    // deliberately because many people who have streams of JSON documents
    // will truncate them for processing. E.g., imagine that you are uncompressing
    // the data from a sizable file or receiving it in chunks from the network. You
    // may not know where exactly the last document will be. Meanwhile the

simdjson.cpp  view on Meta::CPAN


simdjson_warn_unused error_code implementation::minify(const uint8_t *buf, size_t len, uint8_t *dst, size_t &dst_len) const noexcept {
  return icelake::stage1::json_minifier::minify<128>(buf, len, dst, dst_len);
}

simdjson_warn_unused error_code dom_parser_implementation::stage1(const uint8_t *_buf, size_t _len, stage1_mode streaming) noexcept {
  this->buf = _buf;
  this->len = _len;
  return icelake::stage1::json_structural_indexer::index<128>(_buf, _len, *this, streaming);
}

simdjson_warn_unused bool implementation::validate_utf8(const char *buf, size_t len) const noexcept {
  return icelake::stage1::generic_validate_utf8(buf,len);
}

simdjson.cpp  view on Meta::CPAN


simdjson_warn_unused error_code implementation::minify(const uint8_t *buf, size_t len, uint8_t *dst, size_t &dst_len) const noexcept {
  return haswell::stage1::json_minifier::minify<128>(buf, len, dst, dst_len);
}

simdjson_warn_unused error_code dom_parser_implementation::stage1(const uint8_t *_buf, size_t _len, stage1_mode streaming) noexcept {
  this->buf = _buf;
  this->len = _len;
  return haswell::stage1::json_structural_indexer::index<128>(_buf, _len, *this, streaming);
}

simdjson_warn_unused bool implementation::validate_utf8(const char *buf, size_t len) const noexcept {
  return haswell::stage1::generic_validate_utf8(buf,len);
}

simdjson.cpp  view on Meta::CPAN


simdjson_warn_unused error_code implementation::minify(const uint8_t *buf, size_t len, uint8_t *dst, size_t &dst_len) const noexcept {
  return ppc64::stage1::json_minifier::minify<64>(buf, len, dst, dst_len);
}

simdjson_warn_unused error_code dom_parser_implementation::stage1(const uint8_t *_buf, size_t _len, stage1_mode streaming) noexcept {
  this->buf = _buf;
  this->len = _len;
  return ppc64::stage1::json_structural_indexer::index<64>(buf, len, *this, streaming);
}

simdjson_warn_unused bool implementation::validate_utf8(const char *buf, size_t len) const noexcept {
  return ppc64::stage1::generic_validate_utf8(buf,len);
}

simdjson.cpp  view on Meta::CPAN


simdjson_warn_unused error_code implementation::minify(const uint8_t *buf, size_t len, uint8_t *dst, size_t &dst_len) const noexcept {
  return westmere::stage1::json_minifier::minify<64>(buf, len, dst, dst_len);
}

simdjson_warn_unused error_code dom_parser_implementation::stage1(const uint8_t *_buf, size_t _len, stage1_mode streaming) noexcept {
  this->buf = _buf;
  this->len = _len;
  return westmere::stage1::json_structural_indexer::index<64>(_buf, _len, *this, streaming);
}

simdjson_warn_unused bool implementation::validate_utf8(const char *buf, size_t len) const noexcept {
  return westmere::stage1::generic_validate_utf8(buf,len);
}


JSON-SL


lib/JSON/SL.pm  view on Meta::CPAN

=head2 DESCRIPTION

JSON::SL was designed from the ground up to be easily accessible and
searchable for partially received streaming content.

It uses an embedded C library (C<jsonsl>) to do the streaming and most
of the dirty work.

JSON::SL allows you to use the
L<JSONPointer|http://tools.ietf.org/html/draft-pbryan-zyp-json-pointer-02>
URI/path syntax to tell it about certain objects and elements which are of


JSON-Streaming-Reader


lib/JSON/Streaming/Reader.pm  view on Meta::CPAN


=head1 NAME

JSON::Streaming::Reader - Read JSON strings in a streaming manner

=cut

package JSON::Streaming::Reader;

lib/JSON/Streaming/Reader.pm  view on Meta::CPAN

popular Perl JSON libraries: objects become hashrefs, arrays become arrayrefs,
strings and integers become scalars, boolean values become references to either
1 or 0, and null becomes undef.

This is useful if there is a part of the tree that you would rather handle
via an in-memory data structure like you'd get from a non-streaming JSON parser.
It allows you to mix-and-match streaming parsing and one-shot parsing
within a single data stream.

Note that errors encountered during skip are actually raised via C<die> rather than
via the return value as with C<get_token>.

lib/JSON/Streaming/Reader.pm  view on Meta::CPAN

C<end_object> or C<end_array>.

=head1 EVENT-BASED API

This module has an experimental event-based API which can be used to
do streaming JSON processing in event-driven applications or those
which do non-blocking I/O.

In event-based mode it is the caller's responsibility to obtain data and
when data is available, provide it to the reader for processing. When
enough data is available to unambiguously represent a complete, atomic token

lib/JSON/Streaming/Reader.pm  view on Meta::CPAN

There are two major classes of token types. Bracketing tokens enclose other tokens
and come in pairs, named with C<start_> and C<end_> prefixes. Leaf tokens stand alone
and have C<add_> prefixes.

For convenience the token type names match the method names used in the "raw" API
of L<JSON::Streaming::Writer>, so it is straightforward to implement a streaming JSON
normalizer by feeding the output from this module into the corresponding methods on that module.
However, this module does have an additional special token type 'error' which is used
to indicate tokenizing errors and does not have a corresponding method on the writer.
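
Building on that correspondence, a pass-through normalizer can be sketched roughly like this (the arrayref token shape and the C<for_stream> constructors follow the two modules' documentation; the error handling is only illustrative):

    use strict;
    use warnings;
    use JSON::Streaming::Reader;
    use JSON::Streaming::Writer;

    my $jsonr = JSON::Streaming::Reader->for_stream(\*STDIN);
    my $jsonw = JSON::Streaming::Writer->for_stream(\*STDOUT);

    while (my $token = $jsonr->get_token) {
        my ($type, @args) = @$token;
        # 'error' is the one token type with no matching writer method.
        die "parse error: @args" if $type eq 'error';
        $jsonw->$type(@args);
    }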

=head2 start_object, end_object


JSON-Streaming-Writer


lib/JSON/Streaming/Writer.pm  view on Meta::CPAN


1;

=head1 NAME

JSON::Streaming::Writer - Generate JSON output in a streaming manner

=head1 SYNOPSIS

    my $jsonw = JSON::Streaming::Writer->for_stream($fh);
    $jsonw->start_object();

lib/JSON/Streaming/Writer.pm  view on Meta::CPAN


This library allows you to generate syntactically-correct JSON without
first assembling your complete data structure in memory. This allows
large structures to be returned without requiring those
structures to be memory-resident, and also allows parts of the output
to be made available to a streaming-capable JSON parser while
the rest of the output is being generated, which may improve
performance of JSON-based network protocols.
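
For example, a large result set might be emitted incrementally along these lines (the data source is a hypothetical iterator; the C<start_>/C<end_>/C<add_> calls are the raw API methods documented below):

    use JSON::Streaming::Writer;

    my $jsonw = JSON::Streaming::Writer->for_stream(\*STDOUT);
    $jsonw->start_object();
    $jsonw->start_property("results");
    $jsonw->start_array();
    while (defined(my $row = next_row())) {    # hypothetical data source
        $jsonw->add_string($row);
    }
    $jsonw->end_array();
    $jsonw->end_property();
    $jsonw->end_object();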

=head1 RAW API

lib/JSON/Streaming/Writer.pm  view on Meta::CPAN


Produces a JSON value representing the given Perl value. This library can handle
Perl strings, integers (i.e. scalars that have most recently been used as numbers),
references to the values 0 and 1 representing booleans and C<undef> representing
a JSON C<null>. It can also accept ARRAY and HASH refs that contain such values
and produce JSON array and object values recursively, much like a non-streaming
JSON producer library would do.

This method is a wrapper around the corresponding raw API calls, so the error
messages it generates will often refer to the underlying raw API.


Jabber-Connection


README  view on Meta::CPAN


The package contains three modules:

- Jabber::Connection

  Handles connectivity, authentication, XML streaming and
  callbacks

- Jabber::NodeFactory

  Enables creation and manipulation of Jabber packets


JavaScript-Duktape-XS


duktape.c  view on Meta::CPAN

 *  and a simple copying one.
 *
 *  Decoding directly from the source string would be another lexing option.
 *  But the lookup window based approach has the advantage of hiding the
 *  source string and its encoding effectively which gives more flexibility
 *  going forward to e.g. support chunked streaming of source from flash.
 *
 *  Decodes UTF-8/CESU-8 leniently with support for code points from U+0000 to
 *  U+10FFFF, causing an error if the input is unparseable.  Leniency means:
 *
 *    * Unicode code point validation is intentionally not performed,

duktape.c  view on Meta::CPAN

 *      byte resulted in a code increase though.
 *
 *    * Is checking against maximum 0x10ffff really useful?  4-byte encoding
 *      imposes a certain limit anyway.
 *
 *    * Support chunked streaming of source code.  Can be implemented either
 *      by streaming chunks of bytes or chunks of codepoints.
 */

#if defined(DUK_USE_LEXER_SLIDING_WINDOW)
DUK_LOCAL void duk__fill_lexer_buffer(duk_lexer_ctx *lex_ctx, duk_small_uint_t start_offset_bytes) {
	duk_lexer_codepoint *cp, *cp_end;


JavaScript-Duktape


lib/JavaScript/Duktape/C/lib/duktape.c  view on Meta::CPAN

 *  and a simple copying one.
 *
 *  Decoding directly from the source string would be another lexing option.
 *  But the lookup window based approach has the advantage of hiding the
 *  source string and its encoding effectively which gives more flexibility
 *  going forward to e.g. support chunked streaming of source from flash.
 *
 *  Decodes UTF-8/CESU-8 leniently with support for code points from U+0000 to
 *  U+10FFFF, causing an error if the input is unparseable.  Leniency means:
 *
 *    * Unicode code point validation is intentionally not performed,

lib/JavaScript/Duktape/C/lib/duktape.c  view on Meta::CPAN

 *      byte resulted in a code increase though.
 *
 *    * Is checking against maximum 0x10ffff really useful?  4-byte encoding
 *      imposes a certain limit anyway.
 *
 *    * Support chunked streaming of source code.  Can be implemented either
 *      by streaming chunks of bytes or chunks of codepoints.
 */

#if defined(DUK_USE_LEXER_SLIDING_WINDOW)
DUK_LOCAL void duk__fill_lexer_buffer(duk_lexer_ctx *lex_ctx, duk_small_uint_t start_offset_bytes) {
	duk_lexer_codepoint *cp, *cp_end;


JavaScript-Embedded


lib/JavaScript/Embedded/C/lib/duktape.c  view on Meta::CPAN

 *  and a simple copying one.
 *
 *  Decoding directly from the source string would be another lexing option.
 *  But the lookup window based approach has the advantage of hiding the
 *  source string and its encoding effectively which gives more flexibility
 *  going forward to e.g. support chunked streaming of source from flash.
 *
 *  Decodes UTF-8/CESU-8 leniently with support for code points from U+0000 to
 *  U+10FFFF, causing an error if the input is unparseable.  Leniency means:
 *
 *    * Unicode code point validation is intentionally not performed,

lib/JavaScript/Embedded/C/lib/duktape.c  view on Meta::CPAN

 *      byte resulted in a code increase though.
 *
 *    * Is checking against maximum 0x10ffff really useful?  4-byte encoding
 *      imposes a certain limit anyway.
 *
 *    * Support chunked streaming of source code.  Can be implemented either
 *      by streaming chunks of bytes or chunks of codepoints.
 */

#if defined(DUK_USE_LEXER_SLIDING_WINDOW)
DUK_LOCAL void duk__fill_lexer_buffer(duk_lexer_ctx *lex_ctx, duk_small_uint_t start_offset_bytes) {
	duk_lexer_codepoint *cp, *cp_end;


Jifty


lib/Jifty/Handler.pm  view on Meta::CPAN

    }

    $static->add( Plack::App::File->new
            ( root => Jifty->config->framework('Web')->{DefaultStaticRoot} )->to_app );

    # The buffering and the unsetting of psgi.streaming are needed to vivify
    # the response returned by the $static cascade app.
    builder {
        enable 'Plack::Middleware::ConditionalGET';
        enable
            sub { my $app = shift;
                  sub { my $env = shift;
                        $env->{'psgi.streaming'} = 0;
                        my $res = $app->($env);
                        # skip streamy response
                        return $res unless ref($res) eq 'ARRAY' && $res->[2];
                        my $h = Plack::Util::headers($res->[1]);
                        $h->set( 'Cache-Control' => 'max-age=31536000, public' );


Kamaitachi


lib/Kamaitachi/Service/StreamAudienceCounter.pm  view on Meta::CPAN


=encoding utf8

=head1 NAME

Kamaitachi::Service::StreamAudienceCounter - service role to count and broadcast streaming audience

=head1 SYNOPSIS

=head1 DESCRIPTION


Kelp


lib/Kelp/Response.pm  view on Meta::CPAN

=head2 partial

Sets partial response. If this attribute is set to a true value, it will cause
C<finalize> to return the HTTP status code and headers, but not the body. This is
convenient if you intend to stream your content. In the following example, we
set C<partial> to 1 and use C<finalize> to get a C<writer> object for streaming.

    sub stream {
        my $self = shift;
        return sub {
            my $responder = shift;

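The excerpt is cut off here; a minimal sketch of how such a handler typically continues, assuming only the C<partial>/C<finalize> behaviour described above plus the standard PSGI writer object (status code and body chunks are illustrative):

    sub stream {
        my $self = shift;
        return sub {
            my $responder = shift;

            # Send only the status code and headers, then stream the body.
            $self->res->code(200);
            $self->res->partial(1);
            my $writer = $responder->( $self->res->finalize );

            $writer->write("chunk one\n");
            $writer->write("chunk two\n");
            $writer->close;
        };
    }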

KiokuDB


lib/KiokuDB/Backend/Serialize/Storable.pm  view on Meta::CPAN

    with qw(KiokuDB::Backend::Serialize::Storable);

=head1 DESCRIPTION

This role provides L<Storable> based serialization of L<KiokuDB::Entry> objects
for a backend, with streaming capabilities.

L<KiokuDB::Backend::Serialize::Delegate> is preferred to using this directly.

=head1 METHODS


LWP-MediaTypes


lib/LWP/media.types  view on Meta::CPAN

application/vnd.oasis.opendocument.text-master		odm
application/vnd.oasis.opendocument.text-template	ott
application/vnd.oasis.opendocument.text-web		oth
# application/vnd.obn
# application/vnd.oipf.contentaccessdownload+xml
# application/vnd.oipf.contentaccessstreaming+xml
# application/vnd.oipf.cspg-hexbinary
# application/vnd.oipf.dae.svg+xml
# application/vnd.oipf.dae.xhtml+xml
# application/vnd.oipf.mippvcontrolmessage+xml
# application/vnd.oipf.pae.gem


LaBrea-Tarpit


Report/examples/localTrojans.pl  view on Meta::CPAN

644	tcp/udp	dwr	dwr
645	tcp/udp	pssc	PSSC
646	tcp/udp	ldp	LDP
647	tcp/udp	dhcp-failover	DHCP Failover
648	tcp/udp	rrp	Registry Registrar Protocol (RRP)
649	tcp/udp	cadview-3d	Cadview-3d - streaming 3d models over the internet
650	tcp/udp	obex	OBEX
651	tcp/udp	ieee-mms	IEEE MMS
652	tcp/udp	hello-port	HELLO_PORT
653	tcp/udp	repscmd	RepCmd
654	tcp/udp	aodv	AODV

Report/examples/localTrojans.pl  view on Meta::CPAN

1750	tcp/udp	sslp	Simple Socket Library's PortMaster
1751	tcp/udp	swiftnet	SwiftNet
1752	tcp/udp	lofr-lm	Leap of Faith Research License Manager
1753	tcp/udp	#	Unassigned
1754	tcp/udp	oracle-em2	oracle-em2
1755	tcp/udp	ms-streaming	ms-streaming
1756	tcp/udp	capfast-lmd	capfast-lmd
1757	tcp/udp	cnhrp	cnhrp
1758	tcp/udp	tftp-mcast	tftp-mcast
1759	tcp/udp	spss-lm	SPSS License Manager
1760	tcp/udp	www-ldap-gw	www-ldap-gw


Lab-Measurement


lib/Lab/Moose/Instrument/ZI_HDAWG.pm  view on Meta::CPAN

Type: Double(D)
Unit: Mbit/s

 get_stats_cmdstream_bandwidth()

Command streaming bandwidth usage on the physical network connection between device and
data server.

=head3 /DEV/STATS/CMDSTREAM/BYTESRECEIVED 
Properties: Read 
Type: Integer (64 bit)(I)

lib/Lab/Moose/Instrument/ZI_HDAWG.pm  view on Meta::CPAN

Type: Double(D)
Unit: Mbit/s

 get_stats_datastream_bandwidth()

Data streaming bandwidth usage on the physical network connection between device and data
server.

=head3 /DEV/STATS/DATASTREAM/BYTESRECEIVED 
Properties: Read 
Type: Integer (64 bit)(I)

lib/Lab/Moose/Instrument/ZI_HDAWG.pm  view on Meta::CPAN

Unit: None

 set_triggers_streams_enable(stream => $stream, value => $value)
 get_triggers_streams_enable(stream => $stream)

Enables trigger streaming.

=head3 /DEV/TRIGGERS/STREAMS/n/HOLDOFFTIME 
Properties: Read Write Setting 
Type: Double(D)
Unit: s
