EntityScript

Draft 1.01: Index

Sections: Status -> Approved

1.) EntityScript™ Overview - Overview of the project

2.) .entity/.ds Lookups - Overview of basic .entity and .ds file relationships, with an overview of each file.

3.) Special File Types - Overview of special types of methods, files, and types in C.ORE™

4.) Log File Types - Overview of Logging within C.ORE™

5.) Using OpenPackager™ - Overview of the OpenPackager™ project and details regarding using the sub-system for organizing the base system and addons.

6.) Locations - Overview of various parts of the IPDVC™ file system

7.) System Audits - Overview of various audit mechanisms present in C.ORE™ and how to use them.

8.) C.ORE™ Scripts - Overview of scripts that run in C.ORE™, the purpose they have, overview of some scripting philosophy, and some usage examples.

9.) Startup - Step-by-step process of startup routines and how to extend them

10.) C.ORE™: AirEP™ file system (Access Identity Ring Entertainment Platform) - Understand more about the Access, Identity, and Ring components of AirEP™

11.) Keys: Each of the built-in keys that can be used to retrieve meta-information in C.ORE™

12.) How-to: Additional Notes

13.) Moderation: Basic moderation principles

14.) Social Pledge: A basic social pledge for your interface and whom you interact with

15.) Disclaimers: Basic recommendations for your interface and dealing with various types of commerce

16.) Additional Resources: Dig deeper into the technical stack, program methods, variables, and other happenings within CORE.HOST™

core_alerts

"""
Copyright (C) 2020 New Entity Operations Inc.

ALL RIGHTS RESERVED

core_alerts is a module to establish alert markers and related behavior.

There are some basic contextual examples included. They can be expanded
and made more complex in operator-defined ways.

Usage: To set up a LOGGER, follow the convention below
# LOGGER
# Where to place: Place the LOGGER at the end of the method or function call
# you would like to log.

# Here is an example. It will only run if the routine reaches the point
# where the LOGGER occurs; it calls core_alerts for the specific LOGGER
# function

get_DOCUMENTS_LOGGER = "[get_DOCUMENTS_LOGGER: "
get_DOCUMENTS_helper = "open file]"
print(
 get_DOCUMENTS_LOGGER+get_DOCUMENTS_helper+
 ALERT_LOGGING_WARNING(
  variable=get_DOCUMENTS_LOGGER+get_DOCUMENTS_helper
 )+"\n"
)

Alerts are scaled according to threat scores and placed into appropriate
buckets. These scores can be built to your specifications.

Anything logged with a score also has a specified logging type; these types
are user-defined, with some preset options available.
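As a minimal sketch of the bucketing idea (the bucket names and the mapping itself are illustrative, not part of core_alerts):

```python
# Illustrative threat-score buckets; the numeric scores mirror the
# THREAT_SCORE_* constants defined later in this module, but the bucket
# names and mapping are hypothetical.
THREAT_BUCKETS = {
    "0": "INFO",     # routine, audit-only entries
    "1": "WARNING",  # worth reviewing on the next audit pass
    "2": "FAILURE",  # should be surfaced to the operator promptly
}

def bucket_for(score):
    """Return the bucket name for a threat score, defaulting to INFO."""
    return THREAT_BUCKETS.get(str(score), "INFO")
```

Keying the table on strings matches the module's habit of storing scores as str(0), str(1), str(2).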

When to use this script: When a method or function call requires auditing later,
log it with this methodology.

Why to use: This can help audit internal processes, such as ad views or
other operator-defined activities that you want to confirm are taking
place.

Extra details: The 'START: ACTION' and 'STOP: ACTION' areas are for
readability. They break large files into sections of context that the
programmer or business team can easily inspect as a unit later on.

Areas of code that are easily tracked in operator-defined ways can help
understand 'system intent'.

That is, the 'START: ACTION' block goes at the beginning of the routine
being triggered. When that routine is fully spent, throw the
'STOP: ACTION' alert.

'STOP: ACTION' blocks can appear in tandem with other 'STOP: ACTION'
blocks, but there can be only one 'START: BLOCK' containing a
'START: ACTION' for the first acting script; its scope ends only at the
next 'START: BLOCK'.

To use an Action, put a similar convention into your code, but replace
anything contextual so it can be identified in the logs.

      EventLolli.update_ad_special__LOGGER = \
       "[Audited: EventLolli.update_ad_special: "
      EventLolli.update_ad_special__helper = \
      "check values] "
      EventLolli.update_ad_special__values = \
       "[UNIT: ms - "+STRING_INT_seconds_adjusted+"] "
      print(EventLolli.update_ad_special__LOGGER+\
       EventLolli.update_ad_special__helper+\
        ACTION_AUDITED(variable=EventLolli.update_ad_special__LOGGER+\
         EventLolli.update_ad_special__helper+\
          EventLolli.update_ad_special__values)+\
           "\n")
      print(
       "---------- STOP: ACTION -" \
       " [Audited: EventLolli.update_ad_special: check values] ----------"
      )
      #*******************************************************************#
      # STOP: ACTION -                                                    #
      # [Audited: EventLolli.update_ad_special: check values] ----------  #
      #*******************************************************************#
"""
## Allowed Hash Imports: The standard for the basic install is sha1
# sha1 isn't compliant in certain situations, so you can upgrade
# to a stronger digest according to your needs at any time
from hashlib import sha1
## Imports: Custom
from core_middlelayer import (
 ENTITYSLUG, DATASHEETFOLDER, EXTDS, EXTENTITY,
 DIRLOG, LOG,
 DASH_UTIL,
 splat_fake_output
)

## Imports: Log makers
from core_interface import (
 TimeStamp as TS,
)
def LogMaker_date():
 return(TS.date_stamp_day())
def LogMaker_time():
 return(TS.time_stamp_time())
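If core_interface is unavailable, the two helpers can be approximated with the standard library; the exact formats produced by TimeStamp.date_stamp_day() and TimeStamp.time_stamp_time() are assumptions here:

```python
from datetime import datetime

def approx_date_stamp():
    # Calendar-day stamp; the real format comes from TimeStamp.date_stamp_day()
    return datetime.now().strftime("%Y-%m-%d")

def approx_time_stamp():
    # Wall-clock stamp; the real format comes from TimeStamp.time_stamp_time()
    return datetime.now().strftime("%H:%M:%S")
```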

## Alert Options
THREAT_SCORE_INFO = str(0)
THREAT_SCORE_WARNING = str(1)
THREAT_SCORE_FAILURE = str(2)
# Two types of 'ACTION'
# A for Action
AUDIT_TAG = 'A'
# E for Event
EVENT_TAG = 'E'
# S for Search
SEARCH_RECORDER_TAG = 'S'
# P for Parse
ENTITYSCRIPT_RECORDER_TAG = 'P'
# K for KingSlug (currently important or being worked on - the 'Active Document')
KING_SLUG_RECORDER_TAG = 'K'
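A sketch of how the one-letter tags above could route a record to its matching writer; the dispatch table is illustrative, not part of the module:

```python
# Hypothetical dispatch from the one-letter tags to the writer that uses them.
TAG_HANDLERS = {
    'A': 'ACTION_AUDITED',
    'E': 'ACTION_REGULAR',
    'S': 'ACTION_SEARCH_RECORDER',
    'P': 'ACTION_ENTITYSCRIPT_RECORDER',
    'K': 'ACTION_KING_SLUG_RECORDER',
}

def handler_for(tag):
    """Return the writer name registered for a tag, or None if unknown."""
    return TAG_HANDLERS.get(tag)
```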
def ALERT_LOGGING_INFO(variable="[DEFAULT]"):
    """
    ALERT with the 'LOGGING_INFO' option

    """
    ALERT_INFO_STANDARD = """[EVENTTIME: """+\
     LogMaker_time()+"""] ["""+\
     LogMaker_date()+"""] ["""+\
     THREAT_SCORE_INFO+"""] [ALERT: INFO SLUG] """+\
     variable+"\n"
    with open(DIRLOG+LOG, 'a') as logger_INFO:
     logger_INFO.write(ALERT_INFO_STANDARD)
    return(ALERT_INFO_STANDARD)

def ALERT_LOGGING_WARNING(variable="[DEFAULT] "):
    """
    ALERT with the 'LOGGING_WARNING' option

    """
    ALERT_WARNING_STANDARD = """[EVENTTIME: """+\
     LogMaker_time()+"""] ["""+\
     LogMaker_date()+"""] ["""+\
     THREAT_SCORE_WARNING+"""] [ALERT: WARNING SLUG] """+\
     variable+"\n"
    with open(DIRLOG+LOG, 'a') as logger_WARNING:
     logger_WARNING.write(ALERT_WARNING_STANDARD)
    return(ALERT_WARNING_STANDARD)

def ALERT_LOGGING_FAILURE(variable=" [DEFAULT] "):
    """
    ALERT with the 'LOGGING_FAILURE' option

    """
    ALERT_FAILURE_STANDARD = """[EVENTTIME: """+\
     LogMaker_time()+"""] ["""+\
     LogMaker_date()+"""] ["""+\
     THREAT_SCORE_FAILURE+"""] [ALERT: FAILURE SLUG] """+\
     variable+"\n"
    with open(DIRLOG+LOG, 'a') as logger_FAILURE:
     logger_FAILURE.write(ALERT_FAILURE_STANDARD)
    return(ALERT_FAILURE_STANDARD)
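All three ALERT_LOGGING_* functions share one line shape. A self-contained sketch of that shape, writing to a temporary file instead of DIRLOG+LOG:

```python
import os
import tempfile

def write_alert(path, time_s, date_s, score, slug, variable):
    # Compose one alert line in the bracketed shape used by the
    # ALERT_LOGGING_* functions, then append it to the given log file.
    line = ("[EVENTTIME: " + time_s + "] [" + date_s + "] ["
            + str(score) + "] [ALERT: " + slug + " SLUG] " + variable + "\n")
    with open(path, 'a') as logger:
        logger.write(line)
    return line

# Demonstration against a throwaway log file.
log_path = os.path.join(tempfile.mkdtemp(), "log.es")
line = write_alert(log_path, "12:00:00", "2020-01-01", 1, "WARNING", "[demo]")
```

Note that the with-block closes the file on its own; no explicit close() call is needed.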

## ACTION: Custom action logging
def ACTION_REGULAR(variable=" [DEFAULT] "):
    """
    ACTION with the 'REGULAR'

    """
    ACTION_REGULAR_STANDARD = """[EVENTTIME: """+\
     LogMaker_time()+"""] ["""+\
     LogMaker_date()+"""] ["""+\
     EVENT_TAG+"""] [ACTION: REGULAR SLUG] """+\
     variable+"\n"
    with open(DIRLOG+LOG, 'a') as action_REGULAR:
     action_REGULAR.write(ACTION_REGULAR_STANDARD)
    return(ACTION_REGULAR_STANDARD)

## ACTION_AUDITED is the standard AUDIT block used
def ACTION_AUDITED(variable=" [DEFAULT] "):
    """
    ACTION with 'AUDITED'

    """
    ACTION_AUDITED_STANDARD = """[EVENTTIME: """+\
     LogMaker_time()+"""] ["""+\
     LogMaker_date()+"""] ["""+\
     AUDIT_TAG+"""] [ACTION: AUDITED SLUG]"""+\
     variable+"\n"
    with open(DIRLOG+LOG, 'a') as action_AUDITED:
     action_AUDITED.write(ACTION_AUDITED_STANDARD)
    return(ACTION_AUDITED_STANDARD)

## All of the 'ACTION_X...' methods take various action types
# These action types are generated from the key-directory above
def ACTION_ENTITYSCRIPT_RECORDER(variable=" [DEFAULT] "):
    """
    ACTION with 'ENTITYSCRIPT_RECORDER'

    """
    ACTION_ENTITYSCRIPT_RECORDER_STANDARD = """[EVENTTIME: """+\
     LogMaker_time()+"""] ["""+\
     LogMaker_date()+"""] ["""+\
     ENTITYSCRIPT_RECORDER_TAG+"""] [ACTION: ENTITYSCRIPT_RECORDED]"""+\
     variable+"\n"
    with open(DIRLOG+LOG, 'a') as action_ENTITYSCRIPT_RECORDER:
     action_ENTITYSCRIPT_RECORDER.write(ACTION_ENTITYSCRIPT_RECORDER_STANDARD)
    return(ACTION_ENTITYSCRIPT_RECORDER_STANDARD)

def ACTION_KING_SLUG_RECORDER(variable=" [DEFAULT] "):
    """
    ACTION with 'KING_SLUG_RECORDER'

    """
    ACTION_KING_SLUG_RECORDER_STANDARD = """[EVENTTIME: """+\
     LogMaker_time()+"""] ["""+\
     LogMaker_date()+"""] ["""+\
     KING_SLUG_RECORDER_TAG+"""] [ACTION: KING_SLUG_RECORDED]"""+\
     variable+"\n"
    with open(DIRLOG+LOG, 'a') as action_KING_SLUG_RECORDER:
     action_KING_SLUG_RECORDER.write(ACTION_KING_SLUG_RECORDER_STANDARD)
    return(ACTION_KING_SLUG_RECORDER_STANDARD)

def ACTION_SEARCH_RECORDER(variable=" [DEFAULT] "):
    """
    ACTION with 'SEARCH_RECORDER'

    """
    ACTION_SEARCH_RECORDER_STANDARD = """[EVENTTIME: """+\
     LogMaker_time()+"""] ["""+\
     LogMaker_date()+"""] ["""+\
     SEARCH_RECORDER_TAG+"""] [ACTION: SEARCH_RECORDED]"""+\
     variable+"\n"
    with open(DIRLOG+LOG, 'a') as action_SEARCH_RECORDER:
     action_SEARCH_RECORDER.write(ACTION_SEARCH_RECORDER_STANDARD)
    return(ACTION_SEARCH_RECORDER_STANDARD)

## Loader: Prototype function testers
class LOADER_DS:
    """
    LOADER to check various .ds file attributes

    """
    def load_specific(variable="[DEFAULT] "):
     """
     Run standard parse/load tests on 1 .ds file

     """
     PARSE_DS = """[EVENTTIME: """+\
      LogMaker_time()+"""] ["""+\
      LogMaker_date()+"""] ["""+\
      EVENT_TAG+"""] [LOADER_DS: DS SLUG]"""+\
      variable+"\n"
     with open(DIRLOG+LOG, 'a') as loader_ds:
      loader_ds.write(PARSE_DS)
     return(PARSE_DS)

    def load_all():
     """
     Run standard full tests on all .ds files: Default none specified

     """
     pass

class LOADER_ENTITY:
    """
    LOADER to check various .entity file attributes: Default none specified

    """
    def load_specific():
     """
     Run standard full tests on 1 .entity file: Default none specified

     """
     pass
    def load_all():
     """
     Run standard full tests on all .entity files: Default none specified

     """
     pass

def LOADER_ALL_FUNCTIONS():
    """
    See if each function can load and run: Default none specified

    """
    pass

## Random output generator to fill text
def GENERATE_FAKE_OUTPUT(variable=splat_fake_output):
    """
    Fake output writer:
    First, setup a digest after importing the hashlib object

    Populate it with random text: Encode it to bytes.

    Generate semi-random or static output to get the resulting random 'splat'

    """
    h = sha1()
    BYTES_variable = str.encode(variable)
    output_scrambler = BYTES_variable
    h.update(output_scrambler)
    print(h.hexdigest())
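For example, running the same text through the sha1 'splat' always yields the same 40-character digest; this mirrors GENERATE_FAKE_OUTPUT above, but returns the digest instead of printing it:

```python
from hashlib import sha1

def splat(text):
    # Encode the text to bytes and return its sha1 hex digest.
    h = sha1()
    h.update(str.encode(text))
    return h.hexdigest()

digest = splat("hello")
# sha1 of "hello" is the well-known aaf4c61ddcc5e8a2dabede0f3b482cd9aea9434d
```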

def UNITTEST_ALL(default="pass"):
    """
    You can put a single test or multiple tests here. You can also link
    to additional linked lists of tests.

    """
    if default == 1:
     print("[UNIT BLOCK: Testing]")
    else:
     # LOGGER
     UNITTEST_RUN__LOGGER = "[UNITTEST_RUN__LOGGER: "
     UNITTEST_RUN__helper = "performed a full UNITTEST]"
     print(UNITTEST_RUN__LOGGER+UNITTEST_RUN__helper+\
      ACTION_REGULAR(variable=UNITTEST_RUN__LOGGER+UNITTEST_RUN__helper)+\
       "\n")

class BaseTestData:
 """
 BaseTestData houses basic unittest modules and static file checks.
 It will be built out with additional features moving forward, but some
 baseline functions are here now to build on.
 # SMEMBER= System Member
 # GVALUE = Group Value

 """
 def UNITTEST_ALL(default="pass"):
  # Example conditional tests
  SIX_ZERO_ZERO = "-rw-------"
  assert SIX_ZERO_ZERO == accessglobal.es
  assert ACCESS_GLOBAL_SMEMBER_NUMBER == 1062
  assert ACCESS_GLOBAL_GVALUE_NUMBER == 1062
  LAST_MODIFIED_VALUE = "May 4th"
  SIZE_VALUE = "2005"
  print("LAST MODIFIED "+LAST_MODIFIED_VALUE)

  # When this test runs it checks that each file adheres to your defined routine.
  ASSERT_IS_THERE_LIST = [
   # Base Passwords for local use
   "accessvalues.es",
   # Last loaded or cached DS loaded file
   "ACTIVE_ENTITY.es",
   # Defined shred-methods for when you transfer out old logs and sensitive data
   "CLEANEROFCOURSE_MANIFEST.es",
   # Network communications buckets for local and global comms
   "CONSTRUCTED_core.es", "core.es", "core_packages.es",
   # Read and Write 3D binary data-structures
   "data.c",
   # EntityScript style formatted data pipelines that can be collected, sorted,
   # and built out to new levels
   "ENTITY_MANIFEST.es",
   "HASH_DIRECTORY.es",
   # The default KEY_MATCH location for any external keys to eliminate the password
   "KEY.es",
   # General purpose log that's backed up according to your own routines and schedule
   # This log can also be parsed in the VisualDashboard Page in beta Version 1.1
   "log.es",
   # Your extensions of your mind. A second brain if you will.
   "ORE_LEDGER.es",
   # URL operations buckets that write, load, and store information for you
   "output_entity.es", "output_flush.es", "output_slug.es", "output_stucture.es",
   # BlockBuilder style custom data formats with an accounting example
   "output_vectorized_data.es",
   # Your default backup slug structure and defined folder builder routines on startup
   "STRUCTURE.es",
   # Network Credentials
   "vcnkey.es",
   # Key link folder to person directory
   "VISUAL.es"
  ]

  # Example conditional tests
  SIX_ZERO_ZERO = "-rw-------"
  assert SIX_ZERO_ZERO == accessglobal.es
  assert LOG_DIR_SMEMBER_NUMBER == 1025
  assert LOG_DIR_GVALUE_NUMBER == 1025
  LAST_MODIFIED_VALUE = "November 10th"
  SIZE_VALUE = "1024"
  print("LAST MODIFIED "+LAST_MODIFIED_VALUE)

  # Run through alt cases here
  ASSERT_IS_THERE_SPECIAL_LIST = [
   DIRLOG
  ]


