spark.repl.SparkILoop

SparkILoopInterpreter

class SparkILoopInterpreter extends SparkIMain

Linear Supertypes
SparkIMain, SparkImports, AnyRef, Any

Instance Constructors

  1. new SparkILoopInterpreter()

Type Members

  1. case class ComputedImports(prepend: String, append: String, access: String) extends Product with Serializable

    Compute imports that allow definitions from previous requests to be visible in a new request.
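A minimal sketch of the idea: the three strings wrap a new request's source so that definitions from earlier requests stay visible. The wrapper object and import path below are made up for illustration; the interpreter's real generated code differs.

```scala
// Illustrative stand-in for the REPL's ComputedImports (not the real codegen).
case class ComputedImports(prepend: String, append: String, access: String)

// Surround user code with the prepend/append strings to form a compilable unit.
def wrapRequest(userCode: String, imports: ComputedImports): String =
  imports.prepend + "\n" + userCode + "\n" + imports.append

val ci = ComputedImports(
  prepend = "object $wrapper {\nimport $line1.x",  // re-expose a prior result (hypothetical path)
  append  = "}",                                   // close the wrapper object
  access  = "$wrapper"                             // path used to read values back out
)

val wrapped = wrapRequest("val y = x + 1", ci)
println(wrapped)
```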

  2. class ReadEvalPrint extends AnyRef

    Here is where we read, evaluate, and print: each instance handles one read-eval-print cycle for a unit of input.

  3. class Request extends AnyRef

    One line of code submitted by the user for interpretation

Value Members

  1. final def !=(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  2. final def !=(arg0: Any): Boolean

    Definition Classes
    Any
  3. final def ##(): Int

    Definition Classes
    AnyRef → Any
  4. final def ==(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  5. final def ==(arg0: Any): Boolean

    Definition Classes
    Any
  6. def DBG(s: ⇒ String): Unit

    Definition Classes
    SparkIMain
  7. val SPARK_DEBUG_REPL: Boolean

    Definition Classes
    SparkIMain
  8. def addImports(ids: String*): Result

    Definition Classes
    SparkIMain
  9. def afterTyper[T](op: ⇒ T): T

    Definition Classes
    SparkIMain
  10. def aliasForType(path: String): Option[String]

    Parse the ScalaSig to find type aliases.

    Definition Classes
    SparkIMain
  11. def allDefinedNames: List[Name]

    Definition Classes
    SparkIMain
  12. def allImplicits: List[Name]

    Definition Classes
    SparkIMain
  13. def allSeenTypes: List[String]

    Definition Classes
    SparkIMain
  14. final def asInstanceOf[T0]: T0

    Definition Classes
    Any
  15. def atPickler[T](op: ⇒ T): T

    Definition Classes
    SparkIMain
  16. def beQuietDuring[T](operation: ⇒ T): T

    Temporarily be quiet.

    Definition Classes
    SparkIMain
  17. def beSilentDuring[T](operation: ⇒ T): T

    Definition Classes
    SparkIMain
  18. def bind[T](name: String, value: T)(implicit arg0: Manifest[T]): Result

    Definition Classes
    SparkIMain
  19. def bind(p: NamedParam): Result

    Definition Classes
    SparkIMain
  20. def bind(name: String, boundType: String, value: Any): Result

    Bind a specified name to a specified value. The name may later be used by expressions passed to interpret.

    name: the variable name to bind
    boundType: the type of the variable, as a string
    value: the object value to bind to it
    returns: an indication of whether the binding succeeded

    Definition Classes
    SparkIMain
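A conceptual sketch of what a typed binding boils down to: a synthetic line of source that redeclares the value under the given name and type, which later interpreted lines can reference. The `$bridge` access path and the helper are hypothetical; SparkIMain compiles a real accessor rather than building a string this way.

```scala
// Illustrative only: build the kind of source line a bind could interpret.
// "$bridge.value" stands in for however the interpreter exposes the raw value.
def bindingSource(name: String, boundType: String, accessPath: String): String =
  s"val $name: $boundType = $accessPath.asInstanceOf[$boundType]"

val line = bindingSource("n", "Int", "$bridge.value")
println(line)  // val n: Int = $bridge.value.asInstanceOf[Int]
```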
  21. def bindValue(x: Any): Result

    Definition Classes
    SparkIMain
  22. def classLoader: AbstractFileClassLoader

    Definition Classes
    SparkIMain
  23. def classOfTerm(id: String): Option[JClass]

    Definition Classes
    SparkIMain
  24. val classServer: HttpServer

    Jetty server that will serve our classes to worker nodes.

    Definition Classes
    SparkIMain
  25. def clearExecutionWrapper(): Unit

    Definition Classes
    SparkIMain
  26. def clone(): AnyRef

    Attributes
    protected[lang]
    Definition Classes
    AnyRef
    Annotations
    @throws()
  27. def close(): Unit

    This instance is no longer needed, so release any resources it is using. The reporter's output gets flushed.

    Definition Classes
    SparkIMain
  28. def compileSources(sources: SourceFile*): Boolean

    Compile an nsc SourceFile. Returns true if there are no compilation errors, or false otherwise.

    Definition Classes
    SparkIMain
  29. def compileString(code: String): Boolean

    Compile a string. Returns true if there are no compilation errors, or false otherwise.

    Definition Classes
    SparkIMain
  30. lazy val compilerClasspath: List[URL]

    The compiler's classpath, as URLs.

    Definition Classes
    SparkIMain
  31. def createLineManager(): Manager

    Attributes
    protected
    Definition Classes
    SparkILoopInterpreter → SparkIMain
  32. def debugging[T](msg: String)(res: T): T

    Definition Classes
    SparkIMain
  33. def definedSymbols: Set[Symbol]

    Definition Classes
    SparkIMain
  34. def definedTerms: List[TermName]

    Definition Classes
    SparkIMain
  35. def definedTypes: List[TypeName]

    Definition Classes
    SparkIMain
  36. def definitionForName(name: Name): Option[MemberHandler]

    Definition Classes
    SparkIMain
  37. final def eq(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  38. def equals(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  39. def executionWrapper: String

    Definition Classes
    SparkIMain
  40. def finalize(): Unit

    Attributes
    protected[lang]
    Definition Classes
    AnyRef
    Annotations
    @throws()
  41. def flatName(id: String): String

    Definition Classes
    SparkIMain
  42. lazy val formatting: Formatting

    Definition Classes
    SparkILoopInterpreter → SparkIMain
  43. def generatedName(simpleName: String): Option[String]

    Given a simple repl-defined name, returns the real name of the class representing it, e.g. for "Bippy" it may return

    $line19.$read$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$Bippy

    Definition Classes
    SparkIMain
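The `$line19.$read$$iw…` path reflects how each REPL request is nested in wrapper objects, which the JVM flattens into `$`-separated class names. Plain Scala shows the same mangling; `Outer`, `Inner`, and `Bippy` here are just demo names, not the REPL's wrappers.

```scala
// Nested declarations compile to $-separated JVM class names,
// e.g. something like Outer$Inner$Bippy when declared at the top level.
object Outer { object Inner { class Bippy } }

val mangled = classOf[Outer.Inner.Bippy].getName
println(mangled)
```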
  44. final def getClass(): java.lang.Class[_]

    Definition Classes
    AnyRef → Any
  45. def getInterpreterClassLoader(): AbstractFileClassLoader

    Definition Classes
    SparkIMain
  46. lazy val global: Global

    the public, go through the future compiler

    Definition Classes
    SparkIMain
  47. def handleTermRedefinition(name: TermName, old: Request, req: Request): Unit

    Definition Classes
    SparkIMain
  48. def handleTypeRedefinition(name: TypeName, old: Request, req: Request): Unit

    Stubs for work in progress.

    Definition Classes
    SparkIMain
  49. def hashCode(): Int

    Definition Classes
    AnyRef → Any
  50. def implicitSymbols: List[Symbol]

    Definition Classes
    SparkImports
  51. def implicitSymbolsBySource: List[(Symbol, List[Symbol])]

    Definition Classes
    SparkImports
  52. def importHandlers: List[ImportHandler]

    Definition Classes
    SparkIMain
  53. def importedSymbols: List[Symbol]

    Definition Classes
    SparkImports
  54. def importedSymbolsBySource: List[(Symbol, List[Symbol])]

    Tuples of (source, imported symbols) in the order they were imported.

    Definition Classes
    SparkImports
  55. def importedTermNamed(name: String): Option[TermSymbol]

    Definition Classes
    SparkImports
  56. def importedTermSymbols: List[TermSymbol]

    Definition Classes
    SparkImports
  57. def importedTerms: List[TermName]

    Definition Classes
    SparkImports
  58. def importedTypeSymbols: List[TypeSymbol]

    Definition Classes
    SparkImports
  59. def importedTypes: List[TypeName]

    Definition Classes
    SparkImports
  60. def importsCode(wanted: Set[Name]): ComputedImports

    Attributes
    protected
    Definition Classes
    SparkImports
  61. def initialize(): Unit

    Definition Classes
    SparkIMain
  62. def interpret(line: String, synthetic: Boolean): Result

    Definition Classes
    SparkIMain
  63. def interpret(line: String): Result

    Interpret one line of input. All feedback, including parse errors and evaluation results, is printed via the supplied compiler's reporter. Values defined are available to future interpreted strings.

    The return value indicates whether the line was interpreted successfully, e.g. that there were no parse errors.

    line

    ...

    returns

    ...

    Definition Classes
    SparkIMain
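interpret reports its outcome as a Result (success, error, or incomplete input). As a toy stand-in for the last case: a REPL must decide whether a line is finished or needs more input before evaluating it. The naive brace balance below is illustrative only, not SparkIMain's actual parser-based check, and the ToyResult cases are made up for the sketch.

```scala
// Toy outcome type mirroring the success / error / incomplete distinction.
sealed trait ToyResult
case object Success    extends ToyResult
case object Error      extends ToyResult
case object Incomplete extends ToyResult

// Classify a line by brace balance, the way a REPL decides whether
// to evaluate now or keep reading more input.
def classify(line: String): ToyResult = {
  val balance = line.foldLeft(0) {
    case (b, '{') => b + 1
    case (b, '}') => b - 1
    case (b, _)   => b
  }
  if (balance > 0) Incomplete      // opened more blocks than closed: wait for more
  else if (balance < 0) Error      // stray closing brace
  else Success                     // balanced: hand the line to the compiler
}
```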
  64. def isInitializeComplete: Boolean

    Definition Classes
    SparkIMain
  65. final def isInstanceOf[T0]: Boolean

    Definition Classes
    Any
  66. def isParseable(line: String): Boolean

    Definition Classes
    SparkIMain
  67. lazy val isettings: SparkISettings

    Interpreter settings.

    Definition Classes
    SparkIMain
  68. def languageSymbols: List[Symbol]

    Definition Classes
    SparkImports
  69. def languageWildcardHandlers: List[ImportHandler]

    Definition Classes
    SparkImports
  70. def languageWildcardSyms: List[Symbol]

    Symbols whose contents are language-defined to be imported.

    Definition Classes
    SparkImports
  71. def languageWildcards: List[Type]

    Definition Classes
    SparkImports
  72. lazy val lineManager: Manager

    Definition Classes
    SparkIMain
  73. lazy val memberHandlers: SparkMemberHandlers { val intp: SparkILoopInterpreter.this.type }

    Definition Classes
    SparkIMain
  74. def mostRecentVar: String

    Returns the name of the most recent interpreter result. Mostly this exists so you can conveniently invoke methods on the previous result.

    Definition Classes
    SparkIMain
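The most recent result follows the familiar res0, res1, … naming scheme. A toy counter shows the idea; the real name generation lives in the interpreter's naming object, and this class is purely illustrative.

```scala
// Hypothetical sketch of resN-style result naming.
class ResultNamer {
  private var counter = -1
  def freshResultName(): String = { counter += 1; s"res$counter" }
  def mostRecentVar: String = if (counter < 0) "" else s"res$counter"
}

val namer = new ResultNamer
namer.freshResultName()       // res0
namer.freshResultName()       // res1
println(namer.mostRecentVar)  // res1
```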
  75. object naming extends Naming

  76. final def ne(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  77. def newCompiler(settings: Settings, reporter: Reporter): Global

    Instantiate a compiler. Subclasses can override this to change the compiler class used by this interpreter.

    Attributes
    protected
    Definition Classes
    SparkIMain
  78. final def notify(): Unit

    Definition Classes
    AnyRef
  79. final def notifyAll(): Unit

    Definition Classes
    AnyRef
  80. def onlyTerms(xs: List[Name]): List[TermName]

    Attributes
    protected
    Definition Classes
    SparkIMain
  81. def onlyTypes(xs: List[Name]): List[TypeName]

    Attributes
    protected
    Definition Classes
    SparkIMain
  82. def optFlatName(id: String): Option[String]

    Definition Classes
    SparkIMain
  83. val out: PrintWriter

    Attributes
    protected
    Definition Classes
    SparkIMain
  84. val outputDir: File

    Local directory to save .class files to.

    Definition Classes
    SparkIMain
  85. def parentClassLoader: ClassLoader

    Attributes
    protected
    Definition Classes
    SparkILoopInterpreter → SparkIMain
  86. def parse(line: String): Option[List[Tree]]

    Parse a line into a sequence of trees. Returns None if the input is incomplete.

    Definition Classes
    SparkIMain
  87. def pathToName(name: Name): String

    Definition Classes
    SparkIMain
  88. def pathToTerm(id: String): String

    Definition Classes
    SparkIMain
  89. def pathToType(id: String): String

    Definition Classes
    SparkIMain
  90. def prevRequestList: List[Request]

    Attributes
    protected
    Definition Classes
    SparkIMain
  91. var printResults: Boolean

    Whether to print out result lines.

    Definition Classes
    SparkIMain
  92. def quietBind(p: NamedParam): Result

    Definition Classes
    SparkIMain
  93. def quietImport(ids: String*): Result

    Definition Classes
    SparkIMain
  94. def quietRun[T](code: String): Result

    Definition Classes
    SparkIMain
  95. def rebind(p: NamedParam): Result

    Definition Classes
    SparkIMain
  96. def recordRequest(req: Request): Unit

    Definition Classes
    SparkIMain
  97. lazy val reporter: ConsoleReporter

    The reporter.

    Definition Classes
    SparkIMain
  98. def reset(): Unit

    Reset this interpreter, forgetting all user-specified requests.

    Definition Classes
    SparkIMain
  99. def resetClassLoader(): Unit

    Definition Classes
    SparkIMain
  100. def runtimeClassAndTypeOfTerm(id: String): Option[(JClass, Type)]

    Definition Classes
    SparkIMain
  101. def runtimeTypeOfTerm(id: String): Option[Type]

    Definition Classes
    SparkIMain
  102. def safeClass(name: String): Option[Symbol]

    Definition Classes
    SparkIMain
  103. def safeModule(name: String): Option[Symbol]

    Definition Classes
    SparkIMain
  104. def sessionImportedSymbols: List[Symbol]

    Definition Classes
    SparkImports
  105. def sessionWildcards: List[Type]

    Types which have been wildcard imported, such as:

    val x = "abc" ; import x._    // type java.lang.String
    import java.lang.String._     // object java.lang.String

    Used by tab completion.

    XXX right now this gets import x._ and import java.lang.String._, but doesn't figure out import String._. There's a lot of ad hoc scope twiddling which should be swept away in favor of digging into the compiler scopes.

    Definition Classes
    SparkImports
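The example above runs as plain Scala: a wildcard import on a stable value brings that value's members into unqualified scope, which is exactly the kind of scope tab completion has to track.

```scala
// A wildcard import from a stable identifier of type String.
val x = "abc"
import x._            // members of java.lang.String, bound to x

println(length)       // x.length via the wildcard import: 3
println(toUpperCase)  // ABC
```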
  106. def setContextClassLoader(): Unit

    Definition Classes
    SparkIMain
  107. def setExecutionWrapper(code: String): Unit

    Definition Classes
    SparkIMain
  108. val settings: Settings

    Definition Classes
    SparkIMain
  109. def showCodeIfDebugging(code: String): Unit

    Definition Classes
    SparkIMain
  110. def symbolDefString(sym: Symbol): String

    Definition Classes
    SparkIMain
  111. def symbolOfTerm(id: String): Symbol

    Definition Classes
    SparkIMain
  112. final def synchronized[T0](arg0: ⇒ T0): T0

    Definition Classes
    AnyRef
  113. def toString(): String

    Definition Classes
    AnyRef → Any
  114. var totalSilence: Boolean

    Whether to print errors.

    Definition Classes
    SparkIMain
  115. def typeOfExpression(expr: String): Option[Type]

    Definition Classes
    SparkIMain
  116. def typeOfTerm(id: String): Option[Type]

    Definition Classes
    SparkIMain
  117. def unqualifiedIds: List[String]

    Another entry point for tab-completion: ids in scope.

    Definition Classes
    SparkIMain
  118. def valueOfTerm(id: String): Option[AnyRef]

    Definition Classes
    SparkIMain
  119. val virtualDirectory: PlainFile

    Scala compiler virtual directory for outputDir.

    Definition Classes
    SparkIMain
  120. def visibleTermNames: List[Name]

    Definition Classes
    SparkIMain
  121. final def wait(): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws()
  122. final def wait(arg0: Long, arg1: Int): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws()
  123. final def wait(arg0: Long): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws()
  124. def wildcardTypes: List[Type]

    Definition Classes
    SparkImports
  125. def withoutBindingLastException[T](operation: ⇒ T): T

    Temporarily stop binding lastException.

    Definition Classes
    SparkIMain
  126. def withoutUnwrapping(op: ⇒ Unit): Unit

    Definition Classes
    SparkIMain

Deprecated Value Members

  1. lazy val compiler: SparkILoopInterpreter.this.global.type

    Definition Classes
    SparkIMain
    Annotations
    @deprecated
    Deprecated

    (Since version 2.9.0) Use global for access to the compiler instance.
