Some updates #67

Draft · wants to merge 21 commits into base: master · Changes from all commits
64 changes: 64 additions & 0 deletions .github/workflows/master.yml
@@ -0,0 +1,64 @@
name: CI

on:
  push:
    branches: [ master ]
  pull_request:
    branches: [ master ]

jobs:
  build:

    runs-on: ubuntu-latest
    strategy:
      matrix:
        python-version: [3.8, 3.11]

    steps:
    - uses: actions/checkout@v2

    - name: Set up Python ${{ matrix.python-version }}
      uses: actions/setup-python@v2
      with:
        python-version: ${{ matrix.python-version }}

    - name: Install dependencies
      run: |
        python -m pip install --upgrade pip
        pip install -r requirements.txt
        pip install .
        pip install -r requirements-test.txt
        pip install codecov
        python -m spylon_kernel install --user
      env:
        JUPYTER_PLATFORM_DIRS: 1

    - name: Test with pytest
      run: |
        python run_tests.py -vxrs --capture=sys --color=yes
        python setup.py sdist
        pip install --no-binary :all: dist/*.tar.gz
      env:
        JUPYTER_PLATFORM_DIRS: 1

    # Note: these cache steps run after the install and test steps, so the
    # restored cache arrives too late to speed up "Install dependencies";
    # moving them before that step would let pip reuse the cache.
    - name: Cache pip
      uses: actions/cache@v2
      with:
        path: ~/.cache/pip
        key: ${{ runner.os }}-pip-${{ hashFiles('**/requirements.txt') }}
        restore-keys: |
          ${{ runner.os }}-pip-

    - name: Cache Spark
      uses: actions/cache@v2
      with:
        path: ~/.cache/spark
        key: ${{ runner.os }}-spark
        restore-keys: |
          ${{ runner.os }}-spark

    - name: Codecov
      if: success()
      run: |
        codecov

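The `JUPYTER_PLATFORM_DIRS: 1` setting on both steps is worth a note: it opts jupyter_core into platformdirs-style config and data paths and suppresses the migration deprecation warning that would otherwise clutter test output. A minimal sketch of the same toggle for a local run (the variable name comes from jupyter_core; the surrounding code is illustrative):

```python
import os

# Opt into platformdirs-style Jupyter paths before launching anything
# Jupyter-related, mirroring the CI env setting above.
os.environ["JUPYTER_PLATFORM_DIRS"] = "1"

# Any test runner started from this process now inherits the setting.
print(os.environ["JUPYTER_PLATFORM_DIRS"])  # 1
```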
5 changes: 4 additions & 1 deletion .gitignore
@@ -95,4 +95,7 @@ ENV/
README.rst

# Extra development notebooks
Untitled*.ipynb
Untitled*.ipynb

# sonar
.scannerwork
2 changes: 1 addition & 1 deletion requirements.txt
@@ -1,5 +1,5 @@
ipykernel
jedi<0.11
jedi
metakernel
spylon[spark]
tornado
2 changes: 1 addition & 1 deletion setup.py
@@ -16,7 +16,7 @@
    url='http://github.com/maxpoint/spylon-kernel',
    install_requires=[
        'ipykernel',
        'jedi>=0.10',
        'jedi',
        'metakernel',
        'spylon[spark]',
        'tornado',
6 changes: 3 additions & 3 deletions spylon_kernel/init_spark_magic.py
@@ -2,13 +2,13 @@
import logging
import spylon.spark

from parso import split_lines
from metakernel import Magic, option
from .scala_interpreter import init_spark

try:
    import jedi
    from jedi.api.helpers import get_on_completion_name
    from jedi import common
except ImportError as ex:
    jedi = None

@@ -79,7 +79,7 @@ def get_completions(self, info):
        position = (info['line_num'], info['column'])
        interpreter = jedi.Interpreter(text, [self.env])

        lines = common.splitlines(text)
        lines = split_lines(text)
        name = get_on_completion_name(
            interpreter._get_module_node(),
            lines,
@@ -89,4 +89,4 @@ def get_completions(self, info):
        before = text[:len(text) - len(name)]
        completions = interpreter.completions()
        completions = [before + c.name_with_symbols for c in completions]
        return [c[info['start']:] for c in completions]
        return [c[info['start']:] for c in completions]
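The swap from `jedi.common.splitlines` to `parso.split_lines` tracks jedi's refactor: the `jedi.common` module went away once jedi started depending on parso, which hosts the equivalent helper. A rough stand-in showing the property the completion code relies on (a sketch, not parso's implementation — the real `split_lines` also handles `\r\n` and other line terminators):

```python
def split_lines_sketch(text):
    """Rough stand-in for parso.split_lines: unlike str.splitlines(),
    a trailing newline yields a final empty entry, so (line, column)
    completion positions stay valid at the very end of the source."""
    return text.split('\n')

# The cursor can sit on the empty last line after a trailing newline:
print(split_lines_sketch("val x = 1\n"))    # ['val x = 1', '']
print(split_lines_sketch("val x = 1"))      # ['val x = 1']
```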
2 changes: 1 addition & 1 deletion spylon_kernel/scala_interpreter.py
@@ -52,7 +52,7 @@ def init_spark(conf=None, capture_stderr=False):

    # Create a temp directory that gets cleaned up on exit
    output_dir = os.path.abspath(tempfile.mkdtemp())
    def cleanup():
    def cleanup(signalnum=None, frame=None):
        shutil.rmtree(output_dir, True)
    atexit.register(cleanup)
    signal.signal(signal.SIGTERM, cleanup)
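The extra `signalnum=None, frame=None` parameters are what make one function usable both ways: `atexit` invokes its callback with no arguments, while `signal.signal` invokes its handler as `handler(signalnum, frame)`. A self-contained sketch of the pattern (the temp-dir setup here is illustrative):

```python
import atexit
import os
import shutil
import signal
import tempfile

output_dir = tempfile.mkdtemp()

def cleanup(signalnum=None, frame=None):
    # Defaults let this serve as both an atexit callback (no args)
    # and a SIGTERM handler (two positional args).
    shutil.rmtree(output_dir, True)

atexit.register(cleanup)
signal.signal(signal.SIGTERM, cleanup)

cleanup()                        # atexit-style call
cleanup(signal.SIGTERM, None)    # signal-handler-style call
print(os.path.exists(output_dir))  # False
```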
2 changes: 1 addition & 1 deletion spylon_kernel/scala_kernel.py
@@ -18,7 +18,7 @@ class SpylonKernel(MetaKernel):
    implementation = 'spylon-kernel'
    implementation_version = get_versions()['version']
    language = 'scala'
    language_version = '2.11'
    language_version = '2.12'
    banner = "spylon-kernel - evaluates Scala statements and expressions."
    language_info = {
        'mimetype': 'text/x-scala',
2 changes: 1 addition & 1 deletion test/test_scala_interpreter.py
@@ -11,7 +11,7 @@ def scala_interpreter(request):

def test_simple_expression(scala_interpreter):
    result = scala_interpreter.interpret("4 + 4")
    assert re.match('res\d+: Int = 8\n', result)
    assert re.match(r'res\d+: Int = 8\n', result)


def test_completion(scala_interpreter):
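The `r` prefix added to these patterns is more than style: `'\d'` in a plain string literal is an invalid escape sequence — a `DeprecationWarning` since Python 3.6 and a `SyntaxWarning` from 3.12 — so raw strings keep the suite warning-clean on newer interpreters. A quick check that the raw pattern matches the interpreter's output format:

```python
import re

# Raw string: the backslash reaches the regex engine untouched.
pattern = r'res\d+: Int = 8\n'

print(bool(re.match(pattern, 'res0: Int = 8\n')))    # True
print(bool(re.match(pattern, 'res42: Int = 8\n')))   # True
print(bool(re.match(pattern, 'res: Int = 8\n')))     # False (\d+ needs a digit)
```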
2 changes: 1 addition & 1 deletion test/test_scala_kernel.py
@@ -42,7 +42,7 @@ def test_simple_expression(spylon_kernel):
    result = spylon_kernel.do_execute_direct("4 + 4")
    assert isinstance(result, TextOutput)
    output = result.output
    assert re.match('res\d+: Int = 8\n', output)
    assert re.match(r'res\d+: Int = 8\n', output)


def test_exception(spylon_kernel):
2 changes: 1 addition & 1 deletion test_spylon_kernel_jkt.py
@@ -41,7 +41,7 @@ class SpylonKernelTests(jupyter_kernel_test.KernelTests):
        'result': 'x: Int = 1\n'
    }, {
        'code': 'val y = 1 to 3',
        'result': 'y: scala.collection.immutable.Range.Inclusive = Range(1, 2, 3)\n'
        'result': 'y: scala.collection.immutable.Range.Inclusive = Range 1 to 3\n'
    }]

    spark_configured = False
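The expected-output change reflects a real Scala behavior change: newer Scala versions render a `Range` by its bounds (`Range 1 to 3`) rather than by enumerating its elements (`Range(1, 2, 3)`). If the suite ever needed to pass against both renderings, a tolerant pattern could accept either (hypothetical, not part of this PR):

```python
import re

# Hypothetical tolerant check: accept either the older element-enumerating
# rendering or the newer bounds rendering of a Scala Range.
pattern = re.compile(
    r'y: scala\.collection\.immutable\.Range\.Inclusive = '
    r'(?:Range 1 to 3|Range\(1, 2, 3\))\n'
)

print(bool(pattern.match(
    'y: scala.collection.immutable.Range.Inclusive = Range 1 to 3\n')))    # True
print(bool(pattern.match(
    'y: scala.collection.immutable.Range.Inclusive = Range(1, 2, 3)\n')))  # True
```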