Merge branch 'python3-dev'
IMP: (DDL) Allow additional downloading from mega, mediafire and pixeldrain links when using DDL GC.
IMP: (DDL -- config.ini) ddl_priority_order can be customized to set the priority order in which the DDL links will be attempted
IMP: (DDL) Will cycle through ddl_priority_order in sequence, attempting each link as it gets to it, until a link can be downloaded or the links are exhausted
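    For reference, a minimal config.ini sketch of the new option (the option name comes from this changelog; the section placement and the exact link-type values are assumptions — use whichever link types your setup exposes):

        [DDL]
        ddl_priority_order = main, mega, mediafire, pixeldrain
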
IMP: Don't post-process or scan extensions if provided in ignore_search_words
IMP: listAnnualSeries api endpoint added
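    A hedged example of calling the new endpoint through Mylar's existing HTTP API (host, port and API key are placeholders; whether the command accepts additional parameters is not shown here):

        http://localhost:8090/api?apikey=YOURAPIKEY&cmd=listAnnualSeries
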
IMP: (DDL) Additional add-on server entry for DDL
IMP: (DDL) Improved Pack parsing / detection
IMP: Dupechecker will now prioritize (F#)/(f#) in filenames as 'Fixed' over normally named files or lower numbered (f#)
IMP: Pack issue # tracking records which issues belong to a particular pack, so individually queueing issues from a pack before it has completed will not redownload issues already covered by the pack.
IMP: Added Link column to DDL Manage page (display which type of link was/is being used)
IMP: (DDL --> Manage) Adjusted column widths & spacing to correct some double-lines occurring
IMP: (DDL --> Manage) Show pack issue range for packs as opposed to showing the individual item that was being searched for
IMP: Notification of packs being downloaded will now show which DDL link was used to download as well as complete name of pack
IMP: Update current default user_agent string to a more current string
IMP: CT will now use user_agent string defined by Mylar as opposed to a random generic one on each request (if not updated, will force-update it accordingly)
IMP: Removed individual provider entries for nzbsu and dognzb from GUI, now treated as newznabs. If enabled, will be converted to a newznab entry provided it doesn't already exist as a newznab.
IMP: If CV indicates the series year does not exist (ie. 2099), instead of using 2099 will use the year of the first issue or, if that is also invalid, the current year
IMP: When refreshing/adding a series that has a directory created with the incorrect year (2099), rename existing directory using correct year
IMP: Allow for future removal of outdated config.ini options with BAD_DEFINITIONS code
IMP: CV API Key entry field on the configuration page now has a proper placeholder of None, so the value is auto-removed prior to typing in the field
IMP: Add option to remove SABnzbd completed downloads from SABnzbd history
IMP: ASCII art logo added to startup sequence (cause why not, right?)
IMP: Test python module requirements prior to actually starting the program - failure will result in application exit with appropriate log messages
IMP: Basic configuration check started - comic location failure on existence/creation/permissions will throw a popup error until resolved
IMP: Filter input box on Watchlist page & searchresults page will accept exclusion filters preceded by (:-<term>)
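    An illustrative use of the new syntax (combining a normal term with an exclusion in one filter is an assumption; the terms themselves are examples): entering

        batman :-annual

    in the filter box would match rows containing 'batman' while hiding any row containing 'annual'.
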
IMP: (mylar3#1473) Weeklypull will honor ignore_publishers when displaying the weekly pullist
IMP: Wildcards (*) are acceptable in the ignore_publishers value in the config.ini (ie. panini*)
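    A minimal config.ini sketch (the publisher names and the comma-separated format are illustrative):

        ignore_publishers = panini*, dynamite*
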
IMP: scheduled searches will have status checked prior to actually searching (cannot be in a Downloaded or Snatched status) unless it was initiated manually (ie. magnifying glass search)
IMP: Added Remove option to DDL table options so items that are stuck in a Start/Incomplete status can be removed
IMP: Detect and handle CV reusing existing ComicIDs via GUI Dialog (Keep / Delete) (see mylar3#1506)
IMP: Status color-coded legend to the bottom of the main watchlist/index page
IMP: format_booktype now set to True by default (allows use of $Type in folder format)
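    A hedged folder_format sketch showing $Type in use ($Series and $Year are assumed to already be part of your existing format string):

        folder_format = $Type/$Series ($Year)
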
IMP: Post-Processing option is now enabled by default
IMP: default usenet_retention set to 3500 days.
IMP: default enforce_permissions set to False
IMP: default metatagging option cmtag_start_year_as_volume set to True
IMP: requirements.txt (added): pycryptodome, tenacity
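    After pulling this change, the new modules can be picked up with the usual reinstall (assumes pip targets the same Python interpreter Mylar runs under):

        pip install -r requirements.txt
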
FIX: Annuals were written to incorrect table sometimes resulting in status/filename being locked in db and unable to remove
FIX: Annuals that had the year present in the annual name and no issue number would not be recognized during series scans
FIX: (carepackage) when creating a carepackage, requirements pinned with == would cause an error if the installed version was higher/lower
FIX: Volume of series on cv > 10 would not be recognized as such when adding/refreshing series
FIX: Parsing of files that had part / volume would fail when 'part' was part of the series title
FIX: One-shots with no issue number (but year is present) in filename will now be properly accounted for in series rechecks if booktype is set correctly
FIX: Removed some incorrect logger.warn references so that logging works regardless of language
FIX: Error when removing directory that doesn't physically exist where it was recorded as being (delete series option)
FIX: (searchresults --> Edit) would not update comic location path when booktype was changed via dropdown
FIX: (searchresults --> Edit) GN was incorrectly labelled as GC within the booktype dropdown
FIX: Incorrect status colours for annuals when integration is enabled (Archived, Ignored)
FIX: Improper parsing of the sabnzbd version with newer versions would cause errors when checking / sending to sabnzbd
FIX: Remove deprecated imghdr module in favour of Pillow (api.py)
FIX: Replace deprecated SafeConfigParser call from configparser when generating carepackage
FIX: Deprecation of pkg_resources - use packaging now (added to requirements)
FIX: Ensure index page will load if provider_order is in error for some reason
FIX: Index page would display Unknown variations for published date when invalid instead of N/A
FIX: Ensure post-processing would be allowed for CV series that have been removed from CV but retained as is within Mylar
FIX: Mass publishers variable would sometimes be in an incorrect type (str) as opposed to type (list) which would cause the mass publishers job to not run
FIX: Selecting All checkbox on seriesdetail page will now select only relevant table (issues / annuals) as opposed to both
FIX: Discord notifications would always throw a 'could not send notification' log message, but the notification would be sent ok
FIX: When parsing results of DDL, if pack was detected and an invalid issue number was parsed, would traceback
FIX: DDL --> mediafire would try to get the filesize of a file that might not exist due to a failure, resulting in a traceback error
FIX: DDL --> traceback when count of links discovered would be more than indicated, not using any links discovered prior to the invalid count check
FIX: DDL --> traceback due to invalid variable reference (comicinfo[[0]['pack'])
FIX: DDL --> prefer upscaled option would incorrectly assume SD-Digital instead of just a normal/undefined labelling of said issue
FIX: Removal of () in folder structure
FIX: Link to CV for items within collected editions
FIX: Searchresults filter would filter everything if used
FIX: Casing when invoking updater.dbUpdate (@cwar)
FIX: (mylar3#1510) Adding by comicid would result in error in GUI, but series would still add successfully.
FIX: (mylar3#1508) Typo in logger when viewing a Paused series that contained annuals
FIX: (mylar3#1507) .Black and .White numbering exceptions added
FIX: When popup appeared, would open all hidden/minimized options in config page if tab was open
FIX: Infinity numbering vs Infinity in series name fix
FIX: Would fail to set selection when previous failures resulted in main link being used
FIX: Remove some additional information from cleaned ini due to previous config additions
FIX: (mylar3#1502) sabnzbd version detection problem on startup re: versioning format
FIX: Numeric sort for issue number column on Watchlist and Wanted pages instead of alpha-based
FIX: Booktype will default to Print instead of None
FIX: Error when searching for items being returned no issue numbers
FIX: Ensure Minimum & Maximum filesize restrictions contain only numerics
FIX: StoryArcs
--> Search for Missing would fail during post-processing due to invalid variable reference in db table
--> Search for Missing would use invalid issueid when attempting to post-process when downloading resulting in traceback
--> Search for Missing will now mark issues belonging to watchlisted series as Wanted if not already in that status
--> Search for Missing option now indicates what it does on mouseover
FIX: Feedparser would blow up (traceback error) when parsing non-digital dates in some cases
FIX: StoryArcs -> Search For Missing button would traceback when initiating the search
FIX: series.json creation as per schema (update to v1.0.2), FIX: proper reload of issues table on regeneration of json
FIX: Subset DDL site(s) to better handle failures/failed marking
FIX: Folder monitor would die on lock, FIX: spamming pack exclusions
FIX: html_cache folder in cache_dir for GC html files, so the html file won't be redownloaded until the item is successfully downloaded.
FIX: html_cache location added to cleanup_cache section
FIX: Better handling of corrupt images being retrieved from CV (will retry alternate image size if available)
FIX: dbupdater would error when attempting to merge previous_failed_id info into a non-existent dictionary
FIX: Wanted page may not display if IssueNumber field is blank or some other field is empty. Will attempt to default to ComicName.
FIX: DDL ReQueue / Restart would throw a red popup warning due to invalid reference (One-Shot) on Manage DDL page
FIX: DDL --> If it was the first link used, would cause a traceback when clearing the queue of the id
FIX: DDL --> If it was a retry/requeue, would error on attempting to preload a variable that's not really needed (one-shot) / site name
FIX: DDL --> When initiating a ReQueue/Restart, the resume option will only be attempted if it is a main server GC link
evilhero committed Mar 14, 2024
2 parents efff50d + 099ab25 commit ca4a373
Showing 52 changed files with 4,669 additions and 891 deletions.
99 changes: 99 additions & 0 deletions Mylar.py
@@ -21,9 +21,108 @@
import re
import threading
import signal
import importlib

sys.path.insert(1, os.path.join(os.path.dirname(__file__), 'lib'))

class test_the_requires(object):

    def __init__(self):
        # locate the program directory (handles frozen installs as well)
        if hasattr(sys, 'frozen'):
            full_path = os.path.abspath(sys.executable)
        else:
            full_path = os.path.abspath(__file__)

        prog_dir = os.path.dirname(full_path)
        data_dir = prog_dir
        if len(sys.argv) > 0:
            ddir = [x for x in sys.argv if 'datadir' in x]
            if ddir:
                ddir = re.sub('--datadir=', '', ''.join(ddir)).strip()
                data_dir = re.sub('--datadir ', '', ddir).strip()

        docker = False
        d_path = '/proc/self/cgroup'
        if os.path.exists('/.dockerenv') or 'KUBERNETES_SERVICE_HOST' in os.environ or os.path.isfile(d_path) and any('docker' in line for line in open(d_path)):
            print('[DOCKER-AWARE] Docker installation detected.')
            docker = True

        self.req_file_present = True
        self.reqfile = os.path.join(data_dir, 'requirements.txt')

        if any([docker, data_dir != prog_dir]) and not os.path.isfile(self.reqfile):
            self.reqfile = os.path.join(prog_dir, 'requirements.txt')

        if not os.path.isfile(self.reqfile):
            self.req_file_present = False

        self.nfo_file = os.path.join(data_dir, 'ascii_logo.nfo')
        if not os.path.isfile(self.nfo_file):
            self.nfo_file = os.path.join(prog_dir, 'ascii_logo.nfo')

        if not os.path.isfile(self.nfo_file):
            print('[WARNING] Unable to load ascii_logo. You\'re missing something cool...')
        else:
            with open(self.nfo_file, 'r') as f:
                for line in f:
                    print(line.rstrip())
            print(f'{"-":-<60}')

        self.ops = ['==', '>=', '<=']
        self.mod_list = {}
        # requirement name --> importable module name, where the two differ
        self.mappings = {'APScheduler': 'apscheduler',
                         'beautifulsoup4': 'bs4',
                         'Pillow': 'PIL',
                         'pycryptodome': 'Crypto',
                         'pystun': 'stun',
                         'PySocks': 'socks'}

    def check_it(self):
        if not self.req_file_present:
            print('[REQUIREMENTS MISSING] Unable to locate requirements.txt in %s. Make sure it exists, or use --datadir to specify location' % self.reqfile)
            sys.exit()

        with open(self.reqfile, 'r') as file:
            for line in file.readlines():
                operator = [x for x in self.ops if x in line]
                if operator:
                    operator = ''.join(operator)
                    lf = line.find(operator)
                    module_name = line[:lf].strip()
                    module_version = line[lf+len(operator):].strip()
                    if module_name == 'requests[socks]':
                        self.mod_list['requests'] = {'version': module_version, 'operator': operator}
                        module_name = 'PySocks'
                    self.mod_list[module_name] = {'version': module_version, 'operator': operator}

        failures = {}
        for key, value in self.mod_list.items():
            try:
                module = key
                if key in self.mappings:
                    module = self.mappings[key]
                try:
                    importlib.import_module(module, package=None)
                except Exception:
                    importlib.import_module(module.lower(), package=None)
            except ModuleNotFoundError:
                failures[key] = value

        if failures:
            if 'PySocks' in failures:
                print('[MODULES UNAVAILABLE] Some modules are missing and may need to be installed via pip before proceeding. PySocks is required only if using proxies.')
            else:
                print('[MODULES UNAVAILABLE] Required modules are missing and need to be installed via pip before proceeding.')
                print('[MODULES UNAVAILABLE] Reinstall each of the listed module(s) below or reinstall the included requirements.txt file')
            print('[MODULES UNAVAILABLE] The following modules are missing:')
            for modname, modreq in failures.items():
                print('[MODULES UNAVAILABLE] %s %s%s' % (modname, modreq['operator'], modreq['version']))
            # exit unless PySocks is the only missing module (it is optional)
            if all(['PySocks' in failures, len(failures) > 1]) or 'PySocks' not in failures:
                sys.exit()

t = test_the_requires()
t.check_it()

import mylar

from mylar import (
6 changes: 6 additions & 0 deletions ascii_logo.nfo
@@ -0,0 +1,6 @@
__ _ _____ __
_| _| _ __ ___ _ _| | __ _ _ _|___ / |_ |_
(_) | | '_ ` _ \| | | | |/ _` | '__||_ \ | (_)
_ _ _| | | | | | | | |_| | | (_| | | ___) | | |_ _ _
(_|_|_) | |_| |_| |_|\__, |_|\__,_|_| |____/ | (_|_|_)
|__| |___/ |__|
53 changes: 26 additions & 27 deletions data/interfaces/carbon/css/style.css
@@ -1376,7 +1376,6 @@ div#artistheader h2 a {
vertical-align: middle;
}
#series_table {
background-color: #FFF;
padding: 20px;
width: 960px !important;
}
@@ -1430,19 +1429,16 @@ div#artistheader h2 a {
text-align: center;
vertical-align: middle;
font-size: 12px;
background-color: #353a41;
}
#series_table td#name {
min-width: 290px;
text-align: center;
vertical-align: middle;
background-color: #353a41;
}
#series_table td#year {
max-width: 25px;
text-align: center;
vertical-align: middle;
background-color: #353a41;
}
#series_table td#havepercent,
#series_table td#totalcount {
@@ -1461,31 +1457,26 @@
max-width: 35px;
text-align: center;
vertical-align: middle;
background-color: #353a41;
}
#series_table td#issue {
max-width: 30px;
text-align: center;
vertical-align: middle;
background-color: #353a41;
}
#series_table td#status {
max-width: 50px;
text-align: center;
vertical-align: middle;
background-color: #353a41;
}
#series_table td#published {
max-width: 55px;
text-align: center;
vertical-align: middle;
background-color: #353a41;
}
#series_table td#have {
max-width: 80px;
text-align: center;
vertical-align: middle;
background-color: #353a41;
}
#manageheader {
margin-top: 45px;
@@ -1807,60 +1798,60 @@ div#artistheader h2 a {
min-width: 95px;
vertical-align: middle;
}
#queue_table th#qcomicid {
max-width: 10px;
text-align: center;
}
#queue_table th#qseries {
max-width: 475px;
max-width: 200px;
text-align: center;
}
#queue_table th#qsize {
max-width: 35px;
max-width: 40px;
text-align: center;
}
#queue_table th#qlinktype {
max-width: 45px;
text-align: center;
}
#queue_table th#qprogress {
max-width: 25px;
max-width: 20px;
text-align: center;
}
#queue_table th#qstatus {
max-width: 55px;
text-align: center;
}
#queue_table th#qdate {
max-width: 90px;
max-width: 75px;
text-align: center;
}
#queue_table th#qoptions {
max-width: 160px;
max-width: 100px;
text-align: center;
}
#queue_table td#qcomicid {
max-width: 10px;
text-align: left;
}
#queue_table td#qseries {
max-width: 475px;
max-width: 200px;
text-align: left;
}
#queue_table td#qsize {
max-width: 35px;
max-width: 40px;
text-align: center;
}
#queue_table td#qlinktype {
max-width: 45px;
text-align: center;
}
#queue_table td#qprogress {
max-width: 25px;
max-width: 20px;
text-align: center;
}
#queue_table td#qstatus {
max-width: 55px;
text-align: center;
}
#queue_table td#qdate {
min-width: 90px;
max-width: 75px;
text-align: center;
}
#queue_table td#qoptions {
max-width: 160px;
max-width: 100px;
text-align: center;
}

@@ -2601,3 +2592,11 @@ img.mylarload.lastbad {
.py_vers {
color: #EE3232;
}
.cv_row {
display: flex;
text-align: center;
}
.cv_column {
flex: 50%;
text-align: center;
}
59 changes: 41 additions & 18 deletions data/interfaces/default/base.html
@@ -123,23 +123,34 @@
<br>
<%
mylar.PROVIDER_STATUS = {}
for ko, vo in sorted(mylar.CONFIG.PROVIDER_ORDER.items()):
mylar.PROVIDER_STATUS.update({vo : 'success'})
for kb in mylar.PROVIDER_BLOCKLIST:
if vo == kb['site']:
mylar.PROVIDER_STATUS.update({vo : 'fail'})
break
prov_fail = False
try:
for ko, vo in sorted(mylar.CONFIG.PROVIDER_ORDER.items()):
mylar.PROVIDER_STATUS.update({vo : 'success'})
for kb in mylar.PROVIDER_BLOCKLIST:
if vo == kb['site']:
mylar.PROVIDER_STATUS.update({vo : 'fail'})
break
except Exception:
prov_fail = True
%>
Providers:
%for prov, stats in sorted(mylar.PROVIDER_STATUS.items()):
<%
if stats == 'success':
st_image = '<img src="images/success.png" height="8" width="8">'
else:
st_image = '<img src="images/x_red.png" height="8" width="8">'
%>
${prov}:${st_image} &nbsp
%endfor
%if prov_fail:
<span title="This should correct itself eventually"><img src="images/x_red.png" height="8" width="8" />Unable to determine Provider Status' at this time...</span>
%else:
%for prov, stats in sorted(mylar.PROVIDER_STATUS.items()):
<%
if stats == 'success':
st_image = '<img src="images/success.png" height="8" width="8">'
else:
st_image = '<img src="images/x_red.png" height="8" width="8">'
%>
${prov}:${st_image} &nbsp
%endfor
%endif
</div>
<div id="config_messages_dialog" title="configuration checks / warnings" style="display:none">
<span id="mbody"></span>
</div>
<div style="text-align:right;position:relative;top:${bottom_foot}">
<a id="notifs" title="View Latest Notifications" href="javascript:void(0)" onclick="manageNotifications()"><img src="images/notif.png" height="31" width="25" /></a>
@@ -269,6 +280,18 @@
}
}, false);

evtSource.addEventListener("config_check", function(e){
if (e.data){
var data = JSON.parse(e.data);
console.log('config_check:'+data.status,data.message);
$("#mbody").html(data.message);
$("#config_messages_dialog").dialog({
modal: true,
width: "50%",
});
}
}, false);

evtSource.addEventListener("check_update", function(e){
if (e.data){
var data = JSON.parse(e.data);
@@ -339,15 +362,15 @@
if (data.status == 'success'){
$('#ajaxMsg').addClass('success').fadeIn().delay(3000).fadeOut();
console.log('data.comicid:'+data.comicid)
if ( (data.tables == 'both' || data.tables == 'tables') && ( document.body.innerHTML.search(data.comicid) || tt.value == 'history' || tt.value == 'search_results') ){
if ( ( tt.value != "config" ) && (data.tables == 'both' || data.tables == 'tables') && ( document.body.innerHTML.search(data.comicid) || tt.value == "history" || tt.value == "search_results") ){
console.log('reloading table1...');
reload_table();
}
else if ( (data.tables == 'both' || data.tables == 'tables') && (data.comicid == cid) ){
console.log('reloading table2...');
reload_table();
}
if (data.tables == 'both' || data.tables == 'tabs'){
if ( (data.tables == 'both' || data.tables == 'tabs') && ( tt.value != "config") ) {
reload_tabs();
}
if( data.comicid == cid && document.getElementById("page_name").value == 'series_detail'){
@@ -409,7 +432,7 @@
var tables = $('table.display').DataTable();
var tt = document.getElementById("page_name");
if(typeof(tt) != 'undefined' && tt != null){
if (tt.value != "weekly" && tt.value != "import_results" && tt.value != "manage_comics" && tt.value != "manage_issues" && tt.value != "manage_failed" && tt.value != "reading_list" && tt.value != "storyarcs_index" && tt.value != "storyarc_detail") {
if (tt.value != "weekly" && tt.value != "import_results" && tt.value != "manage_comics" && tt.value != "manage_issues" && tt.value != "manage_failed" && tt.value != "reading_list" && tt.value != "storyarcs_index" && tt.value != "storyarc_detail" && tt.value != "config") {
// this is required so it doesn't error if on the weekly page
// once weekly & other pages are converted to dynamic loading, this if can be removed
tables.ajax.reload(null,false);