
Statistical feature #44

Open · wants to merge 40 commits into master

Commits (40)
942f6e0
Introductory changes to DockerFile
Sharabesh Jul 2, 2017
bf25f13
Merge branch 'master' of https://github.com/OpenNeuroLab/brainspell-neo
Sharabesh Jul 2, 2017
c977eab
Docker now works with compose commands and launches an empty database…
Sharabesh Jul 2, 2017
181f040
starting on significance tests for collections; wrote skeleton code a…
neelsomani Jul 3, 2017
01ded98
clarified details on implementing significance test
neelsomani Jul 3, 2017
e716a61
just need to implement comparison of binomial distributions and BH FDR
neelsomani Jul 3, 2017
e3deebb
tested boolean accumulators for collections
neelsomani Jul 3, 2017
6af4c09
add radius parameter
neelsomani Jul 3, 2017
5bc5744
finished transform to z scores
neelsomani Jul 3, 2017
70b2fcb
filter with benjamini hochberg
neelsomani Jul 3, 2017
cc35636
python3.4 fix
neelsomani Jul 3, 2017
eb94e3c
Merge branch 'master' of https://github.com/OpenNeuroLab/brainspell-neo
Sharabesh Jul 4, 2017
04d8e9a
make significance test asynchronous
neelsomani Jul 4, 2017
0a7b977
removed todo for async
neelsomani Jul 4, 2017
d9de8c0
removed python3.4 because of scipy errors
neelsomani Jul 4, 2017
1628eb4
added widgets to show significant coordinates from collection signifi…
neelsomani Jul 4, 2017
7903ffd
Merge branch 'master' of https://github.com/OpenNeuroLab/brainspell-neo
Sharabesh Jul 4, 2017
64578ac
removed extra JS
neelsomani Jul 4, 2017
e419a6a
explicitly replace spaces on search page
neelsomani Jul 4, 2017
40b36d1
fix inaccurate comment
neelsomani Jul 4, 2017
8ac4ad6
Updated UI features for account page. Allowed introduction of additio…
Sharabesh Jul 4, 2017
7793af4
resolved merge conflict
Sharabesh Jul 4, 2017
acdd3da
fixed bug
neelsomani Jul 4, 2017
1df8d1a
Fixed remaining merge conflict setting
Sharabesh Jul 4, 2017
c394d88
Updated significance
Sharabesh Jul 4, 2017
f1278de
fixed broken js call
neelsomani Jul 4, 2017
08d3249
merge
neelsomani Jul 4, 2017
fb15247
minor corrections
neelsomani Jul 4, 2017
d5153dd
Updated API to support in place voting without refreshing page
Sharabesh Jul 4, 2017
6c458ae
Merge branch 'master' of https://github.com/OpenNeuroLab/brainspell-neo
Sharabesh Jul 4, 2017
bd89b99
Correcting for style
Sharabesh Jul 4, 2017
af28bd6
clean up article helpers; working on fixing front end vote toggle
neelsomani Jul 4, 2017
d7c231d
revert broken UI change
neelsomani Jul 4, 2017
b011c23
fix name for pgsql file
neelsomani Jul 5, 2017
92a3431
change threshold back to the correct level, because the significance …
neelsomani Jul 6, 2017
2159169
added re
neelsomani Jul 7, 2017
1558792
adding regular expression re import
jbpoline Jul 7, 2017
7543fee
Merge pull request #1 from jbpoline/statistical-feature
neelsomani Jul 7, 2017
7dc6138
set initial total samples to 0; github fix
neelsomani Jul 7, 2017
e125200
add indicator for no results
neelsomani Jul 8, 2017
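Commits 5bc5744 ("finished transform to z scores") and 70b2fcb ("filter with benjamini hochberg") describe the core of the statistical feature: per-coordinate results are converted to z scores and then thresholded with the Benjamini-Hochberg false-discovery-rate procedure. As a point of reference, here is a minimal, self-contained sketch of the BH step; it is illustrative only, not the code from this pull request.

```python
# Illustrative sketch of the Benjamini-Hochberg procedure; NOT the code
# from this pull request. p_values is assumed to be a plain list of floats.

def benjamini_hochberg(p_values, alpha=0.05):
    """Return sorted indices of hypotheses rejected at FDR level alpha."""
    m = len(p_values)
    # Rank hypotheses by ascending p-value, remembering original positions.
    order = sorted(range(m), key=lambda i: p_values[i])
    # Find the largest rank k with p_(k) <= (k / m) * alpha.
    largest_k = 0
    for rank, idx in enumerate(order, start=1):
        if p_values[idx] <= rank / m * alpha:
            largest_k = rank
    # Reject the hypotheses with the largest_k smallest p-values.
    return sorted(order[:largest_k])

p = [0.001, 0.008, 0.039, 0.041, 0.042, 0.06, 0.074, 0.205, 0.212, 0.216]
print(benjamini_hochberg(p))  # → [0, 1]
```

Only the two smallest p-values survive here even though five are below 0.05, which is the point of FDR control over a fixed per-test cutoff.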
5 changes: 1 addition & 4 deletions .travis.yml
@@ -1,6 +1,5 @@
language: python
python:
- "3.4"
- "3.5"
# command to install dependencies
install: "pip install -r requirements.txt"
@@ -9,10 +8,8 @@ env:
- DATABASE_URL="postgres://yaddqlhbmweddl:SxBfLvKcO9Vj2b3tcFLYvLcv9m@ec2-54-243-47-46.compute-1.amazonaws.com:5432/d520svb6jevb35" COOKIE_SECRET="password"
# command to run tests
script: cd json_api && pytest

# TODO: unless we can get sauce working properly, then remove
addons:
sauce_connect:
username: "brainspell"
access_key: "56abf217-04be-441a-b624-1f889a4e237f"


1 change: 1 addition & 0 deletions Dockerfile
@@ -1,5 +1,6 @@
FROM python:3
ADD . /brainspell-neo
+ADD . /database_dumps/brainspell.pgsql
WORKDIR /brainspell-neo
EXPOSE 5000
ENV PATH /opt/conda/envs/brainspell/bin:$PATH
12 changes: 9 additions & 3 deletions docker-compose.yml
@@ -1,8 +1,14 @@
version: '2'

services:
db:
image: postgres
web:
build: .
ports:
- "5000:5000"
command: python3 json_api/brainspell.py
volumes:
- .:/brainspell-neo
- .:/brainspell-neo
ports:
- "5000:5000"
depends_on:
- db
21 changes: 4 additions & 17 deletions ideas.txt
@@ -1,27 +1,14 @@
Consider:
- add-article-manual endpoint, which allows users to add articles that aren’t on PubMed. (potentially make PMID optional)
- Make "add article" UI.
- Maybe an endpoint to get the titles (or all of the information) for a set of PMIDs.
- A validation for articles before they're sent to the bulk-add endpoint.
- A cron job to automatically update DOIs.


Github Functionality
- Check in search and random whether an article is already in a collection
- Store Collection names and articles associated in database
- Use that to generate Widgets
- Integrate User pages
- Store collections in userId in the database
- Store the PMID’s in each collection as well.
- Integrate the two login systems
- Add notes in each Article in the collection
GitHub:
- Potentially add notes in each article in the collection
- Paginate the collections page
- The bar should be a database request
- The bar should be a database request (what does this mean?)

Potentially reimplement:
- Tables: reimplement using jQuery Datatables

- Brain Browser: a tool for visualization of translucent images that can take advantage of multi-core systems

- Continuous Activation Graphic:
- Continuous Activation Graphic
- Brainsprite: a tool that uses PNG files to show activation with depth
Binary file added json_api/.DS_Store
Binary file not shown.
129 changes: 69 additions & 60 deletions json_api/article_helpers.py
@@ -6,15 +6,28 @@
import Bio
from Bio import Entrez, Medline
from Bio.Entrez import efetch, esearch, parse, read
+from torngithub import json_encode

from models import *
-from search_helpers import get_article_object

Entrez.email = "[email protected]"

# BEGIN: article helper functions


+def get_article_object(query):
+    """ Get a single article PeeWee object. """
+
+    search = Articles.select().where(Articles.pmid == query)
+    return search.execute()
+
+
+def get_all_articles():
+    """ Get all article objects in the database. """
+
+    return Articles.select().execute()


def update_authors(pmid, authors):
""" Update the authors for an article. """

@@ -79,10 +92,9 @@ def toggle_vote(pmid, topic, username, direction):
direction,
"name")

-    query = Articles.update(
+    Articles.update(
         metadata=metadata).where(
-        Articles.pmid == pmid)
-    query.execute()
+        Articles.pmid == pmid).execute()


def vote_stereotaxic_space(pmid, space, username):
@@ -107,10 +119,9 @@ def vote_stereotaxic_space(pmid, space, username):
"type": space
})

-    query = Articles.update(
+    Articles.update(
         metadata=target).where(
-        Articles.pmid == pmid)
-    query.execute()
+        Articles.pmid == pmid).execute()


def vote_number_of_subjects(pmid, subjects, username):
@@ -135,26 +146,24 @@ def vote_number_of_subjects(pmid, subjects, username):
"value": subjects
})

-    query = Articles.update(
+    Articles.update(
         metadata=target).where(
-        Articles.pmid == pmid)
-    query.execute()
+        Articles.pmid == pmid).execute()


-def add_user_tag(user_tag, id):
+def add_user_tag(user_tag, pmid):
""" Add a custom user tag to the database. """

main_target = next(
Articles.select(
Articles.metadata).where(
-            Articles.pmid == id).execute())
+            Articles.pmid == pmid).execute())
target = eval(main_target.metadata)
if target.get("user"):
target["user"].append(user_tag)
else:
target["user"] = [user_tag]
-    query = Articles.update(metadata=target).where(Articles.pmid == id)
-    query.execute()
+    Articles.update(metadata=target).where(Articles.pmid == pmid).execute()


def get_number_of_articles():
@@ -170,45 +179,48 @@ def add_pmid_article_to_database(article_id):
Given a PMID, use external APIs to get the necessary article data
in order to add the article to our database.
"""

-    pmid = str(article_id)
-    handle = efetch("pubmed", id=[pmid], rettype="medline", retmode="text")
-    records = list(Medline.parse(handle))
-    records = records[0]
-    article_info = {}
-    article_info["title"] = records.get("TI")
-    article_info["PMID"] = pmid
-    article_info["authors"] = ', '.join(records.get("AU"))
-    article_info["abstract"] = records.get("AB")
-    article_info["DOI"] = getDOI(records.get("AID"))
-    article_info["experiments"] = ""
-    article["metadata"] = str({"meshHeadings": []})
-    article["reference"] = None
-    identity = ""
-    try:
-        article_info["experiments"] = {
-            "locations": eval(
-                urllib.request.urlopen(
-                    "http://neurosynth.org/api/studies/peaks/" +
-                    str(pmid) +
-                    "/").read().decode())["data"]}
-        k = article_info["experiments"]["locations"]
-        for i in range(len(k)):
-            if len(k[i]) == 4:
-                identity = k[0]
-                k[i] = k[i][1:]
-            k[i] = ",".join([str(x) for x in (k[i])])
-    except BaseException:
-        pass
-    article_info["id"] = identity
-    article_info["experiments"] = [article_info["experiments"]]
-    Articles.create(abstract=article_info["abstract"],
-                    authors=article_info["authors"],
-                    doi=article_info["DOI"],
-                    experiments=article_info["experiments"],
-                    pmid=article_info["PMID"],
-                    title=article_info["title"])
-    return article_info
+    if len(list(get_article_object(article_id))) == 0:
+        pmid = str(article_id)
+        handle = efetch("pubmed", id=[pmid], rettype="medline", retmode="text")
+        records = list(Medline.parse(handle))
+        records = records[0]
+        if "TI" not in records:
+            return False  # catch bad PMIDs
+        article_info = {}
+        article_info["title"] = records["TI"]
+        article_info["PMID"] = pmid
+        article_info["authors"] = ', '.join(records["AU"])
+        article_info["abstract"] = records["AB"]
+        article_info["DOI"] = getDOI(records["AID"])
+        article_info["experiments"] = ""
+        article_info["metadata"] = str({"meshHeadings": []})
+        article_info["reference"] = None
+        identity = ""
+        try:
+            article_info["experiments"] = {
+                "locations": eval(
+                    urllib.request.urlopen(
+                        "http://neurosynth.org/api/studies/peaks/" +
+                        str(pmid) +
+                        "/").read().decode())["data"]}
+            k = article_info["experiments"]["locations"]
+            for i in range(len(k)):
+                if len(k[i]) == 4:
+                    identity = k[0]
+                    k[i] = k[i][1:]
+                k[i] = ",".join([str(x) for x in (k[i])])
+        except BaseException:
+            pass
+        article_info["id"] = identity
+        article_info["experiments"] = [article_info["experiments"]]
+        Articles.insert(abstract=article_info["abstract"],
+                        authors=article_info["authors"],
+                        doi=article_info["DOI"],
+                        experiments=article_info["experiments"],
+                        pmid=article_info["PMID"],
+                        title=article_info["title"]).execute()
+        return True
+    return False


def getDOI(lst):
@@ -362,7 +374,7 @@ def add_coordinate_row(pmid, exp, coords, row_number=-1):
else:
elem["locations"].insert(row_number, row_list)
Articles.update(
-        experiments=experiments).where(
+        experiments=json_encode(experiments)).where(
Articles.pmid == pmid).execute()


@@ -398,9 +410,7 @@ def add_table_through_text_box(pmid, values):
def update_table_vote(tag_name, direction, table_num, pmid, column, username):
""" Update the vote on an experiment tag for a given user. """

-    article_obj = Articles.select(
-        Articles.experiments).where(
-        Articles.pmid == pmid).execute()
+    article_obj = get_article_object(pmid)
article_obj = next(article_obj)
article_obj = eval(article_obj.experiments)

@@ -418,7 +428,6 @@ def update_table_vote(tag_name, direction, table_num, pmid, column, username):

article_obj[table_num] = table_obj

-    query = Articles.update(
+    Articles.update(
         experiments=article_obj).where(
-        Articles.pmid == pmid)
-    query.execute()
+        Articles.pmid == pmid).execute()
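A side note on the helpers in this file: several of them parse database-stored metadata strings with `eval` (e.g. `eval(main_target.metadata)`). When the stored string is a plain Python literal, the stdlib `ast.literal_eval` is a safer alternative, since it refuses anything that is not literal syntax. A minimal sketch, with an illustrative metadata string rather than real data:

```python
import ast

# Illustrative metadata string in the same shape these helpers store:
# a dict literal serialized with str(). Not real database content.
stored = str({"meshHeadings": [], "user": ["custom-tag"]})

# ast.literal_eval parses literals only; arbitrary expressions raise ValueError.
metadata = ast.literal_eval(stored)
metadata.setdefault("user", []).append("another-tag")
print(metadata["user"])  # → ['custom-tag', 'another-tag']
```

Unlike `eval`, this cannot execute function calls or attribute access embedded in a tampered metadata string.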
3 changes: 2 additions & 1 deletion json_api/base_handler.py
@@ -93,7 +93,8 @@ def process(self, response, args):
the "asynchronous" boolean, decorating your "process" function with
@tornado.gen.coroutine, and calling self.finish_async when your
function finishes execution (MANDATORY). Then, any blocking code
-    should be decorated with @run_on_executor.
+    should be decorated with @run_on_executor. (Make sure that you import
+    run_on_executor with `from tornado.concurrent import run_on_executor`.)

asynchronous :: True | False

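The docstring change above documents Tornado's `run_on_executor` pattern: blocking work is handed to a thread pool so the IO loop stays responsive while the handler finishes asynchronously. The same idea can be illustrated with only the standard library's `concurrent.futures`; this is an analogue of the pattern, not the handler code from this repository, and `blocking_significance_test` is a hypothetical stand-in.

```python
import time
from concurrent.futures import ThreadPoolExecutor

executor = ThreadPoolExecutor(max_workers=2)

def blocking_significance_test(coordinates):
    """Hypothetical stand-in for a slow, blocking computation."""
    time.sleep(0.01)  # simulate blocking work
    return len(coordinates)

# Submit the blocking call to the pool; the caller is free to do other
# work until .result() is needed. Tornado's @run_on_executor wraps this
# same submit-and-await idea and resolves the future on the IO loop.
future = executor.submit(blocking_significance_test, [(0, 0, 0), (1, 2, 3)])
print(future.result())  # → 2
```

The key design point is that the slow function itself stays synchronous; only its invocation is moved off the event loop.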
5 changes: 3 additions & 2 deletions json_api/github_collections.py
@@ -6,6 +6,7 @@

import hashlib
import os
+import re
from base64 import b64encode

import tornado
@@ -16,8 +17,8 @@
from tornado.httputil import url_concat
from torngithub import json_decode, json_encode

+from article_helpers import get_article_object
from base_handler import *
-from search_helpers import *
from user_account_helpers import *

# BEGIN: read environment variables
@@ -132,7 +133,7 @@ def get_user_repos(http_client, access_token):
access_token=access_token) for i in range(2, max_pages + 1)]

for repo in repos_list:
-        data.extend(res.body)
+        data.extend(repo.body)

raise tornado.gen.Return(data)
