NeuroBayes in and outside the Ivory Tower of High Energy Physics. Particle Physics Seminar, University Bonn, January 13, 2011

1 NeuroBayes in and outside the Ivory Tower of High Energy Physics. Particle Physics Seminar, University Bonn, January 13, 2011. Michael Feindt, IEKP, KCETA, Karlsruhe Institute of Technology (KIT); Scientific Advisor, Phi-T GmbH. KIT: University of the State of Baden-Wuerttemberg and National Research Center of the Helmholtz Association.

2 Agenda of this talk:
What is NeuroBayes? Where does it come from?
How to use NeuroBayes classification
Robustness, speed, generalisation ability, ease of use
NeuroBayes output is a Bayesian posterior probability
How to find data-Monte Carlo disagreements with NeuroBayes
How to train NeuroBayes from data when no good MC model is available
Examples of successful NeuroBayes applications in physics
Full B reconstruction at B factories
Examples from industry applications

3 History of NeuroBayes
M.F. & co-workers: experience with NN in DELPHI, development of many packages: ELEPHANT, MAMMOTH, BSAURUS etc.
Invention of the NeuroBayes algorithm.
1997-now: extensive use of NeuroBayes in CDF II.
NeuroBayes specialisation for economy at the University of Karlsruhe, supported by BMBF.
2002: Phi-T GmbH founded, industrial projects, further developments.
2008: foundation of sub-company Phi-T products & services, second office in Hamburg.
2008-now: extensive use of NeuroBayes in Belle.
2010: LHCb decides to use NeuroBayes massively to optimise reconstruction code.
Phi-T owns exclusive rights for NeuroBayes. Staff (currently 40): almost all physicists (mainly from HEP). Continuous further development of NeuroBayes.

4 Successful in competition with other data-mining methods
World's largest student competition: Data-Mining-Cup
2005: fraud detection in internet trading
2006: price prediction in ebay auctions
2007: coupon redemption prediction
2008: lottery customer behaviour prediction

5 Since 2009 new rules: only up to 2 teams per university.
Task: prognosis of the turnover of 8 books in 2500 book stores (2009). Winner: Uni Karlsruhe II, with help of NeuroBayes.
Task: optimisation of individual customer care measures in an online shop. Winner: Uni Karlsruhe II, with help of NeuroBayes.

6 NeuroBayes task 1: Classifications
Classification: binary targets: each single outcome will be yes or no. The NeuroBayes output is the Bayesian posterior probability that the answer is yes (given that the inclusive rates are the same in the training and test samples; otherwise a simple transformation is necessary).
Examples:
> This elementary particle is a K meson.
> This jet is a b-jet.
> This three-particle combination is a D+.
> This event is real data and not Monte Carlo.
> This neutral B meson was a particle and not an antiparticle at production time.
> Customer Meier will cancel his contract next year.

7 NeuroBayes task 2: Conditional probability densities
Probability density for real-valued targets: for each possible (real) value a probability (density) is given. From that, all statistical quantities like mean value, median, mode, standard deviation, etc. can be deduced, as well as deviations from a normal distribution, e.g. a crash probability. (Plot annotations: expectation value, standard deviation / volatility, mode.)
Examples:
> Energy of an elementary particle (e.g. a semileptonically decaying B meson with missing neutrino)
> Q value (invariant mass) of a decay
> Lifetime of a decay
> Phi direction of an inclusively reconstructed B meson in a jet
> Turnover of an article next year (very important in industrial applications)
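
To illustrate how such quantities follow from a predicted density, here is a minimal sketch (illustrative only, not NeuroBayes code) that assumes the density is delivered as bin centres t[i] with bin contents f[i]:

#include <cmath>
#include <cstddef>
#include <vector>

// Summary statistics of a binned probability density f(t): expectation value,
// standard deviation (volatility) and mode, as listed above.
// Assumes non-empty, equally spaced bins.
struct DensityStats { double mean, sigma, mode; };

DensityStats summarize(const std::vector<double>& t, const std::vector<double>& f) {
  double norm = 0.0, mean = 0.0, var = 0.0, mode = t[0], fmax = f[0];
  for (std::size_t i = 0; i < t.size(); ++i) {
    norm += f[i];
    mean += f[i] * t[i];
    if (f[i] > fmax) { fmax = f[i]; mode = t[i]; }   // mode = position of the maximum
  }
  mean /= norm;
  for (std::size_t i = 0; i < t.size(); ++i) var += f[i] * (t[i] - mean) * (t[i] - mean);
  var /= norm;
  return { mean, std::sqrt(var), mode };
}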

8 One way to construct a one-dimensional test statistic from multidimensional input (an MVA method): neural networks. Self-learning procedures, copied from nature. (Figure: human brain with labelled regions: frontal lobe, motor cortex, parietal cortex, temporal lobe, brain stem, occipital lobe, cerebellum.)

9 Neural networks
The NeuroBayes classification core is based on a simple feed-forward neural network. The information (the knowledge, the expertise) is coded in the connections between the neurons. Each neuron performs fuzzy decisions. A neural network can learn from examples.
Human brain: about 100 billion (10^11) neurons, about 100 trillion (10^14) connections.
NeuroBayes: 10 to a few 100 neurons.

10 Neural Network basic functions

11 Neural network transfer functions

12 NeuroBayes classification (schematic flow): Input -> Preprocessing -> NeuroBayes Teacher (learning of complex relationships from existing data bases, e.g. Monte Carlo), with significance control -> NeuroBayes Expert (prognosis for unknown data) -> Postprocessing -> Output.

13 How it works: training and application. The teacher learns from historic or simulated data sets (a = ..., b = ..., c = ..., target t known). The resulting expert system is then applied to actual (new, real) data sets (a = ..., b = ..., c = ..., t = ?) and delivers the probability that the hypothesis is correct (classification) or a probability density for the variable t.

14 Neural network training
Backpropagation (Rumelhart et al. 1986): calculate the gradient backwards by applying the chain rule, then optimise using the gradient descent method. Step size??
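
To make the chain-rule and step-size point concrete, a minimal sketch (not the NeuroBayes implementation) of one gradient-descent update for a single sigmoid neuron with quadratic loss; the learning rate eta plays the role of the step size questioned above:

#include <cmath>
#include <cstddef>
#include <vector>

// One backpropagation/gradient-descent step for a single sigmoid neuron.
// Chain rule: dE/dw_i = (y - target) * y*(1-y) * x_i for E = 0.5*(y - target)^2.
void gradientStep(std::vector<double>& w, const std::vector<double>& x,
                  double target, double eta /* step size */) {
  double a = 0.0;
  for (std::size_t i = 0; i < x.size(); ++i) a += w[i] * x[i];            // weighted sum
  const double y     = 1.0 / (1.0 + std::exp(-a));                        // sigmoid output
  const double delta = (y - target) * y * (1.0 - y);                      // gradient propagated backwards
  for (std::size_t i = 0; i < x.size(); ++i) w[i] -= eta * delta * x[i];  // descend along -gradient
}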

15 Neural network training
Difficulty: find the global minimum of a highly non-linear function in a high-dimensional (~ >100 dimensions) space. Imagine the task of finding the deepest valley in the Alps (just 2 dimensions): it is easy to find the next local minimum, but globally... practically impossible! This needs good preconditioning.

16 NeuroBayes strengths:
NeuroBayes is a very powerful algorithm:
excellent generalisability (does not overtrain)
robust: always finds a good solution, even with erratic input data
fast
automatically selects significant variables
output interpretable as Bayesian a posteriori probability
can train with weights and background subtraction
NeuroBayes is easy to use:
examples and documentation available
good default values for all options: fast start!
direct interface to TMVA available
introduction into ROOT planned

17 <phi-t> NeuroBayes
> is based on 2nd generation neural network algorithms, Bayesian regularisation, optimised preprocessing with non-linear transformations and decorrelation of input variables, and linear correlation to the output.
> learns extremely fast due to 2nd order BFGS methods, and even faster in 0-iteration mode.
> produces small expertise files.
> is extremely robust against outliers in the input data.
> is immune against learning statistical noise by heart.
> tells you if there is nothing relevant to be learned.
> delivers sensible prognoses already with small statistics.
> can handle weighted events, even negative weights.
> has advanced boost and cross-validation features.
> is steadily being developed further professionally.

18 Bayes' Theorem
P(T|D) = P(D|T) * P(T) / P(D), i.e. Posterior = Likelihood x Prior / Evidence. P(T|D) is proportional to P(D|T), but they are not the same.
NeuroBayes internally uses Bayesian arguments for regularisation.
NeuroBayes automatically makes Bayesian posterior statements.

19

20 Teacher code fragment (1)

#include "NeuroBayesTeacher.hh"

// create NeuroBayes teacher instance
NeuroBayesTeacher* nb = NeuroBayesTeacher::Instance();

const int nvar = 14;        // number of input variables
nb->NB_DEF_NODE1(nvar+1);   // nodes in input layer
nb->NB_DEF_NODE2(nvar);     // nodes in hidden layer
nb->NB_DEF_NODE3(1);        // nodes in output layer
nb->NB_DEF_TASK("CLA");     // binomial classification
nb->NB_DEF_ITER(10);        // number of training iterations

nb->SetOutputFile("BsDsPiKSK_expert.nb");    // expertise file name
nb->SetRootFile("BsDsPiKSK_expert.root");    // histogram file name

21 Teacher code fragment (2)

// in the training event loop
nb->SetWeight(1.0);   // set weight of the event
nb->SetTarget(0.0);   // this event is BACKGROUND, else set to 1.0

InputArray[0] = GetValue(back,"BsPi.Pt");              // define input variables
InputArray[1] = TMath::Abs(GetValue(back,"Bs.D0"));
...
nb->SetNextInput(nvar, InputArray);
// end of event loop

nb->TrainNet();       // perform training

Many options exist, but this simple code usually already gives very good results.

22 Expert code fragment

#include "Expert.hh"
...
Expert* nb = new Expert("../train/BsDsPiKSK_expert.nb", -2);
...
InputArray[0] = GetValue(signal,"BsPi.Pt");
InputArray[1] = TMath::Abs(GetValue(signal,"Bs.D0"));
...
Netout = nb->nb_expert(InputArray);
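
For orientation, an illustrative helper (not part of the NeuroBayes API) that maps the expert output in [-1, 1] onto the Bayesian posterior signal probability quoted on slide 31:

// Map the expert output netout (in [-1, 1]) to the posterior signal probability,
// valid when the training prior equals the application prior (see slide 31).
double posteriorProbability(double netout) {
  return 0.5 * (netout + 1.0);
}
// usage with the fragment above:  double prob = posteriorProbability(Netout);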

23 Input variables ordered by relevance (standard deviations of additional information)

24

25 NeuroBayes training output (analysis file)
NeuroBayes output distribution: red = signal, black = background.
Signal purity S/(S+B) in bins of the NeuroBayes output. If it lies on the diagonal, then P = (NBout+1)/2 is the probability that the event actually is signal. This shows that NeuroBayes is always well calibrated in the training.

26 NeuroBayes training output (analysis file)
Purity vs. signal efficiency for different NeuroBayes output cuts. The curve should reach as far into the upper right corner as possible; the lower curve comes from cutting the wrong way round.
Signal efficiency vs. total efficiency when cutting at different NeuroBayes outputs (lift chart). The area between the blue curve and the diagonal should be large. Physical region: white. Right diagonal: events randomly sorted, no individualisation. Left diagonal border: completely correctly sorted, first all signal events, then all background. Gini index: classification quality measure; the larger, the better.

27 NeuroBayes training output (analysis file)
Correlation matrix of the input variables. First row/column: training target.

28 NeuroBayes training output (analysis file)
Most important input variable, significance: 78 standard deviations. Accepted for the training.
Probability-integral-transformed input variable distribution: signal, background (this is a binary variable!).
Signal purity as a function of the input variable (in this case: unordered classes).
Mean 0, width 1 transformation of the signal purity of the transformed input variable.
Purity-efficiency plot of this variable alone compared to that of the complete NeuroBayes network.

29 NeuroBayes training output (analysis file)
Second most important input variable: alone 67 standard deviations, but only 11 sigma added after the most important variable is taken into account.
Probability-integral-transformed input variable distribution: signal, background (this is a largely continuous variable!).
Signal purity as a function of the input variable (in this case: spline fit).
Mean 0, width 1 transformation of the (fitted) signal purity of the input variable.
Purity-efficiency plot of this variable alone compared to that of the complete NeuroBayes network.

30 NeuroBayes training output (analysis file)
39th most important input variable: alone 17 standard deviations, but only 0.6 sigma added after the more significant variables. Ignored for the training.
Probability-integral-transformed input variable distribution: signal, background. For 3339 events this input was not available (delta function).
Signal purity as a function of the input variable (in this case: spline fit + delta).
Mean 0, width 1 transformation of the (fitted) signal purity of the input variable. Due to the preprocessing (flag 94) the delta is mapped to 0, not to its purity.

31 The NeuroBayes output is a linear measure of the Bayesian posterior signal probability: P_T(S) = (NB + 1)/2, valid for the signal-to-background ratio of the training set, r_T = S_T/B_T. If the training was performed with a different S/B than is actually present in the expert data set (r_E = S_E/B_E), one can transform the signal probability: P_E(S) = P_T(S) * r_E / ( P_T(S) * r_E + (1 - P_T(S)) * r_T ).
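
A minimal sketch of this transformation (illustrative helper, the function name is not part of the NeuroBayes API):

// pT: posterior P_T(S) from a training with prior ratio rT = S_T/B_T;
// returns P_E(S) for an expert sample with prior ratio rE = S_E/B_E.
double transformPosterior(double pT, double rT, double rE) {
  return pT * rE / (pT * rE + (1.0 - pT) * rT);
}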

32 Hunting data-MC disagreements with NeuroBayes
1. Use data as signal, Monte Carlo as background.
2. Train a NeuroBayes classification network. If the MC model describes the data well, nothing should be learned!
3. Look at the most significant variables of this training. These give a hint where the MC is not good. It could e.g. be the pt spectrum or an invariant mass spectrum (width).
4. Decide whether the effects are due to physics modelling or detector resolution/efficiency.
5. Reweight the MC by w = (1+NBout)/(1-NBout), or produce a more realistic MC, and go to 1.
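
Step 5 can be sketched as follows (illustrative only, assuming NBout is the per-event output of the data-vs-MC network obtained with the expert fragment of slide 22):

// Per-event Monte Carlo weight w = (1 + NBout) / (1 - NBout): large where the
// network finds the MC unlike the data, close to 1 where data and MC agree.
double mcWeight(double nbout) {
  return (1.0 + nbout) / (1.0 - nbout);
}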

33 Scenario: MC for the signal is available, but not for the backgrounds.
Idea: take the background from sidebands in the data.
Check that the network cannot learn the mass (by training left sideband vs. right sideband: remove input variables until this net cannot learn anything any more).
Works well if the data-MC agreement is quite good.

34 Scenario: Neither a reliable signal nor a background Monte Carlo is available.
Idea: training with background subtraction.
Signal class: peak region with weight 1, sideband region with weight -1 (statistical subtraction). Background class: sideband region with weight 1.
Works very well! Also for Y(2S) and Y(3S), although trained just on Y(1S).
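
A sketch of how the weights enter the training (illustrative only; feedEvent is a hypothetical wrapper around the SetTarget / SetWeight / SetNextInput calls of the teacher fragments):

// Background-subtraction training: peak-region events enter as signal with
// weight +1; sideband events enter twice, as signal with weight -1
// (statistical subtraction) and as background with weight +1.
void fillBackgroundSubtraction(bool inPeak, bool inSideband,
                               void (*feedEvent)(double target, double weight)) {
  if (inPeak)     feedEvent(1.0, +1.0);   // signal class, weight +1
  if (inSideband) {
    feedEvent(1.0, -1.0);                 // signal class, weight -1
    feedEvent(0.0, +1.0);                 // background class, weight +1
  }
}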

35 Example for a data-only training (on the first resonance)

36 NeuroBayes Bs -> J/ψ Φ selection without MC (two-stage background-subtraction training): a soft preselection on all data is the input to the first NeuroBayes training; a soft cut on net 1 defines the input to the second NeuroBayes training; the final selection is a cut on net 2.

37 Exploiting the S/B information more efficiently: the sPlot method.
Fit signal and background in the data in one distribution (e.g. the mass).
Compute sPlot weights w_S for signal (may be <0 or >1) as a function of the mass from the fit.
Train the NeuroBayes network with each event treated both as signal with weight w_S and as background with weight 1-w_S.
A soft cut on the output enriches S/B considerably. Make sure the network cannot learn the mass! (Paper in preparation.)
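
In the same spirit, a sketch of the sPlot-weighted training (illustrative only, same hypothetical feedEvent wrapper as above):

// Every event enters twice: as signal with its sPlot weight wS (may be < 0 or > 1)
// and as background with the complementary weight 1 - wS.
void fillSPlotTraining(double wS, void (*feedEvent)(double target, double weight)) {
  feedEvent(1.0, wS);
  feedEvent(0.0, 1.0 - wS);
}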

38 More than 60 diploma and Ph.D. theses and many publications from the experiments DELPHI, CDF II, AMS, CMS and Belle have used NeuroBayes or its predecessors very successfully. ATLAS and LHCb applications are also starting. Many of these can be found online. Talks about NeuroBayes and applications: www-ekp.physik.uni-karlsruhe.de/~feindt -> Forschung

39 Some NeuroBayes highlights: Bs oscillations, discovery of excited Bs states, X(3872) properties, single top quark production discovery, high-mass Higgs exclusion.

40 Just a few examples: NeuroBayes soft electron identification for CDF II. Thesis U. Kerzel, on the basis of the Soft Electron Collection: much more efficient than a cut selection or JetNet with the same inputs; only after clever preprocessing by hand and a careful choice of learning parameters could these also be made as good as NeuroBayes.

41 Just a few examples: NeuroBayes selection

42 Just a few examples: first observation of the B_s1 and the most precise measurement of the B_s2*. Selection using NeuroBayes.

43 The Belle B factory ran very successfully. KIT joined the Belle Collaboration in 2008 and introduced NeuroBayes: continuum subtraction, flavour tagging, particle ID, S/B selection optimisation, full B reconstruction. NeuroBayes enhances the efficiency of the flavour-tagging calibration reaction B -> D* l ν by 71% at the same purity.

44 Physics at a B factory (asymmetric e+e- collider at the Y(4S)): the Y(4S) decays into 2 B mesons, almost at rest in the CMS. The decay products of the 2 Bs are not easily distinguishable. There are many 1000 exclusive decay chains per B. Reconstruct as many Bs as possible exclusively (tag side); then all other reconstructed particles belong to the other B (signal side), and the kinematics of the signal side is uniquely determined, which allows missing-mass reconstruction.

45 Hierarchical Full Reconstruction

46 Example: D0 signals of very different purity, with/without NB cut

47 Optimal combination of decay channels of very different purity using the NeuroBayes outputs. Precuts are chosen such that the number of additional background events per additional signal event is constant.
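
One way to read this precut criterion (an illustrative sketch, not the actual NeuroBayes Factory code): per channel, loosen the network cut only as long as the marginal cost, i.e. the number of additional background events per additional signal event, stays below a common constant lambda.

#include <cstddef>
#include <vector>

// cuts: descending grid of network-output cut values for one channel;
// S, B: cumulative signal and background yields above each cut.
// Returns the loosest cut whose marginal cost dB/dS is still below lambda.
double chooseChannelPrecut(const std::vector<double>& cuts,
                           const std::vector<double>& S,
                           const std::vector<double>& B,
                           double lambda) {
  double best = cuts.front();
  for (std::size_t i = 1; i < cuts.size(); ++i) {
    const double dS = S[i] - S[i-1];
    const double dB = B[i] - B[i-1];
    if (dS <= 0.0 || dB / dS > lambda) break;   // too much background per signal: stop loosening
    best = cuts[i];
  }
  return best;
}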

48 Full reconstruction of B mesons in 1042 decay chains. Hierarchical probabilistic reconstruction system with 71 NeuroBayes networks, fully automatic (NeuroBayes Factory). B+ efficiency increased by ~104% at the same (total) purity compared to the classical algorithm (corresponds to many years of additional data taking).

49 Alternatively one can make the sample cleaner, e.g. to the same background level: B+ efficiency increased by +88% at the same background level. (Real data plots, about 8% of the full data set; curves: signal and background using NeuroBayes vs. the classical algorithm.)

50 Alternatively one can make the sample much cleaner, e.g. at the same signal efficiency as the classical algorithm: B+ background suppression by a factor of 17! (Plot: background for the classical algorithm vs. background and signal using NeuroBayes.)

51 First application test on real data: select B0 -> D*+ l ν on the signal side, a fully reconstructed B on the tag side. Calculate the missing mass squared on the signal side. A peak at 0 from the missing neutrino is expected and seen. The efficiency is more than doubled with the new algorithm!

52 Flexibility: working with NeuroBayes allows a continuous choice of the working point in the purity-efficiency plane. NIM paper in preparation.

53 Customers & projects. Very successful projects for, among others: BGV and VKB car insurances, Lupus Alpha Asset Management, Otto Versand (mail order business), Thyssen Krupp (steel industry), AXA and Central health insurances, dm drogerie markt (drugstore chain), Libri (book wholesale)... expanding.

54 Individual risk prognoses for car insurances: accident probability, cost probability distribution, large-damage prognosis, contract cancellation probability. Very successful in practice.

55 Correlation among input variables, target colour coded (Ramler II plot)

56 Contract cancellations in a large financial institute: real cancellation rate as a function of the cancellation rate predicted by NeuroBayes. Very good performance within statistical errors.

57

58 Near future: turnover predictions for chain stores. 1. Time series modelling. 2. Correction and error estimate using NeuroBayes.

59 Turnover prognosis for mail order business

60 Typical test results (always very successful). Colour codes: NeuroBayes better / same / worse than classical methods. (Table: training seasons vs. test seasons.)

61

62 Prognosis of individual health costs. Pilot project for a large private health insurance: prognosis of the costs in the following year for each person insured, with confidence intervals. 4 years of training, test on the following year. Results: a probability density for each customer/tariff combination (example: customer N, male, 44, tariff XYZ123, insured for about 17 years). Very good test results! Has potential for a real and objective cost reduction in health management.

63 Prognosis of financial markets (VDI-Nachrichten, Börsenzeitung). NeuroBayes-based risk-averse, market-neutral funds for institutional investors; fully automatic trading (... Mio, since ... Mio). Lupus Alpha NeuroBayes Short Term Trading Fonds.

64 Licenses. NeuroBayes is commercial software; all rights belong to Phi-T GmbH. It is not open source. CERN, Fermilab and KEK have licenses for use in high energy physics research. The Expert runs without a license (and can run on the grid!); a license is only needed for training networks. For purchasing additional Teacher licenses (for computers outside CERN) please contact Phi-T. Bindings to many programming languages exist. A code generator for easy usage exists.

65 Prognosis of sports events from historical data (NeuroNetz...). Results: probabilities for home win - draw - away win.

66 Documentation
Basics:
M. Feindt, A Neural Bayesian Estimator for Conditional Probability Densities, e-print archive, physics.
M. Feindt, U. Kerzel, The NeuroBayes Neural Network Package, NIM A 559 (2006) 190.
Web sites:
Company web site (German & English).
www-ekp.physik.uni-karlsruhe.de/~feindt (some NeuroBayes talks can be found here under -> Forschung).
English site on physics results with NeuroBayes, all diploma and PhD theses using NeuroBayes, and a discussion forum and FAQ for usage in physics. Please use this and also post your results here!

67 The <phi-t> mouse game, or: even your "free will" is predictable.
