Search results for: vector division
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 1556

536 Multivariate Control Chart to Determine Efficiency Measurements in Industrial Processes

Authors: J. J. Vargas, N. Prieto, L. A. Toro

Abstract:

Control charts are commonly used to monitor processes involving either variable or attribute quality characteristics, and determining the control limits is a critical task for quality engineers seeking to improve those processes. Nonetheless, in some applications it is necessary to include an estimation of efficiency. In this paper, the ability to assess the efficiency of an industrial process was added to a control chart by incorporating a data envelopment analysis (DEA) approach. Specifically, a Bayesian estimation was performed to calculate the posterior probability distribution of the parameters, namely the means and the variance-covariance matrix. This technique makes it possible to analyse the data set without relying on the hypothetically large sample implied in the problem, treating it instead as an approximation to the finite-sample distribution. A rejection simulation method was carried out to generate random variables from the parameter functions. Each resulting vector was used by the stochastic DEA model over several cycles to establish the distribution of the efficiency measure for each DMU (decision-making unit). A control limit was calculated with the resulting model; if a DMU presents a low level of efficiency, the system efficiency is declared out of control. The efficiency calculation reached a global optimum, which ensures model reliability.
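
As a rough illustration of the rejection simulation step described above, the sketch below draws samples by accepting proposals under an envelope of the target density. It uses an illustrative triangular target density and a uniform proposal, not the paper's actual posterior:

```python
import random

def rejection_sample(target_pdf, proposal_sample, proposal_pdf, m, n,
                     rng=random.Random(42)):
    """Draw n samples from target_pdf by accept/reject against m * proposal_pdf."""
    out = []
    while len(out) < n:
        x = proposal_sample(rng)
        u = rng.random()
        # Accept x with probability target_pdf(x) / (m * proposal_pdf(x)).
        if u * m * proposal_pdf(x) <= target_pdf(x):
            out.append(x)
    return out

# Example: sample from the triangular density f(x) = 2x on [0, 1]
# using a uniform proposal g(x) = 1 and envelope constant m = 2.
samples = rejection_sample(
    target_pdf=lambda x: 2.0 * x,
    proposal_sample=lambda rng: rng.random(),
    proposal_pdf=lambda x: 1.0,
    m=2.0,
    n=5000,
)
mean = sum(samples) / len(samples)  # true mean of f is 2/3
```

In the paper's setting, the target density would be the posterior of the process parameters and each accepted vector would feed one cycle of the stochastic DEA model.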

Keywords: data envelopment analysis (DEA), multivariate control chart, rejection simulation method

Procedia PDF Downloads 370
535 Techno-Economic Analysis of Offshore Hybrid Energy Systems with Hydrogen Production

Authors: Anna Crivellari, Valerio Cozzani

Abstract:

Even though most of the electricity produced worldwide still comes from fossil fuels, new policies are being implemented to promote a more sustainable use of energy sources. Offshore renewable resources have become increasingly attractive thanks to the vast amount of power they can potentially supply. However, the intermittent nature of renewables often limits system capacity and creates mismatches between supply and demand. Hydrogen is foreseen as a promising vector to store and transport large amounts of excess renewable power using existing oil and gas infrastructure. In this work, an offshore hybrid energy system integrating wind energy conversion with hydrogen production was conceptually defined and applied to offshore gas platforms. A techno-economic analysis was performed considering two different locations for the installation of the innovative power system, i.e., the North Sea and the Adriatic Sea. The water depth, the distance of the platform from the onshore gas grid, the hydrogen selling price, and the green financial incentive were some of the main factors taken into account in the comparison. The results indicated that well-defined indicators make it possible to capture the specific cost and revenue features of the analyzed systems, as well as to evaluate their competitiveness in the current and future energy market.

Keywords: cost analysis, energy efficiency assessment, hydrogen production, offshore wind energy

Procedia PDF Downloads 119
534 Design of a Real Time Heart Sounds Recognition System

Authors: Omer Abdalla Ishag, Magdi Baker Amien

Abstract:

Physicians use the stethoscope to listen to a patient's heart sounds in order to make a diagnosis. However, determining heart conditions with an acoustic stethoscope is a difficult task that requires special training of medical staff. This study developed an accurate model for analyzing the phonocardiograph signal based on a PC and a DSP processor. The system was realized in two phases: offline and real time. In the offline phase, 30 heart sound recordings were collected from medical students and from the Doctor's World website. For the experimental (real-time) phase, an electronic stethoscope was designed and implemented, and signals were recorded from 30 volunteers, of whom 17 were normal cases and 13 had various pathologies; these 30 acquired signals were preprocessed with an adaptive filter to remove lung sounds. Background noise was removed from both the offline and real data using the wavelet transform, then graphical and statistical feature-vector elements were extracted, and finally a look-up table was used to classify the heart sound cases. The implemented system achieved accuracies of 90% and 80% and sensitivities of 87.5% and 82.4% for the offline and real data, respectively. The whole system was designed on a TMS320VC5509a DSP platform.

Keywords: code composer studio, heart sounds, phonocardiograph, wavelet transform

Procedia PDF Downloads 436
533 Effects of Food Habits on Road Accidents Due to Micro-Sleepiness and Analysis of Attitudes to Develop a Food Product as a Preventive Measure

Authors: Rumesh Liyanage, S. B. Nawaratne, K. K. D. S. Ranaweera, Indira Wickramasinghe, K. G. S. C. Katukurunda

Abstract:

This study attempted to identify the effect of food habits and the public's attitudes on micro-sleepiness, and to develop a food product as a preventive measure. Statistical data pertaining to road accidents were collected from the Sri Lanka Police Traffic Division, and a pre-tested questionnaire was used to collect data from 250 respondents. They were selected to represent drivers (especially highway drivers), private- and public-sector shift workers, and cramming students (university and school). Respondents completed the questionnaires independently and in person, and the collected data were analyzed statistically. Results revealed that 76.84%, 96.39%, and 80.93% of the total respondents consumed rice for all three meals, which leads to ingesting high-glycemic meals. Taking two hyperglycemic meals before 14.00h was identified as a cause of micro-sleepiness among these respondents. The peak level of road accidents was observed at 14.00-20.00h (38.2%), and the intensity of micro-sleepiness falls in the same period (37.36%): 14.00-16.00h was the peak time, 16.00-18.00h the least, and at 18.00-20.00h it reappears slightly. Even though the survey respondents indicated that the peak hours of micro-sleepiness are 14.00-16.00h, according to police reports the peak hours fall between 18.00h and 20.00h. Of the interviewees, 69.27% strongly wanted to avoid micro-sleepiness and were willing to spend LKR 10-20 on a commercial product to combat it. As age-old practices to suppress micro-sleepiness are time-consuming, modern-day respondents (51.64%) would like a quick solution through a drink. Therefore, the food habits of morning and noon may cause micro-sleepiness, while dinner may cause both natural and micro-sleepiness owing to the heavy glycemic load of the food. According to the study, micro-sleepiness can be categorized into three zones: a low-risk zone (08.00-10.00h and 18.00-20.00h), a manageable zone (10.00-12.00h), and a high-risk zone (14.00-16.00h).

Keywords: food habits, glycemic load, micro-sleepiness, road accidents

Procedia PDF Downloads 540
532 Constrains to Financial Engineering for Liquidity Management: A Multiple Case Study of Islamic Banks

Authors: Sadia Bibi, Karim Ullah

Abstract:

Islamic banks hold excess liquidity, which needs proper management to earn a high rate of return and remain competitive. However, they lack asset-backed avenues and rely on a few sukuks, which has led to liquidity management issues. Financial engineering comes forward to innovate and develop instruments for the requisite financial problem, yet it faces many challenges, explored here in the context of liquidity management in Islamic banks. A rigorous literature review shows that Shariah compliance, competition from conventional banks, a lack of sufficient instruments, derivatives still not being accepted as legitimate products, a less developed inter-bank market, and the absence of a lender of last resort are the six significant constraints on financial engineering for liquidity management in Islamic banks. To explore the problem further, a multiple case study strategy is used to extend and develop the theory under the philosophical stance of social constructivism. In-depth narrative interviews were conducted over the telephone with key personnel in the treasury departments of the selected banks. Data were segregated and displayed using NVivo 11 software, and a thematic analysis approach identified themes related to the constraints. The exploration of further constraints on financial engineering for liquidity management in Islamic banks achieves the research aim. The theory is further developed by the addition of three more constraints to the theoretical framework: i) a lack of skilled human resources, ii) a lack of unified vision, and iii) a lack of government support for Islamic banks. These findings are useful for the government, the regulatory authorities of the banking sector, the State Bank of Pakistan (central bank), and the product design and development divisions of Islamic banks in making the financial engineering process feasible and resolving the liquidity management issues of Islamic banks.

Keywords: financial engineering, liquidity management, Islamic banks, shariah compliance

Procedia PDF Downloads 70
531 The Impact of E-Commerce on the Physical Space of Traditional Retail System

Authors: Sumayya S.

Abstract:

Making cities adaptive and inclusive is one of the inherent goals and challenges for contemporary cities. This is a serious concern when urban transformations occur in varying magnitudes due to visible and invisible factors. One such visibly invisible factor is e-commerce, whose expanding operation is understood to change the conventional spatial structure both positively and negatively. With the continued growth in e-commerce activities and their future potential, market analysts, the media, and even retailers have questioned the future importance of traditional brick-and-mortar stores as a critical element of cities, with some even referring to the repeated announcements of store-chain closures as heralding the era of online shopping. Essentially, this raises the question of how adaptive and inclusive cities are to the dynamics of transformative changes that are often unseen. People have become more comfortable staying at home and using door-delivery systems, and this has changed the usage of public spaces, especially commercial corridors. This research helped in presenting a new approach to planning and designing commercial activity centers, and it critically presents the impact of e-commerce on the urban fabric, such as the division and fragmentation of space, showroom syndrome, and the reconceptualization of space. The changes are understood by analyzing the e-commerce logistics process. Based on the inferences, the study concludes that an integrated approach is needed in the planning and design of public spaces for sustainable omnichannel retailing. The study was carried out with the following objectives: to monitor the impact of e-commerce on traditional shopping space; to explore the new challenges and opportunities faced by the urban form; and to explore how adaptive and inclusive our cities are to the dynamics of the transformative changes caused by e-commerce.

Keywords: e-commerce, shopping streets, online environment, offline environment, shopping factors

Procedia PDF Downloads 79
530 Pricing European Options under Jump Diffusion Models with Fast L-stable Padé Scheme

Authors: Salah Alrabeei, Mohammad Yousuf

Abstract:

The goal of option pricing theory is to help investors manage their money, enhance returns, and control their financial future by theoretically valuing their options. Modeling option prices with Black–Scholes models with jumps ensures that market movements are taken into account. However, such models can only be solved numerically, and not every numerical method is efficient for them because the payoffs are non-smooth, with discontinuous derivatives at the exercise price. In this paper, the exponential time differencing (ETD) method is applied to solve the partial integro-differential equations arising in pricing European options under Merton's and Kou's jump-diffusion models. The fast Fourier transform (FFT) algorithm is used as a matrix-vector multiplication solver, which reduces the complexity from O(M²) to O(M log M). A partial-fraction form of the Padé schemes is used to overcome the complexity of inverting polynomials of matrices. Together, these two tools yield efficient and accurate numerical solutions. We construct a parallel, easy-to-implement version of the numerical scheme, and numerical experiments show how fast and accurate it is.
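
The FFT speed-up mentioned above rests on the fact that a circulant matrix is diagonalized by the discrete Fourier transform, so its matrix-vector product is a circular convolution computable in O(M log M). A minimal sketch, assuming NumPy is available and using a random circulant matrix rather than the actual jump-diffusion discretization:

```python
import numpy as np

def circulant_matvec_fft(c, v):
    """Multiply the circulant matrix with first column c by v in O(M log M),
    using the diagonalization C = F^{-1} diag(F c) F."""
    return np.real(np.fft.ifft(np.fft.fft(c) * np.fft.fft(v)))

# Check against the dense O(M^2) product for a small case.
M = 8
rng = np.random.default_rng(0)
c = rng.standard_normal(M)
v = rng.standard_normal(M)
# Dense circulant: C[i, j] = c[(i - j) mod M].
C = np.array([[c[(i - j) % M] for j in range(M)] for i in range(M)])
dense = C @ v
fast = circulant_matvec_fft(c, v)
```

In the pricing schemes above, the integral (jump) term yields a Toeplitz structure, which is commonly embedded in a circulant matrix of twice the size so that the same FFT trick applies.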

Keywords: integro-differential equations, L-stable methods, pricing European options, jump-diffusion model

Procedia PDF Downloads 143
529 The Biomechanical Analysis of Pelvic Osteotomies Applied for Developmental Dysplasia of the Hip Treatment in Pediatric Patients

Authors: Suvorov Vasyl, Filipchuk Viktor

Abstract:

Developmental dysplasia of the hip (DDH) is a frequent pathology in pediatric orthopedic practice. Neglected or residual cases of DDH in walking patients are usually treated with pelvic osteotomies. Plastic changes take place at hinge points as the acetabulum is reoriented during surgery. The classically described hinge points, and the traditional division of pelvic osteotomies into reshaping and reorientation types, are currently debated. The purpose of this article was to evaluate the biomechanical changes during the most commonly used pelvic osteotomies (Salter, Dega, Pemberton) for DDH treatment in pediatric patients. Methods: virtual pelvic models of 2- and 6-year-old patients were created, material properties were assigned, pelvic osteotomies were simulated, and biomechanical changes were evaluated using finite element analysis (FEA). Results: the patient's age has an impact on the density of the pelvic bones and cartilages (in younger patients the pelvic elements are more pliable, p<0.05). The stress distribution after each of the abovementioned pelvic osteotomies was assessed in the 2- and 6-year-old pelvic models, and the hinge points were evaluated. The new term "restriction point" was introduced, meaning a place where the correction of the acetabular deformity is restricted; the pelvic ligament attachment points were mainly these restriction points. Conclusions: there are no purely reshaping or purely reorientation pelvic osteotomies as previously believed; the pelvic ring acts as a unit in carrying the applied load. Biomechanical overload of the triradiate cartilage was revealed during Salter osteotomy in the 2-year-old patient and during Pemberton osteotomy in both the 2- and 6-year-old patients; overload of the posterior cortical layer in the greater sciatic notch was revealed in the 2-year-old patient during Dega osteotomy. Level of evidence: Level IV, prognostic.

Keywords: developmental dysplasia of the hip, pelvic osteotomy, finite element analysis, hinge point, biomechanics

Procedia PDF Downloads 90
528 Cloning and Expression of Human Interleukin 15: A Promising Candidate for Cytokine Immunotherapy

Authors: Sadaf Ilyas

Abstract:

Recombinant cytokines have been employed successfully as potential therapeutic agents. Some cytokine therapies are already part of clinical practice, ranging from early exploratory trials to well-established therapies that have received approval. Interleukin 15 is a pleiotropic cytokine with multiple roles in peripheral innate and adaptive immune cell function. It regulates the activation, proliferation, and maturation of NK cells, T cells, monocytes/macrophages, and granulocytes, and the interactions between them, thus acting as a bridge between innate and adaptive immune responses. Unraveling the biology of IL-15 has revealed some interesting surprises that may point toward some of the first therapeutic applications for this cytokine. In this study, the human interleukin 15 gene was isolated, amplified, and ligated into a TA vector, which was then transfected into a bacterial host, E. coli Top10F'. The sequence of the cloned gene was confirmed and showed 100% homology with the reported sequence. The confirmed gene was then subcloned into the pET expression system to study the IPTG-induced expression of the IL-15 gene. Positive expression was obtained for a number of clones that showed a 15 kDa band of IL-15 in SDS-PAGE analysis, indicating the successful development of a strain that can be studied further to assess the potential therapeutic intervention of this cytokine in human disease.

Keywords: Interleukin 15, pET expression system, immune therapy, protein purification

Procedia PDF Downloads 407
527 The Response of the Central Bank to the Exchange Rate Movement: A Dynamic Stochastic General Equilibrium-Vector Autoregressive Approach for Tunisian Economy

Authors: Abdelli Soulaima, Belhadj Besma

Abstract:

This paper examines the central bank's response to movements of the nominal exchange rate and evaluates its effects on the volatility of output growth and inflation. A novel hybrid method, the dynamic stochastic general equilibrium-vector autoregressive (DSGE-VAR) model, is proposed for analyzing this policy experiment in a small open economy, namely Tunisia. We contribute to the empirical literature by applying this model, rarely used in this context, to Tunisian data. Note additionally that the question of how strongly the central bank should respond to the exchange rate is particularly relevant for Tunisia. To improve the estimation, a Bayesian technique is applied to the sample 1980:Q1 to 2011:Q4. Our results reveal that the central bank should not react, or should react only softly, to the exchange rate. The variance decomposition shows that overall inflation volatility is more pronounced under the fixed exchange rate regime for most shocks, except for the productivity and interest rate shocks. Output volatility is also higher under this regime for the majority of shocks, with the exception of the foreign interest rate and interest rate shocks.

Keywords: DSGE-VAR modeling, exchange rate, monetary policy, Bayesian estimation

Procedia PDF Downloads 292
526 Electroencephalogram Based Approach for Mental Stress Detection during Gameplay with Level Prediction

Authors: Priyadarsini Samal, Rajesh Singla

Abstract:

Many mobile games provide entertainment while also introducing stress to the human brain. In recognizing this mental stress, the brain-computer interface (BCI) plays an important role, offering various neuroimaging approaches for analyzing brain signals. Electroencephalography (EEG) is the most commonly used method among them as it is non-invasive, portable, and economical. This paper investigates the patterns in brain signals induced by mental stress. Two healthy volunteers played a game whose aim was to find hidden words in a grid, with levels chosen randomly. The EEG signals during gameplay were recorded to investigate the impact of stress as the levels changed from easy to medium to hard. A total of 16 EEG features were analyzed for this experiment, including power-band features with relative powers and event-related desynchronization, along with statistical features. A support vector machine was used as the classifier, which resulted in an accuracy of 93.9% for three-level stress analysis; for two levels, accuracies of 92% and 98% were achieved. In addition, a similar game was played by the volunteers, and a suitable regression model was designed for prediction in which the feature sets of the first and second games were used for testing and training, respectively; an accuracy of 73% was found.
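
One of the power-band features mentioned above, relative band power, can be sketched from a signal's periodogram. The band limits and the synthetic two-tone signal below are illustrative, not the authors' exact pipeline:

```python
import numpy as np

def relative_band_powers(signal, fs, bands):
    """Relative power of each frequency band from the signal's periodogram."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2
    total = psd[1:].sum()  # drop the DC bin
    out = {}
    for name, (lo, hi) in bands.items():
        mask = (freqs >= lo) & (freqs < hi)
        out[name] = psd[mask].sum() / total
    return out

# Synthetic 2 s "EEG" trace dominated by a 10 Hz (alpha) oscillation,
# with a weak 25 Hz (beta) component.
fs = 256
t = np.arange(2 * fs) / fs
x = np.sin(2 * np.pi * 10 * t) + 0.1 * np.sin(2 * np.pi * 25 * t)
bands = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}
rel = relative_band_powers(x, fs, bands)
```

Such per-band relative powers, stacked across channels and bands, form part of the feature vector that a classifier like an SVM would consume.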

Keywords: brain computer interface, electroencephalogram, regression model, stress, word search

Procedia PDF Downloads 182
525 Language Development and Growing Spanning Trees in Children Semantic Network

Authors: Somayeh Sadat Hashemi Kamangar, Fatemeh Bakouie, Shahriar Gharibzadeh

Abstract:

In this study, we aim to exploit the maximum spanning trees (MSTs) of children's semantic networks to investigate their language development. To do so, we examine the graph-theoretic properties of word-embedding networks. The nodes of these networks are the words children normatively acquire before the age of 30 months, and the links are built from the cosine vector similarity of those words. The networks are weighted graphs, and the strength of each link is determined by the numerical similarity of the two words (nodes) at its ends. To avoid reducing the weighted networks to binary ones by setting a threshold, constructing MSTs offers a solution. An MST is a unique sub-graph that connects all the nodes such that the sum of the link weights is maximized without forming cycles. As the backbone of the semantic networks, MSTs are suitable for examining developmental changes in semantic network topology in children. From these trees, several parameters were calculated to characterize the developmental change in network organization. We show that MSTs provide an elegant method that is sensitive enough to capture subtle developmental changes in semantic network organization.
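
A maximum spanning tree can be built with Kruskal's algorithm by sorting the similarity-weighted links in descending order and adding each link that does not close a cycle. A minimal sketch with a toy four-word network (the similarity values are invented for illustration, not derived from child vocabulary data):

```python
def maximum_spanning_tree(nodes, edges):
    """Kruskal's algorithm on descending weights: add each edge that
    does not close a cycle, tracked with a union-find structure."""
    parent = {n: n for n in nodes}

    def find(n):
        while parent[n] != n:
            parent[n] = parent[parent[n]]  # path halving
            n = parent[n]
        return n

    mst = []
    for u, v, w in sorted(edges, key=lambda e: -e[2]):
        ru, rv = find(u), find(v)
        if ru != rv:          # u and v are in different components
            parent[ru] = rv   # merge the components
            mst.append((u, v, w))
    return mst

# Toy "semantic network": cosine similarities between early-acquired words.
words = ["dog", "cat", "ball", "milk"]
sims = [
    ("dog", "cat", 0.9), ("dog", "ball", 0.4), ("dog", "milk", 0.2),
    ("cat", "ball", 0.3), ("cat", "milk", 0.5), ("ball", "milk", 0.1),
]
tree = maximum_spanning_tree(words, sims)
total_weight = sum(w for _, _, w in tree)
```

The resulting tree keeps the strongest similarity backbone (here dog-cat, cat-milk, dog-ball) without any thresholding of the weighted graph.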

Keywords: maximum spanning trees, word-embedding, semantic networks, language development

Procedia PDF Downloads 137
524 Pure Scalar Equilibria for Normal-Form Games

Authors: Herbert W. Corley

Abstract:

A scalar equilibrium (SE) is an alternative type of equilibrium in pure strategies for an n-person normal-form game G. It is defined using optimization techniques: a pure strategy for each player of G is obtained by maximizing an appropriate utility function over the acceptable joint actions. The players' actions are thus determined by the choice of utility function, which could be agreed upon by the players or chosen by an arbitrator. An SE is an equilibrium since no player of G can increase the value of this utility function by changing strategies. SEs are formally defined, and examples are given. In a greedy SE, the goal is to assign actions giving the players the largest individual payoffs jointly possible. In a weighted SE, each player is assigned weights modeling the degree to which he helps every player, including himself, achieve as large a payoff as jointly possible. In a compromise SE, each player wants a fair payoff under a reasonable interpretation of fairness. In a parity SE, the players want their payoffs to be as nearly equal as jointly possible. Finally, a satisficing SE achieves a personal target payoff value for each player. The vector payoffs associated with each of these SEs are shown to be Pareto optimal among all acceptable payoff vectors, as well as computationally tractable.
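
The construction can be made concrete with a small sketch: for a finite normal-form game, a scalar equilibrium is the joint action maximizing a chosen scalar utility of the payoff vector. The utilitarian sum used below is one illustrative scalarization (a rough analogue of the greedy SE), not the paper's exact definitions:

```python
def scalar_equilibrium(payoffs, utility):
    """Pick the joint action maximizing a scalar utility of the payoff vector.

    payoffs: dict mapping each joint action (tuple of per-player actions)
             to a tuple of per-player payoffs.
    """
    return max(payoffs, key=lambda a: utility(payoffs[a]))

# A 2x2 bimatrix game with Prisoner's Dilemma payoffs
# (C = cooperate, D = defect; entries are (row payoff, column payoff)).
game = {
    ("C", "C"): (3, 3),
    ("C", "D"): (0, 5),
    ("D", "C"): (5, 0),
    ("D", "D"): (1, 1),
}
# Utilitarian scalarization: maximize the sum of payoffs.
best = scalar_equilibrium(game, utility=sum)
```

Under this utility, mutual cooperation is selected even though it is not a Nash equilibrium of the game, which illustrates how an SE depends on the agreed utility function rather than on unilateral best responses.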

Keywords: compromise equilibrium, greedy equilibrium, normal-form game, parity equilibrium, pure strategies, satisficing equilibrium, scalar equilibria, utility function, weighted equilibrium

Procedia PDF Downloads 109
523 Instructional Leadership, Information and Communications Technology Competencies and Performance of Basic Education Teachers

Authors: Jay Martin L. Dionaldo

Abstract:

This study aimed to develop a causal model of the performance of basic education teachers in the Division of Malaybalay City for the 2018-2019 school year. It used the responses of 300 randomly selected basic education teachers of Malaybalay City, Bukidnon, who answered three sets of questionnaires: one patterned after the National Education Association (2018) instrument on teachers' instructional leadership, the questionnaire of Caluza et al. (2017) on information and communications technology (ICT) competencies, and a questionnaire on teacher performance using the Individual Performance Commitment and Review Form (IPCRF) adopted by the Department of Education (DepEd). Descriptive statistics were used: means for description, correlation for relationships, regression for the extent of influence, and path analysis for the model that best fits teachers' performance. Results showed that basic education teachers perform at a very satisfactory level. The teachers also highly practice instructional leadership in terms of coaching and mentoring, facilitating collaborative relationships, and community awareness and engagement. They are proficient users of ICT in terms of technology operations and concepts, and basic users in terms of pedagogy. Furthermore, the instructional leadership dimensions (coaching and mentoring, facilitating collaborative relationships, and community awareness and engagement) and the ICT competencies (technology operations and concepts, and pedagogy) were significantly correlated with teachers' performance. Coaching and mentoring, community awareness and engagement, and technology operations and concepts were the best predictors of teachers' performance.
The model that best fits teachers' performance is anchored on coaching and mentoring, embedded with facilitating collaborative relationships, community awareness and engagement, technology operations and concepts, and pedagogy.

Keywords: information and communications technology, instructional leadership, coaching and mentoring, collaborative relationship

Procedia PDF Downloads 113
522 Determination of Authorship of the Works Created by the Artificial Intelligence

Authors: Vladimir Sharapaev

Abstract:

This paper addresses the question of the authorship of copyrighted works created solely by artificial intelligence, or with its use, and proposes possible interpretational or legislative solutions to the problems arising from the plurality of persons potentially involved in the ultimate creation of the work and the division of tasks among them. Based on the commonly accepted assumption that a copyrighted work can only be created by a natural person, the paper does not deal with the creativity of artificial intelligence per se (or the lack thereof); instead, it focuses on the distribution of the intellectual property rights potentially belonging to the creators of the artificial intelligence and/or the creators of the content used in the formation of the copyrighted work. Moreover, the technical development and rapid improvement of AI-based programs, which tend to achieve ever greater independence from human beings, raise the question of whether the initial creators of the artificial intelligence can be entitled to intellectual property rights in works created by such AI at all. As the judicial practice of some European courts and legal doctrine tend to incline toward the latter opinion, indicating that works created by AI may not enjoy copyright protection at all, questions of authorship appear to cause great concern among investors in the development of the relevant technology. Although technology companies have further instruments for protecting their investments at their disposal, the risk that the works in question may not be copyrightable, caused by inconsistent case law and a certain research gap, constitutes a highly important issue. To assess the possible interpretations, the author adopted a doctrinal and analytical approach, systematically analyzing European and Czech copyright law and case law in some EU jurisdictions.
This study aims to contribute to greater legal certainty regarding the authorship of AI-created works and to define possible clues for further research.

Keywords: artificial intelligence, copyright, authorship, copyrighted work, intellectual property

Procedia PDF Downloads 116
521 Dual-functional Peptide With Defective Interfering Genes Protecting Mice From Avian and Seasonal Influenza Virus Infection

Authors: Hanjun Zhao

Abstract:

The limited efficacy of current antivirals and antiviral-resistant mutations impair anti-influenza treatment. Here, we evaluated the in vitro and in vivo antiviral effect of three defective interfering genes (DIG-3) of influenza virus. Virus replication was significantly reduced in 293T and A549 cells transfected with DIG-3. Mice transfected with DIG-3 delivered by the jetPEI vector, as prophylaxis and as therapy against A(H7N7) virus, respectively, had significantly better survival rates (80% and 50%) than control mice (0%). We further developed a dual-functional peptide, TAT-P1, which delivers DIG-3 with high transfection efficiency and concomitantly exerts antiviral activity by preventing endosomal acidification. TAT-P1/DIG-3 was more effective than jetPEI/DIG-3 in treating A(H7N7)- or A(H1N1)pdm09-infected mice and showed potent prophylactic protection in A(H7N7)- or A(H1N1)pdm09-infected mice. The addition of the P1 peptide, which prevents endosomal acidification, enhanced the protection of TAT-P1/DIG-3 in A(H1N1)pdm09-infected mice. Dual-functional TAT-P1 with DIG-3 can effectively protect or treat mice infected with avian and seasonal influenza viruses.

Keywords: antiviral peptide, dual-functional peptide, defective interfering genes, influenza virus

Procedia PDF Downloads 116
520 Offline Signature Verification Using Minutiae and Curvature Orientation

Authors: Khaled Nagaty, Heba Nagaty, Gerard McKee

Abstract:

A signature is a behavioral biometric used for authenticating users in most financial and legal transactions, yet signatures can be easily forged by skilled forgers. It is therefore essential to verify whether a signature is genuine or forged. The aim of any signature verification algorithm is to accommodate the differences between signatures of the same person while increasing the ability to discriminate between signatures of different persons. The work presented in this paper proposes an automatic signature verification system that indicates whether a signature is genuine or not. The system comprises four phases: (1) a pre-processing phase, in which image scaling, binarization, image rotation, dilation, thinning, and connecting of ridge breaks are applied; (2) a feature extraction phase, in which global and local features are extracted (the local features are minutiae points, curvature orientation, and curve plateau; the global features are the signature area, signature aspect ratio, and Hu moments); (3) a post-processing phase, in which false minutiae are removed; and (4) a classification phase, in which the features are enhanced before being fed into the classifier; k-nearest neighbors and support vector machines are used. The classifier was trained on a benchmark dataset to compare the performance of the proposed offline signature verification system against the state-of-the-art. The accuracy of the proposed system is 92.3%.
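
The k-nearest-neighbors classification step can be sketched with toy two-dimensional feature vectors. The feature values below are invented for illustration; the actual system uses minutiae-, curvature-, and moment-based features of much higher dimension:

```python
import math
from collections import Counter

def knn_classify(train, query, k=3):
    """Classify query by majority vote among its k nearest training vectors."""
    nearest = sorted(train, key=lambda xy: math.dist(xy[0], query))[:k]
    votes = Counter(label for _, label in nearest)
    return votes.most_common(1)[0][0]

# Toy 2-D feature vectors (e.g., normalized aspect ratio vs. minutiae density)
# labeled genuine / forged; purely illustrative values.
train = [
    ((0.20, 0.80), "genuine"), ((0.25, 0.75), "genuine"), ((0.22, 0.78), "genuine"),
    ((0.70, 0.30), "forged"),  ((0.75, 0.25), "forged"),  ((0.72, 0.28), "forged"),
]
label = knn_classify(train, (0.23, 0.77), k=3)
```

A query near the genuine cluster is labeled genuine; in practice the same interface would take the enhanced feature vector produced by the system's fourth phase.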

Keywords: signature, ridge breaks, minutiae, orientation

Procedia PDF Downloads 141
519 EEG-Based Classification of Psychiatric Disorders: Bipolar Mood Disorder vs. Schizophrenia

Authors: Han-Jeong Hwang, Jae-Hyun Jo, Fatemeh Alimardani

Abstract:

An accurate diagnosis of psychiatric diseases is a challenging issue, particularly when distinct symptoms of different diseases overlap, such as the delusions that appear in both bipolar mood disorder (BMD) and schizophrenia (SCH). In the present study, we propose a useful way to discriminate BMD from SCH using electroencephalography (EEG). A total of thirty BMD and SCH patients (15 vs. 15) took part in our experiment. EEG signals were measured with nineteen electrodes attached to the scalp according to the international 10-20 system while the patients were exposed to a visual stimulus flickering at 16 Hz for 95 s. The flickering visual stimulus induces a characteristic brain signal, known as the steady-state visual evoked potential (SSVEP), whose amplitude differs between patients with BMD and SCH because they process the same visual information in their own unique ways. To classify BMD and SCH patients, a machine learning technique with leave-one-out cross-validation was employed. The SSVEPs induced at the fundamental (16 Hz) and second-harmonic (32 Hz) stimulation frequencies were extracted using the fast Fourier transform (FFT) and used as features. The most discriminative feature was selected using the Fisher score, and a support vector machine (SVM) was used as the classifier. From the analysis, we obtained a classification accuracy of 83.33%, showing the feasibility of discriminating patients with BMD from those with SCH using EEG. We expect that our approach can help psychiatrists diagnose these psychiatric disorders more accurately.
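
Fisher-score feature selection ranks each feature by the ratio of between-class mean separation to within-class variance. A minimal sketch with invented SSVEP-amplitude values (not the study's data):

```python
def fisher_score(class_a, class_b):
    """Fisher score of one feature across two classes:
    (difference of class means)^2 / (sum of class variances)."""
    def stats(xs):
        m = sum(xs) / len(xs)
        v = sum((x - m) ** 2 for x in xs) / len(xs)
        return m, v
    ma, va = stats(class_a)
    mb, vb = stats(class_b)
    return (ma - mb) ** 2 / (va + vb)

# Illustrative SSVEP amplitudes: the 16 Hz feature separates the two
# groups well, while the 32 Hz feature barely does.
f16_bmd, f16_sch = [2.0, 2.2, 1.9, 2.1], [1.0, 1.1, 0.9, 1.2]
f32_bmd, f32_sch = [0.8, 1.1, 0.9, 1.0], [0.9, 1.0, 1.1, 0.8]
scores = {
    "16 Hz": fisher_score(f16_bmd, f16_sch),
    "32 Hz": fisher_score(f32_bmd, f32_sch),
}
best_feature = max(scores, key=scores.get)
```

The highest-scoring feature would then be the one passed to the SVM classifier.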

Keywords: bipolar mood disorder, electroencephalography, schizophrenia, machine learning

Procedia PDF Downloads 410
518 Performance of On-site Earthquake Early Warning Systems for Different Sensor Locations

Authors: Ting-Yu Hsu, Shyu-Yu Wu, Shieh-Kung Huang, Hung-Wei Chiang, Kung-Chun Lu, Pei-Yang Lin, Kuo-Liang Wen

Abstract:

Regional earthquake early warning (EEW) systems are not suitable for Taiwan, as most destructive seismic hazards arise from inland earthquakes. These leave regional EEW systems with practically no lead-time before a destructive earthquake wave arrives. An on-site EEW system, on the other hand, can provide more lead-time in regions close to an epicenter, since only seismic information from the target site is required. Instead of leveraging information from several stations, an on-site system extracts P-wave features from the first few seconds of vertical ground acceleration at a single station and predicts the oncoming earthquake intensity at the same station from these features. Since seismometers can be triggered by non-earthquake events such as a passing truck or other human activities, a seismometer was installed at three different locations on the same site to reduce the likelihood of false alarms, and the performance of the EEW system for these three sensor locations was discussed. The results show that the location on the ground of the first floor of a school building may be a good choice, since false alarms could be reduced and the cost of installation and maintenance is the lowest.
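A minimal sketch of the on-site idea, assuming (hypothetically) two simple P-wave features and a threshold rule standing in for the trained classifier — the feature choice, thresholds, and sample data are all illustrative assumptions:

```python
def pwave_features(accel, fs, window_s=3.0):
    """Extract simple features from the first seconds of vertical acceleration:
    peak amplitude and a cumulative squared-amplitude energy proxy."""
    n = int(window_s * fs)
    window = accel[:n]
    peak = max(abs(x) for x in window)
    energy = sum(x * x for x in window) / fs
    return peak, energy

def alarm(peak, energy, peak_thresh=1.2, energy_thresh=0.8):
    # Hypothetical decision rule in place of the trained support vector machine;
    # requiring both features to exceed thresholds suppresses short transients
    # such as a passing truck
    return peak > peak_thresh and energy > energy_thresh

fs = 100  # hypothetical sampling rate (Hz)
quiet = [0.1] * 600   # low-level ambient vibration
strong = [2.0] * 600  # sustained strong shaking
print(alarm(*pwave_features(quiet, fs)), alarm(*pwave_features(strong, fs)))
```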

Keywords: earthquake early warning, on-site, seismometer location, support vector machine

Procedia PDF Downloads 239
517 Behind Egypt’s Financial Crisis: Dollarization

Authors: Layal Mansour

Abstract:

This paper breaks down Egypt’s financial crisis by constructing a customized financial stress index that includes the economic indicator “dollarization” as a vulnerability indicator in the credit and exchange sector. The Financial Stress Index for Egypt (FSIE) includes informative vulnerability indicators for the main financial sectors: the banking sector, the equities market, and the foreign exchange market. It is calculated on a monthly basis from 2010 to December 2022, so as to cover the two recent most devastating world financial crises — the COVID-19 crisis and the Russia-Ukraine war — in addition to the local 2016 and 2022 financial crises. We proceed first with a graphical analysis and then with an empirical analysis, running dynamic causality tests between foreign reserves, the dollarization rate, and the FSIE under a vector autoregression (VAR) model. The graphical analysis shows that, unexpectedly, Egypt’s economy seems to be immune to internal economic and political instabilities; however, it is highly exposed to the foreign exchange market. The empirical analysis confirms the graphical observations and shows that dollarization, or more precisely debt in foreign currency, seems to be the main trigger of Egypt’s current financial crisis.
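The construction of a stress index from sector sub-indices can be sketched as a weighted sum of standardized series; the sub-index values and weights below are illustrative assumptions, not the FSIE's actual components or weights.

```python
import statistics

def zscores(series):
    """Standardize a series to zero mean and unit (population) variance."""
    m, s = statistics.mean(series), statistics.pstdev(series)
    return [(x - m) / s for x in series]

def stress_index(sectors, weights):
    """Aggregate standardized sector sub-indices into one monthly stress index."""
    z = {name: zscores(series) for name, series in sectors.items()}
    n_months = len(next(iter(sectors.values())))
    return [sum(weights[name] * z[name][t] for name in sectors)
            for t in range(n_months)]

# Hypothetical monthly sub-indices: banking-sector stress, equity-market
# volatility, and exchange-market pressure (where dollarization would enter)
sectors = {
    "banking":  [1.0, 1.1, 1.3, 2.0, 2.4],
    "equities": [0.5, 0.6, 0.9, 1.5, 1.8],
    "exchange": [0.2, 0.3, 0.8, 1.9, 2.5],
}
weights = {"banking": 0.4, "equities": 0.3, "exchange": 0.3}
fsi = stress_index(sectors, weights)  # rises as all three sectors tighten
```

Causality between such an index, reserves, and the dollarization rate is then tested within a VAR model fitted to the resulting monthly series.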

Keywords: Egypt, financial crisis, financial stress index, dollarization, VAR model, causality tests

Procedia PDF Downloads 81
516 A Graph-Based Retrieval Model for Passage Search

Authors: Junjie Zhong, Kai Hong, Lei Wang

Abstract:

Passage retrieval (PR) plays an important role in many natural language processing (NLP) tasks. Traditional, efficient retrieval models relying on exact term matching, such as TF-IDF or BM25, have nowadays been surpassed by pre-trained language models that match by semantics. Though they gain effectiveness, deep language models often require large amounts of memory and computation time. To tackle this trade-off between efficiency and effectiveness in PR, this paper proposes the Graph Passage Retriever (GraphPR), a graph-based model inspired by the development of graph learning techniques. Unlike existing works, GraphPR is end-to-end and integrates both term-matching information and semantics. GraphPR constructs a passage-level graph from BM25 retrieval results and trains a GCN-like model on the graph with graph-based objectives. Passages are treated as nodes in the constructed graph and embedded as dense vectors, so PR can then be implemented with a fast vector-similarity search over these embeddings. Experiments on a variety of real-world retrieval datasets show that the proposed model outperforms related models on several evaluation metrics (e.g., mean reciprocal rank, accuracy, F1-score) while maintaining relatively low query latency and memory usage.
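The graph-construction step can be sketched as follows: score passages with standard BM25 and connect the top-k results pairwise. The toy corpus and the `passage_graph` helper are illustrative assumptions; GraphPR's actual graph objectives and encoder are not reproduced here.

```python
import math
from collections import Counter

def bm25_scores(query, docs, k1=1.5, b=0.75):
    """Score tokenized passages against a tokenized query with BM25
    (exact term matching)."""
    n_docs = len(docs)
    avgdl = sum(len(d) for d in docs) / n_docs
    df = Counter(t for d in docs for t in set(d))  # document frequency per term
    scores = []
    for d in docs:
        tf = Counter(d)
        s = 0.0
        for t in query:
            if t not in tf:
                continue
            idf = math.log(1 + (n_docs - df[t] + 0.5) / (df[t] + 0.5))
            s += idf * tf[t] * (k1 + 1) / (tf[t] + k1 * (1 - b + b * len(d) / avgdl))
        scores.append(s)
    return scores

def passage_graph(query, docs, top_k=2):
    """Connect the top-k BM25 passages pairwise into an undirected edge list,
    i.e. the passage-level graph a GCN-like model would then be trained on."""
    scores = bm25_scores(query, docs)
    top = sorted(range(len(docs)), key=lambda i: scores[i], reverse=True)[:top_k]
    return [(a, b) for i, a in enumerate(top) for b in top[i + 1:]]

docs = [["graph", "retrieval", "model"],
        ["deep", "language", "model"],
        ["graph", "learning", "passage", "retrieval"],
        ["cooking", "recipes"]]
edges = passage_graph(["graph", "retrieval"], docs)  # → [(0, 2)]
```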

Keywords: efficiency, effectiveness, graph learning, language model, passage retrieval, term-matching model

Procedia PDF Downloads 130
515 Analysis of Translational Ship Oscillations in a Realistic Environment

Authors: Chen Zhang, Bernhard Schwarz-Röhr, Alexander Härting

Abstract:

To acquire accurate ship motions at the center of gravity, a single low-cost inertial sensor is installed on board to measure the ship's oscillating motions. The three-axis accelerations and three-axis rotational rates provided by the sensor are used as observations. The mathematical model for processing the observation data includes determining the distance vector between the sensor and the center of gravity in the x, y, and z directions. After setting up the transfer matrix from the sensor's own coordinate system to the ship's body frame, an extended Kalman filter is applied to deal with the nonlinearities between the ship motion in the body frame and the observation information in the sensor frame. As a side effect, the method eliminates sensor noise and other unwanted errors. The results include not only roll and pitch but also linear motions, in particular heave and surge at the center of gravity. For testing, we resort to measurements recorded on a small vessel in a well-defined sea state. With response amplitude operators computed numerically by a commercial software package (Seaway), motion characteristics are estimated. These agree well with the measurements after processing with the suggested method.
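The role of the distance vector can be sketched with the rigid-body lever-arm relation a_cg = a_sensor − α × r − ω × (ω × r), where r points from the center of gravity to the sensor in the body frame; the numerical values below are hypothetical, and the full method embeds this relation inside the extended Kalman filter.

```python
def cross(a, b):
    """Cross product of two 3-vectors."""
    return [a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0]]

def accel_at_cg(a_sensor, omega, alpha, r):
    """Translate acceleration measured at the sensor to the center of gravity:
    a_cg = a_sensor - alpha x r - omega x (omega x r)."""
    centripetal = cross(omega, cross(omega, r))  # omega x (omega x r)
    euler = cross(alpha, r)                      # alpha x r
    return [a - e - c for a, e, c in zip(a_sensor, euler, centripetal)]

# Hypothetical values: sensor 2 m forward of and 1 m above the CG,
# rolling at 0.1 rad/s with 0.05 rad/s^2 angular acceleration about x
a_cg = accel_at_cg([0.0, 0.2, -9.8],      # measured acceleration (m/s^2)
                   [0.1, 0.0, 0.0],       # angular rate (rad/s)
                   [0.05, 0.0, 0.0],      # angular acceleration (rad/s^2)
                   [2.0, 0.0, 1.0])       # lever arm r (m)
```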

Keywords: extended Kalman filter, nonlinear estimation, sea trial, ship motion estimation

Procedia PDF Downloads 519
514 Displaying of GnRH Peptides on Bacteriophage T7 and Its Immunogenicity in Mice Model

Authors: Hai Xu, Yiwei Wang, Xi Bao, Bihua Deng, Pengcheng Li, Yu Lu

Abstract:

The T7 phage can serve as an effective vector for peptide expression and hapten presentation. A T7-3GnRH recombinant phage was constructed by inserting three copies of the gonadotrophin releasing hormone (GnRH) gene into the multiple cloning site of the T7 Select 415-1b phage genome. The positive T7-3GnRH phage was selected using polymerase chain reaction amplification, and the p10B-3GnRH fusion protein was verified by SDS-PAGE and Western blotting. A T7-3GnRH vaccine was prepared, and mice were immunized with 10¹⁰ pfu in 0.2 ml per dose. Blood samples were collected at weekly intervals, and anti-GnRH antibody and testosterone concentrations were measured by ELISA and radioimmunoassay, respectively. The results show that T7-3GnRH phage particles confer high immunogenicity on the GnRH-derived epitope. Moreover, the T7-3GnRH vaccine induced a higher level of anti-GnRH antibody than ImproVac®. However, the testosterone concentrations in both immunized groups were at a similar level, and testis development was significantly inhibited compared to controls. These findings demonstrate that the anti-GnRH antibody neutralizes endogenous GnRH to downregulate the testosterone level and limit testis development, highlighting the potential value of T7-3GnRH in immunocastration vaccine research.

Keywords: gonadotrophin releasing hormone (GnRH), immunocastration, T7 phage, phage vaccine

Procedia PDF Downloads 276
513 Extension and Closure of a Field for Engineering Purpose

Authors: Shouji Yujiro, Memei Dukovic, Mist Yakubu

Abstract:

Fields are important objects of study in algebra, since they provide a useful generalization of many number systems, such as the rational numbers, the real numbers, and the complex numbers. In particular, the usual rules of associativity, commutativity, and distributivity hold. Fields also appear in many other areas of mathematics. When abstract algebra was first being developed, the definition of a field usually did not include commutativity of multiplication, and what we today call a field would have been called either a commutative field or a rational domain. In contemporary usage, a field is always commutative. A structure that satisfies all the properties of a field except possibly commutativity is today called a division ring, a division algebra, or sometimes a skew field; the term non-commutative field is also still widely used. In French, fields are called corps (literally, body), generally regardless of their commutativity; when necessary, a commutative field is called a corps commutatif and a skew field a corps gauche. The German word for body is Körper, and this word is used to denote fields; hence the use of the blackboard bold 𝕂 to denote a field. The concept of a field was first (implicitly) used to prove that there is no general formula expressing, in terms of radicals, the roots of a polynomial with rational coefficients of degree 5 or higher. An extension of a field k is a field K containing k as a subfield. One distinguishes between extensions having various qualities. For example, an extension K of a field k is called algebraic if every element of K is a root of some polynomial with coefficients in k; otherwise, the extension is called transcendental. The aim of Galois theory is the study of algebraic extensions of a field. Given a field k, various kinds of closures of k may be introduced — for example, the algebraic closure, the separable closure, and the cyclic closure. The idea is always the same: if P is a property of fields, then a P-closure of k is a field K containing k, having property P, and minimal in the sense that no proper subfield of K that contains k has property P. For example, if we take P(K) to be the property 'every non-constant polynomial f in K[t] has a root in K', then a P-closure of k is just an algebraic closure of k. In general, if P-closures exist for some property P and field k, they are all isomorphic; however, there is in general no preferred isomorphism between two closures.
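A standard concrete instance of the algebraic/transcendental distinction is the smallest field containing both the rationals and √2:

```latex
% Q(sqrt(2)) is an algebraic extension of Q of degree 2
\mathbb{Q}(\sqrt{2}) \;=\; \{\, a + b\sqrt{2} \;:\; a, b \in \mathbb{Q} \,\},
\qquad
[\mathbb{Q}(\sqrt{2}) : \mathbb{Q}] \;=\; 2 .
```

Every element $a + b\sqrt{2}$ is a root of $x^2 - 2ax + (a^2 - 2b^2) \in \mathbb{Q}[x]$, so the extension is algebraic; by contrast, $\mathbb{Q}(\pi)$ is a transcendental extension, since $\pi$ satisfies no polynomial with rational coefficients. And since $x^2 - 2$ has no root in $\mathbb{Q}$, the field $\mathbb{Q}$ is not algebraically closed; its algebraic closure inside $\mathbb{C}$ is the field of all algebraic numbers.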

Keywords: field theory, mechanic maths, supertech, rolltech

Procedia PDF Downloads 366
512 Dwindling the Stability of DNA Sequence by Base Substitution at Intersection of COMT and MIR4761 Gene

Authors: Srishty Gulati, Anju Singh, Shrikant Kukreti

Abstract:

The manifestation of structural polymorphism in DNA depends on the sequence and the surrounding environment. Ample folded DNA structures have been found in the cellular system, among which DNA hairpins are very common and indispensable due to their roles in replication initiation sites, recombination, transcription regulation, and protein recognition. We illustrate this approach in our study, where two base substitutions and a change in temperature destabilize the DNA structure and shift the equilibrium between two structures of a sequence present at the overlapping region of the human COMT and MIR4761 genes. The COMT and MIR4761 genes encode the catechol-O-methyltransferase (COMT) enzyme and microRNAs (miRNAs), respectively. Environmental changes and errors during cell division lead to genetic abnormalities. The COMT gene, involved in dopamine regulation, is implicated in neurological diseases such as Parkinson's disease, schizophrenia, and velocardiofacial syndrome. A 19-mer deoxyoligonucleotide sequence 5'-AGGACAAGGTGTGCATGCC-3' (COMT19) is located at exon 4 on chromosome 22, band q11.2, at the intersection of the COMT and MIR4761 genes. Bioinformatics studies suggest that this sequence is conserved in humans and a few other organisms and is involved in the recognition of transcription factors in the vicinity of the 3'-end. Non-denaturing gel electrophoresis and CD spectroscopy of COMT sequences indicate the formation of hairpin-type DNA structures. Temperature-dependent CD studies revealed an unusual shift in the slipped-DNA/hairpin-DNA equilibrium with the change in temperature. UV thermal melting experiments further suggest that the two base substitutions on the complementary strand of COMT19 do not affect the structure but reduce the stability of the duplex. This study gives insight into the possibility of structurally polymorphic transient states existing within DNA segments present at the intersection of the COMT and MIR4761 genes.

Keywords: base-substitution, catechol-o-methyltransferase (COMT), hairpin-DNA, structural polymorphism

Procedia PDF Downloads 117
511 Development of a Sustainable Municipal Solid Waste Management for an Urban Area: Case Study from a Developing Country

Authors: Anil Kumar Gupta, Dronadula Venkata Sai Praneeth, Brajesh Dubey, Arundhuti Devi, Suravi Kalita, Khanindra Sharma

Abstract:

Increases in urbanization and industrialization have improved the standard of living. At the same time, however, the challenges posed by improper solid waste management are also increasing. Municipal solid waste management is considered a vital step in the development of urban infrastructure. The present study focuses on developing a solid waste management plan for an urban area in a developing country. The current state of solid waste management practices in various urban bodies in India is summarized. Guwahati, a city in the northeastern part of the country and one of the targeted smart cities under the government's Smart Cities program, was chosen as the case study for developing and implementing the solid waste management plan. The city was divided into divisions, waste samples were collected for each division in accordance with ASTM D5231-92(2016), and a composite sample was prepared to represent the waste of the entire city. The solid waste was characterized physically and chemically, including proximate and ultimate analyses. Existing primary and secondary collection systems were studied, and possibilities for enhancing them were discussed. The composition of solid waste for the overall city was found to be: organic matter 38%, plastic 27%, paper and cardboard 15%, textile 9%, inert material 7%, and others 4%. During the conference presentation, further characterization results in terms of thermogravimetric analysis (TGA), pH, and water-holding capacity will be discussed. Waste management options optimizing activities such as recycling, recovery, reuse, and reduction will be presented and discussed.

Keywords: proximate, recycling, thermal gravimetric analysis (TGA), solid waste management

Procedia PDF Downloads 182
510 Predicting the Diagnosis of Alzheimer’s Disease: Development and Validation of Machine Learning Models

Authors: Jay L. Fu

Abstract:

Patients with Alzheimer's disease progressively lose their memory and thinking skills and, eventually, the ability to carry out simple daily tasks. The disease is irreversible, but early detection and treatment can slow its progression. In this research, publicly available MRI data and demographic data from 373 MRI imaging sessions were utilized to build models to predict dementia. Several machine learning models, including logistic regression, k-nearest neighbors, support vector machine, random forest, and a neural network, were developed. The data were divided into training and testing sets, where the training sets were used to build the predictive models and the testing sets were used to assess the accuracy of prediction. Key risk factors were identified, and the models were compared to identify the best prediction model. Among these models, the random forest performed best, with an accuracy of 90.34%. MMSE, nWBV, and gender were the three most important factors contributing to the detection of Alzheimer's. Across all the models used, the percentage of testing inputs for which at least 4 of the 5 models agreed on the same diagnosis was 90.42%. These machine learning models allow early detection of Alzheimer's with good accuracy, which ultimately leads to earlier treatment of these patients.
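The 4-of-5 agreement figure can be computed as a simple consensus rate over the models' test-set outputs; the prediction matrix below is a made-up placeholder, not the paper's results.

```python
from collections import Counter

def consensus_rate(predictions, min_agree=4):
    """Fraction of test inputs on which at least `min_agree` of the models
    produce the same diagnosis. `predictions` holds one list of labels
    per model, all over the same test inputs."""
    agreed = 0
    for case in zip(*predictions):  # one column = all model outputs for one input
        if Counter(case).most_common(1)[0][1] >= min_agree:
            agreed += 1
    return agreed / len(predictions[0])

# Hypothetical outputs of five models on four test inputs
# (1 = dementia, 0 = healthy); inputs 0 and 1 reach 4-of-5 consensus
preds = [
    [1, 0, 1, 0],
    [1, 0, 1, 1],
    [1, 0, 0, 1],
    [1, 0, 1, 0],
    [1, 1, 0, 1],
]
print(consensus_rate(preds))  # → 0.5
```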

Keywords: Alzheimer's disease, clinical diagnosis, magnetic resonance imaging, machine learning prediction

Procedia PDF Downloads 138
509 Modeling Pan Evaporation Using Intelligent Methods of ANN, LSSVM and Tree Model M5 (Case Study: Shahroud and Mayamey Stations)

Authors: Hamidreza Ghazvinian, Khosro Ghazvinian, Touba Khodaiean

Abstract:

The importance of evaporation estimation in water resources and agricultural studies is undeniable. Pan evaporation is used worldwide as an indicator of the evaporation of lakes and reservoirs because its data are easy to interpret. In this research, intelligent models were investigated for estimating daily pan evaporation. Shahroud and Mayamey, two cities located in Semnan province, Iran, were considered as the study areas; both have dry climates with high evaporation potential. Eleven years of meteorological data from the synoptic stations of Shahroud and Mayamey were used. The intelligent models applied in this study are the artificial neural network (ANN), least squares support vector machine (LSSVM), and M5 tree models. The meteorological parameters minimum and maximum air temperature (Tmin, Tmax), wind speed (WS), sunshine hours (SH), air pressure (PA), and relative humidity (RH) were selected as input data, and pan evaporation (EP) was the output. 70% of the data were used for training and 30% for testing. The models were evaluated using the coefficient of determination (R²), root mean square error (RMSE), and mean absolute error (MAE). The results for the Shahroud and Mayamey stations showed that all three models perform reasonably well.
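The three evaluation measures can be computed directly from observed and predicted series; the evaporation values below are illustrative numbers, not station data.

```python
import math

def metrics(obs, pred):
    """Coefficient of determination (R^2), root mean square error (RMSE),
    and mean absolute error (MAE) of a prediction against observations."""
    n = len(obs)
    mean_obs = sum(obs) / n
    ss_res = sum((o - p) ** 2 for o, p in zip(obs, pred))  # residual sum of squares
    ss_tot = sum((o - mean_obs) ** 2 for o in obs)         # total sum of squares
    r2 = 1 - ss_res / ss_tot
    rmse = math.sqrt(ss_res / n)
    mae = sum(abs(o - p) for o, p in zip(obs, pred)) / n
    return r2, rmse, mae

# Hypothetical observed vs. predicted daily pan evaporation (mm/day)
obs = [4.0, 6.0, 8.0, 10.0]
pred = [4.5, 5.5, 8.5, 9.5]
r2, rmse, mae = metrics(obs, pred)  # → 0.95, 0.5, 0.5
```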

Keywords: pan evaporation, intelligent methods, Shahroud, Mayamey

Procedia PDF Downloads 69
508 Development of Prediction Models of Day-Ahead Hourly Building Electricity Consumption and Peak Power Demand Using the Machine Learning Method

Authors: Dalin Si, Azizan Aziz, Bertrand Lasternas

Abstract:

To encourage building owners to purchase electricity on the wholesale market and to reduce building peak demand, this study aims to develop models that predict day-ahead hourly electricity consumption and demand using an artificial neural network (ANN) and a support vector machine (SVM). All prediction models are built in Python with the scikit-learn and PyBrain libraries. The input data for both consumption and demand prediction are the time stamp, outdoor dry-bulb temperature, relative humidity, air handling unit (AHU) supply air temperature, and solar radiation. Solar radiation, which is unavailable a day ahead, is predicted first, and this estimate is then used as an input to predict consumption and demand. Separate SVM and ANN models are trained for consumption and demand, depending on cooling or heating operation and on weekdays or weekends. The results show that the ANN is the better option for both consumption and demand prediction: it achieves a coefficient of variation of the root mean square error (CVRMSE) of 15.50% to 20.03% for consumption prediction and 22.89% to 32.42% for demand prediction, respectively. To conclude, the presented models have the potential to help building owners purchase electricity on the wholesale market, but they are not robust enough for use in demand response control.
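The CVRMSE figures quoted above are the RMSE normalized by the mean of the observed series and expressed in percent; the hourly values below are invented for illustration.

```python
import math

def cvrmse(observed, predicted):
    """Coefficient of variation of the RMSE, in percent:
    100 * RMSE / mean(observed)."""
    n = len(observed)
    rmse = math.sqrt(sum((o - p) ** 2 for o, p in zip(observed, predicted)) / n)
    return 100.0 * rmse / (sum(observed) / n)

# Hypothetical hourly consumption (kWh): actual vs. day-ahead prediction
actual = [100.0, 120.0, 140.0, 120.0]
day_ahead = [110.0, 115.0, 130.0, 125.0]
print(round(cvrmse(actual, day_ahead), 2))  # → 6.59
```

A lower CVRMSE means the hourly prediction tracks the actual profile more closely relative to its average level.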

Keywords: building energy prediction, data mining, demand response, electricity market

Procedia PDF Downloads 312
507 Network Word Discovery Framework Based on Sentence Semantic Vector Similarity

Authors: Ganfeng Yu, Yuefeng Ma, Shanliang Yang

Abstract:

Word discovery is a key problem in text information retrieval. Most new-word discovery methods operate at the word level, since they generally obtain new-word results by analyzing words. With the popularity of social networks, individual netizens and online self-media have generated a great variety of network texts, including network words that are far from standard Chinese expression. How to detect network words is therefore one of the important goals in the field of text information retrieval today. In this paper, we integrate a word embedding model with clustering methods and propose a network word discovery framework based on sentence semantic similarity (S³-NWD) to detect network words effectively in a corpus. The framework constructs sentence semantic vectors through a distributed representation model, uses the similarity of these vectors to determine the semantic relationship between sentences, and finally realizes network word discovery by means of semantic replacement between sentences. Experiments verify that the framework not only discovers network words quickly but also recovers the standard-word meaning of each discovered network word, demonstrating the effectiveness of our work.
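The sentence-matching step can be sketched as thresholded cosine similarity between sentence vectors; the embeddings and the threshold below are assumptions for illustration, whereas the real framework obtains its vectors from a trained distributed representation model.

```python
import math

def cosine(u, v):
    """Cosine similarity of two equal-length vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

def related_pairs(sent_vecs, threshold=0.9):
    """Pair up sentences whose semantic vectors are nearly parallel;
    such pairs are candidates for word-for-word semantic replacement,
    e.g. a network word in one sentence and its standard equivalent
    in the other."""
    ids = sorted(sent_vecs)
    return [(a, b) for i, a in enumerate(ids) for b in ids[i + 1:]
            if cosine(sent_vecs[a], sent_vecs[b]) >= threshold]

# Hypothetical sentence embeddings: s1 and s2 paraphrase each other, s3 does not
vecs = {"s1": [0.8, 0.6, 0.0],
        "s2": [0.75, 0.65, 0.05],
        "s3": [0.0, 0.1, 0.9]}
print(related_pairs(vecs))  # → [('s1', 's2')]
```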

Keywords: text information retrieval, natural language processing, new word discovery, information extraction

Procedia PDF Downloads 88