Search results for: model estimation
13779 Weighted Rank Regression with Adaptive Penalty Function
Authors: Kang-Mo Jung
Abstract:
The use of regularization in statistical methods has become popular. The least absolute shrinkage and selection operator (LASSO) framework has become the standard tool for sparse regression. However, it is well known that the LASSO is sensitive to outliers and leverage points. We propose a new robust estimator composed of a weighted loss function on the pairwise differences of residuals and an adaptive penalty function that regulates the tuning parameter for each variable. Rank regression is resistant to regression outliers but not to leverage points. By adopting a weighted loss function, the proposed method is also robust to leverage points in the predictor variables. Furthermore, the adaptive penalty function gives good statistical properties in variable selection, such as the oracle property and consistency. We develop an efficient algorithm to compute the proposed estimator using basic functions in the R language, with the optimal tuning parameter selected by the Bayesian information criterion (BIC). Numerical simulation shows that the proposed estimator is effective for analyzing real and contaminated data sets.
Keywords: adaptive penalty function, robust penalized regression, variable selection, weighted rank regression
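A minimal sketch of the kind of objective described, assuming a leverage-based weighting scheme and a derivative-free solver; the authors' actual R implementation and their weight choice are not reproduced here.

```python
# Minimal sketch (not the authors' R code): a weighted rank-regression objective
# with an adaptive L1 penalty, minimized with a derivative-free solver.
# The weight scheme, penalty form, and solver are illustrative assumptions.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
n, p = 60, 4
X = rng.normal(size=(n, p))
beta_true = np.array([2.0, 0.0, -1.5, 0.0])
y = X @ beta_true + rng.standard_t(df=3, size=n)        # heavy-tailed errors

# Downweight high-leverage points (assumed robust weighting scheme)
h = np.diag(X @ np.linalg.pinv(X.T @ X) @ X.T)           # hat values
w = np.minimum(1.0, (2 * p / n) / h)                     # simple leverage-based weights

beta_init = np.linalg.lstsq(X, y, rcond=None)[0]         # pilot estimate
lam = 0.5
lam_adaptive = lam / (np.abs(beta_init) + 1e-6)          # adaptive per-coefficient tuning

def objective(beta):
    e = y - X @ beta
    i, j = np.triu_indices(n, k=1)
    rank_loss = np.sum(w[i] * w[j] * np.abs(e[i] - e[j]))   # pairwise-difference loss
    penalty = np.sum(lam_adaptive * np.abs(beta))
    return rank_loss / n + penalty

fit = minimize(objective, beta_init, method="Nelder-Mead",
               options={"maxiter": 20000, "xatol": 1e-6, "fatol": 1e-6})
print("estimated coefficients:", np.round(fit.x, 3))
```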
Procedia PDF Downloads 476
13778 Assessing the Vulnerability Level in Coastal Communities in the Caribbean: A Case Study of San Pedro, Belize
Authors: Sherry Ann Ganase, Sandra Sookram
Abstract:
In this paper, the vulnerability level to climate change is analysed using a comprehensive index consisting of five pillars: human, social, natural, physical, and financial. A structural equation model is also applied to determine the indicators and relationships that exist between the observed environmental changes and the quality of life. Using survey data to model the results, a value of 0.382 is derived as the vulnerability level for San Pedro, where values closer to zero indicate lower vulnerability and values closer to one indicate higher vulnerability. The results show the social pillar to be the most vulnerable, with the indicator ‘participation’ ranked highest in its cohort. Although the environmental pillar is ranked least vulnerable, the indicators ‘hazard’ and ‘biodiversity’ obtained scores closer to 0.4, suggesting that changes in the environment are occurring from natural and anthropogenic activities. These changes can negatively influence the quality of life, as illustrated by the structural equation model. The study concludes by reporting on the need for collective action and participation by households in lowering vulnerability to ensure sustainable development and livelihoods.
Keywords: climate change, participation, San Pedro, structural equation model, vulnerability index
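As a rough illustration of the index arithmetic only, the sketch below averages indicator scores already scaled to the 0-1 range within each pillar and then across pillars; the pillar names follow the abstract, but the indicator values and the equal weighting are assumptions, and the structural equation model is not reproduced.

```python
# Illustrative sketch: a composite vulnerability index as the mean of indicator
# scores (0 = low, 1 = high) grouped into pillars. Values and equal weights are
# assumptions, not the San Pedro survey data.
import numpy as np

pillars = {
    "human":     [0.42, 0.35],
    "social":    [0.61, 0.55, 0.48],   # e.g. includes a 'participation' indicator
    "natural":   [0.38, 0.41],
    "physical":  [0.30, 0.25],
    "financial": [0.33],
}

pillar_scores = {name: float(np.mean(vals)) for name, vals in pillars.items()}
vulnerability_index = float(np.mean(list(pillar_scores.values())))

print(pillar_scores)
print(f"composite vulnerability index: {vulnerability_index:.3f}")
```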
Procedia PDF Downloads 631
13777 A Particle Filter-Based Data Assimilation Method for Discrete Event Simulation
Authors: Zhi Zhu, Boquan Zhang, Tian Jing, Jingjing Li, Tao Wang
Abstract:
Data assimilation is a hybrid model- and data-driven method that dynamically fuses new observation data with a numerical model to iteratively approach the real system state. It is widely used for state prediction and parameter inference in continuous systems. Because of a discrete event system’s non-linearity and non-Gaussianity, the traditional Kalman filter, which is based on linear and Gaussian assumptions, cannot perform data assimilation for such systems, so the particle filter has gradually become the technical approach for discrete event simulation data assimilation. Hence, we propose a particle filter-based discrete event simulation data assimilation method and take an unmanned aerial vehicle (UAV) maintenance service system as a proof of concept for simulation experiments. The experimental results show that the filtered state data are closer to the real state of the system, which verifies the effectiveness of the proposed method. This research can provide a reference framework for the data assimilation process of other complex nonlinear systems, such as discrete-time and agent-based simulations.
Keywords: discrete event simulation, data assimilation, particle filter, model and data-driven
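A minimal bootstrap particle filter sketch illustrates the predict-weight-resample cycle the method relies on; the transition and observation models below are stand-ins, not the UAV maintenance service system from the paper.

```python
# A minimal bootstrap particle filter for a nonlinear, non-Gaussian state-space
# model. The transition and observation models are illustrative stand-ins.
import numpy as np

rng = np.random.default_rng(1)
N = 1000                     # number of particles
T = 50                       # time steps

def transition(x):           # assumed nonlinear state transition
    return 0.9 * x + 2.0 * np.sin(x) + rng.normal(0, 0.5, size=x.shape)

def likelihood(y, x):        # observation likelihood p(y | x), assumed Gaussian
    return np.exp(-0.5 * ((y - x) / 0.8) ** 2)

# Simulate a "true" system and noisy observations for demonstration
x_true, obs = 0.0, []
for _ in range(T):
    x_true = 0.9 * x_true + 2.0 * np.sin(x_true) + rng.normal(0, 0.5)
    obs.append(x_true + rng.normal(0, 0.8))

particles = rng.normal(0, 1, N)
estimates = []
for y in obs:
    particles = transition(particles)            # predict
    weights = likelihood(y, particles)           # update with the new observation
    weights /= weights.sum()
    estimates.append(np.sum(weights * particles))
    idx = rng.choice(N, size=N, p=weights)       # resample (bootstrap filter)
    particles = particles[idx]

print("last filtered estimate:", round(estimates[-1], 3), "true:", round(x_true, 3))
```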
Procedia PDF Downloads 14
13776 Logistic Model Tree and Expectation-Maximization for Pollen Recognition and Grouping
Authors: Endrick Barnacin, Jean-Luc Henry, Jack Molinié, Jimmy Nagau, Hélène Delatte, Gérard Lebreton
Abstract:
Palynology is a field of interest for many disciplines. It has multiple applications such as chronological dating, climatology, allergy treatment, and even honey characterization. Unfortunately, the analysis of a pollen slide is a complicated and time-consuming task that requires the intervention of experts in the field, who are becoming increasingly rare due to economic and social conditions. Automating this task is therefore a necessity. Pollen slide analysis is mainly a visual process, as it is carried out with the naked eye. That is why a primary way to automate palynology is digital image processing, which has the lowest cost and relatively good accuracy in pollen retrieval. In this work, we propose a system combining pollen recognition and grouping. It uses a Logistic Model Tree to classify pollen species already known to the system while detecting any unknown species; the unknown pollen species are then grouped using a cluster-based approach. Good recognition rates for known species were achieved, and automated clustering appears to be a promising approach.
Keywords: pollen recognition, logistic model tree, expectation-maximization, local binary pattern
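The recognise-then-group idea can be sketched as follows; scikit-learn has no Logistic Model Tree, so logistic regression stands in for the classifier and a Gaussian mixture fitted by EM stands in for the grouping step, with the features, rejection threshold, and cluster count all assumed.

```python
# Sketch with stand-in components: LogisticRegression replaces the Logistic
# Model Tree, GaussianMixture (EM) groups the rejected grains. Data are synthetic.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(2)
# Three known species with distinct (synthetic) feature distributions
X_known = np.vstack([rng.normal(loc=c, scale=1.0, size=(60, 16)) for c in (-3.0, 0.0, 3.0)])
y_known = np.repeat([0, 1, 2], 60)

clf = LogisticRegression(max_iter=1000).fit(X_known, y_known)

# Incoming slide: a mix of known grains and grains from an unseen species
X_new = np.vstack([rng.normal(loc=0.0, scale=1.0, size=(20, 16)),
                   rng.normal(loc=1.5, scale=1.0, size=(20, 16))])

proba = clf.predict_proba(X_new)
confident = proba.max(axis=1) >= 0.8           # rejection threshold (assumed)
recognised = clf.predict(X_new[confident]) if confident.any() else np.array([], dtype=int)
unknown = X_new[~confident]

# Group the rejected grains with an EM-fitted Gaussian mixture
if len(unknown) >= 2:
    groups = GaussianMixture(n_components=2, random_state=0).fit_predict(unknown)
    print("recognised:", len(recognised), "rejected and grouped:", np.bincount(groups))
else:
    print("recognised:", len(recognised), "rejected:", len(unknown))
```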
Procedia PDF Downloads 182
13775 Identification of Landslide Features Using Back-Propagation Neural Network on LiDAR Digital Elevation Model
Authors: Chia-Hao Chang, Geng-Gui Wang, Jee-Cheng Wu
Abstract:
The prediction of a landslide is a difficult task because it requires a detailed study of past activity using a complete range of investigative methods to determine the changing conditions. In this research, a LiDAR digital elevation model (DEM) with 1-meter by 1-meter resolution was first used to generate six environmental factors related to landslides. Then, a back-propagation neural network (BPNN) was adopted to identify scarps, landslide areas, and non-landslide areas. The BPNN uses the six environmental factors as its input layer and a single output layer. Six landslide areas are used as training areas and four landslide areas as test areas. The number of hidden layers is set to 1 and 2; the number of hidden-layer neurons to 4, 5, 6, 7, and 8; and the learning rate to 0.01, 0.1, and 0.5. With one hidden layer of 7 neurons and a learning rate of 0.5, the network training root mean square error is 0.001388. Finally, evaluation of the BPNN classification accuracy by the confusion matrix shows that the overall accuracy reaches 94.4%, with a Kappa value of 0.7464.
Keywords: digital elevation model, DEM, environmental factors, back-propagation neural network, BPNN, LiDAR
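A sketch of the reported best configuration (six input factors, one hidden layer with 7 neurons, learning rate 0.5), using scikit-learn's MLPClassifier as the back-propagation network and synthetic data in place of the DEM-derived factors.

```python
# Sketch of the best reported setup: 6 inputs, one hidden layer of 7 neurons,
# learning rate 0.5. MLPClassifier is the stand-in BPNN; the data are synthetic.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import confusion_matrix, cohen_kappa_score

rng = np.random.default_rng(3)
X = rng.normal(size=(500, 6))                    # six factors (slope, aspect, curvature, ...)
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(0, 0.5, 500) > 0.8).astype(int)  # 1 = landslide

bpnn = MLPClassifier(hidden_layer_sizes=(7,), solver="sgd",
                     learning_rate_init=0.5, max_iter=2000, random_state=0)
bpnn.fit(X[:350], y[:350])

pred = bpnn.predict(X[350:])
print(confusion_matrix(y[350:], pred))
print("kappa:", round(cohen_kappa_score(y[350:], pred), 3))
```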
Procedia PDF Downloads 144
13774 Framing the Dynamics and Functioning of Different Variants of Terrorist Organizations: A Business Model Perspective
Authors: Eisa Younes Alblooshi
Abstract:
Counterterrorism strategies, to be effective and efficient, require a sound understanding of the dynamics and interlinked organizational elements of the terrorist outfits being combated, so that their strong points can be guarded against and their vulnerable zones targeted for optimal results in a timely fashion by counterterrorism agencies. A unique model of these organizational imperatives was developed in this research by likening terrorist organizations to traditional commercial ones, in order to understand in detail the dynamics of interconnectivity and dependencies, and the related compulsions facing the leaderships of such outfits, which provide counterterrorism agencies with opportunities for forging better strategies. The work involved assessing the evolving organizational dynamics and imperatives of different types of terrorist organizations, enabling the researcher to construct a prototype model that defines the progression and linkages of the related organizational elements. It required a detailed analysis of how the various elements are connected, with their sequencing identified, as any outfit positions itself with respect to its external environment and internal dynamics. A case study of a transnational, radical, religious, state-sponsored terrorist organization was conducted to validate the research findings and to further strengthen the specific counterterrorism strategies. Six variants of the business model of terrorist organizations were identified, categorized by their outreach, mission, and state-sponsorship status. The variants represent the vast majority of terrorist organizations acting locally or globally. The model shows the progression and dynamics of these organizations through various dimensions, including mission, leadership, outreach, and state-sponsorship status, resulting in the organizational structure, state of autonomy, preference divergence within the organization, recruitment core, and propagation avenues, down to their capacity to adapt and, critically, their life cycles. A major advantage of the model is the ability to map terrorist organizations to the identified variants, allowing for flexibility and differences within each, and enabling researchers and counterterrorism agencies to observe a clear blueprint of an organization's footprint, along with the areas to be evaluated for focused target-zone selection and timing of counterterrorism interventions. Special consideration is given to the dimension of financing, in the context of the latest developments regarding cryptocurrencies, hawala, and global anti-money-laundering initiatives. Specific counterterrorism strategies and intervention points have been identified for each of the model variants, with a view to efficient and effective deployment of resources.
Keywords: terrorism, counterterrorism, model, strategy
Procedia PDF Downloads 158
13773 Integration of Climatic Factors in the Meta-Population Modelling of the Dynamic of Malaria Transmission, Case of Douala and Yaoundé, Two Cities of Cameroon
Authors: Justin-Herve Noubissi, Jean Claude Kamgang, Eric Ramat, Januarius Asongu, Christophe Cambier
Abstract:
The goal of our study is to analyse the impact of climatic factors on malaria transmission, taking into account migration between Douala and Yaoundé, two cities in Cameroon. We show how variations of climatic factors such as temperature and relative humidity affect the spread of malaria. We propose a meta-population model of the transmission dynamics of malaria that evolves in space and time and that takes into account temperature, relative humidity, and migration between Douala and Yaoundé. We also integrate variations of environmental factors as events, also called mathematical impulses, that can disrupt the model's evolution at any time. Our modelling is done with the Discrete Event System Specification (DEVS) formalism. Our implementation runs on the Virtual Laboratory Environment (VLE), which uses the DEVS formalism and abstract simulators for coupling models.
Keywords: compartmental models, DEVS, discrete events, meta-population model, VLE
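For illustration, a two-patch compartmental sketch with migration and a temperature-modulated transmission rate is given below as ordinary differential equations; the paper's DEVS/VLE implementation, impulse events, and calibrated parameters are not reproduced, and all rates are assumptions.

```python
# Illustrative two-patch metapopulation sketch: SIR dynamics in Douala and
# Yaoundé coupled by migration, with a temperature-driven transmission rate.
# All parameters below are assumptions for demonstration only.
import numpy as np
from scipy.integrate import solve_ivp

def beta(t, patch):                          # assumed temperature-driven seasonality
    T = 26 + 3 * np.sin(2 * np.pi * t / 365 + (0.0 if patch == 0 else 1.0))
    return 0.25 * np.exp(-((T - 27) / 4) ** 2)

gamma = 1 / 14                               # recovery rate (assumed)
m = np.array([[0.0, 0.02], [0.03, 0.0]])     # daily migration rates between patches

def rhs(t, y):
    S, I, R = y.reshape(3, 2)
    N = S + I + R
    new_inf = np.array([beta(t, k) * S[k] * I[k] / N[k] for k in range(2)])
    dS = -new_inf + m.T @ S - m.sum(axis=1) * S
    dI = new_inf - gamma * I + m.T @ I - m.sum(axis=1) * I
    dR = gamma * I + m.T @ R - m.sum(axis=1) * R
    return np.concatenate([dS, dI, dR])

# state ordering: [S_douala, S_yaounde, I_douala, I_yaounde, R_douala, R_yaounde]
y0 = np.array([3.0e6, 2.8e6, 1.0e3, 5.0e2, 0.0, 0.0])
sol = solve_ivp(rhs, (0, 365), y0, t_eval=np.linspace(0, 365, 366))
print("peak infected, Douala:", int(sol.y[2].max()), "Yaoundé:", int(sol.y[3].max()))
```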
Procedia PDF Downloads 554
13772 Detection of Image Blur and Its Restoration for Image Enhancement
Authors: M. V. Chidananda Murthy, M. Z. Kurian, H. S. Guruprasad
Abstract:
Image restoration in the process of communication is one of the emerging fields in image processing. Motion analysis is the simplest way to detect motion in an image. Applications of motion analysis are widespread in many areas, such as surveillance, remote sensing, the film industry, and navigation of autonomous vehicles. A scene may contain multiple moving objects; using motion analysis techniques, the blur caused by the movement of the objects can be reduced by filling in occluded regions and reconstructing transparent objects, thereby removing the motion blurring. This paper presents the design and comparison of various motion detection and enhancement filters. The median filter, linear image deconvolution, inverse filter, pseudoinverse filter, Wiener filter, Lucy-Richardson filter, and blind deconvolution filter are used to remove the blur. In this work, we consider different types and different amounts of blur for the analysis. Mean Square Error (MSE) and Peak Signal to Noise Ratio (PSNR) are used to evaluate the performance of the filters. The designed system has been implemented in MATLAB and tested on synthetic and real-time images.
Keywords: image enhancement, motion analysis, motion detection, motion estimation
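A hedged sketch of two of the listed restoration filters, using scikit-image's Wiener and Richardson-Lucy deconvolution with PSNR as the quality metric; the MATLAB implementation is not reproduced, and the motion PSF and noise level are assumptions.

```python
# Apply a horizontal motion blur, then compare Wiener and Richardson-Lucy
# deconvolution by PSNR. The PSF and noise level are illustrative assumptions.
import numpy as np
from scipy.signal import convolve2d
from skimage import data, img_as_float
from skimage.restoration import wiener, richardson_lucy
from skimage.metrics import peak_signal_noise_ratio

image = img_as_float(data.camera())
psf = np.ones((1, 9)) / 9.0                              # horizontal motion blur kernel
blurred = convolve2d(image, psf, mode="same", boundary="symm")
blurred += np.random.default_rng(0).normal(0, 0.005, blurred.shape)   # mild noise

restored_wiener = wiener(blurred, psf, balance=0.01)
restored_rl = richardson_lucy(blurred, psf)

for name, img in [("blurred", blurred), ("Wiener", restored_wiener),
                  ("Richardson-Lucy", restored_rl)]:
    psnr = peak_signal_noise_ratio(image, np.clip(img, 0, 1), data_range=1.0)
    print(f"{name:16s} PSNR = {psnr:.2f} dB")
```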
Procedia PDF Downloads 288
13771 Implementation of an Economic – Probabilistic Model to Risk Analysis of ERP Project in Technological Innovation Firms – A Case Study of ICT Industry in Iran
Authors: Reza Heidari, Maryam Amiri
Abstract:
In a technological world, many countries tend to strengthen their companies and technological infrastructure. One of the most important requirements for developing technology is innovation, so all companies strive to treat innovation as a basic principle. Since developing a product requires combining different technologies, different innovative projects are run in firms as a basis for technology development. In such an environment, enterprise resource planning (ERP) has special significance for developing and strengthening innovation. In this article, an economic-probabilistic analysis is provided for an ERP implementation project in technological innovation (TI) based firms. The model assesses risk and economics simultaneously, in view of the probability of each event, combining the economic approach with the risk-investigation approach. To provide an economic-probabilistic analysis of the project's risk, the activities and milestones in the cash flow were extracted, and the probability of occurrence of each was assessed. Since resource planning in an innovative firm is the object of this project, we extracted the various risks related to the innovative project and evaluated them in the form of cash flow. By considering the risks affecting the project and the probability of each, and assigning them to the project's cash flow categories, the model presents a risk-adjusted cash flow based on Net Present Value (NPV) with a probabilistic simulation approach. The model thus presents a risk-adjusted economic analysis of the project: it measures the NPV of the project, focusing on the risks that have the greatest effect on technological innovation projects, and then measures the probability associated with the NPV for each category. Applying the presented model in the information and communication technology (ICT) industry provided an appropriate feasibility analysis of the project from the cash-flow point of view, based on the impact of risk on the project. The results can give decision makers a systematic, risk-adjusted economic analysis of the project's feasibility.
Keywords: cash flow categorization, economic evaluation, probabilistic, risk assessment, technological innovation
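The risk-adjusted NPV idea can be illustrated with a small Monte Carlo sketch in which each risk has a probability and a cash-flow impact in a given period; the cash flows, risks, and discount rate below are illustrative assumptions, not the ERP case data.

```python
# Minimal Monte Carlo sketch of a risk-adjusted NPV: risk events with assumed
# probabilities and impacts are assigned to cash-flow periods and simulated.
import numpy as np

rng = np.random.default_rng(4)
base_cash_flow = np.array([-500.0, 120.0, 180.0, 220.0, 260.0])   # per year, k$
risks = [  # (probability of occurrence, impact on cash flow, year affected)
    (0.30, -80.0, 1),    # implementation delay
    (0.20, -60.0, 2),    # integration rework
    (0.15, -40.0, 3),    # training overrun
]
rate = 0.12
n_sim = 100_000

years = np.arange(len(base_cash_flow))
discount = (1 + rate) ** years
npvs = np.empty(n_sim)
for s in range(n_sim):
    cf = base_cash_flow.copy()
    for p, impact, year in risks:
        if rng.random() < p:
            cf[year] += impact
    npvs[s] = np.sum(cf / discount)

print(f"mean risk-adjusted NPV: {npvs.mean():.1f} k$")
print(f"P(NPV > 0): {np.mean(npvs > 0):.2%}")
```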
Procedia PDF Downloads 404
13770 The Burden of Leptospirosis in Terms of Disability Adjusted Life Years in a District of Sri Lanka
Authors: A. M. U. P. Kumari, Vidanapathirana J., Amarasekara J., Karunanayaka L.
Abstract:
Leptospirosis is a zoonotic infection with significant morbidity and mortality. As an occupational disease, it has become a global concern due to its disease burden in endemic countries and rural areas. The aim of this study was to assess the disease burden of leptospirosis in terms of DALYs. A hospital-based descriptive cross-sectional study was conducted with 450 clinically diagnosed leptospirosis patients admitted to base and higher-level hospitals in Monaragala district, Sri Lanka, using a pretested interviewer-administered questionnaire. The patients were followed up until they returned to normal daily life after discharge. Estimation of DALYs was done using laboratory-confirmed leptospirosis patients. The leptospirosis disease burden in the Monaragala district was 44.9 DALYs per 100,000 population, which includes 33.18 YLLs and 10.9 YLDs. The incidence of leptospirosis in the Monaragala district during the study period was 59.8 per 100,000 population, and the case fatality rate (CFR) was 1.5% due to delays in health-seeking behaviour; 75% of deaths were among males and due to multi-organ failure. The disease burden of leptospirosis in the Monaragala district was significantly high, and urgent efforts to control and prevent leptospirosis should be a priority.
Keywords: human leptospirosis, disease burden, disability adjusted life years, Sri Lanka
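The DALY arithmetic used in such studies is DALY = YLL + YLD; the worked sketch below uses hypothetical inputs rather than the study's life tables, disability weights, case counts, or district population.

```python
# Worked sketch of the standard DALY arithmetic (DALY = YLL + YLD) with
# hypothetical inputs; none of the values below come from the study.
deaths = 2                      # fatal cases in the study period (assumed)
life_expectancy_lost = 30.0     # average residual life expectancy at death, years (assumed)
cases = 130                     # surviving laboratory-confirmed cases (assumed)
disability_weight = 0.133       # severity weight of the acute episode (assumed)
duration_years = 0.08           # average illness duration in years (assumed)
district_population = 450_000   # assumed population at risk

yll = deaths * life_expectancy_lost                  # years of life lost
yld = cases * disability_weight * duration_years     # years lived with disability
daly = yll + yld

print(f"YLL = {yll:.2f}, YLD = {yld:.2f}, DALY = {daly:.2f}")
print(f"burden: {daly / district_population * 100_000:.1f} DALYs per 100,000 population")
```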
Procedia PDF Downloads 234
13769 Prediction of Anticancer Potential of Curcumin Nanoparticles by Means of Quasi-QSAR Analysis Using Monte Carlo Method
Authors: Ruchika Goyal, Ashwani Kumar, Sandeep Jain
Abstract:
The experimental data for the anticancer potential of curcumin nanoparticles were compiled from eclectic (diverse) sources. The optimal descriptors were examined using the Monte Carlo method based CORAL SEA software. The statistical quality of the model is as follows: n = 14, R² = 0.6809, Q² = 0.5943, s = 0.175, MAE = 0.114, F = 26 (sub-training set); n = 5, R² = 0.9529, Q² = 0.7982, s = 0.086, MAE = 0.068, F = 61, Av Rm² = 0.7601, ∆R²m = 0.0840, k = 0.9856 and kk = 1.0146 (test set); and n = 5, R² = 0.6075 (validation set). These data can be used to build predictive QSAR models for anticancer activity.
Keywords: anticancer potential, curcumin, model, nanoparticles, optimal descriptors, QSAR
Procedia PDF Downloads 318
13768 Towards a Computational Model of Consciousness: Global Abstraction Workspace
Authors: Halim Djerroud, Arab Ali Cherif
Abstract:
We assume that conscious functions are implemented automatically; in other words, that consciousness, as well as the non-conscious aspects of human thought, planning, and perception, is produced by biologically adaptive algorithms. We propose that the mechanisms of consciousness can be reproduced using adaptive algorithms similar to those executed biologically. In this paper, we propose a computational model of consciousness, the "Global Abstraction Workspace", which is an internal model of the environment conceived as a multi-agent system. This system is able to evolve and to generate new data and processes, as well as actions in the environment.
Keywords: artificial consciousness, cognitive architecture, global abstraction workspace, multi-agent system
Procedia PDF Downloads 340
13767 Establishment of a Nomogram Prediction Model for Postpartum Hemorrhage during Vaginal Delivery
Authors: Yinglisong, Jingge Chen, Jingxuan Chen, Yan Wang, Hui Huang, Jing Zhang, Qianqian Zhang, Zhenzhen Zhang, Ji Zhang
Abstract:
Purpose: The study aims to establish a nomogram prediction model for postpartum hemorrhage (PPH) in vaginal delivery. Patients and Methods: Clinical data were retrospectively collected from vaginal delivery patients admitted to a hospital in Zhengzhou, China, from June 1, 2022, to October 31, 2022. Univariate and multivariate logistic regression were used to identify independent risk factors. A nomogram model for PPH in vaginal delivery was established based on the risk factor coefficients. Bootstrapping was used for internal validation. To assess discrimination and calibration, receiver operating characteristic (ROC) and calibration curves were generated for the derivation and validation groups. Results: A total of 1340 cases of vaginal delivery were enrolled, with 81 (6.04%) having PPH. Logistic regression indicated that history of uterine surgery, induction of labor, duration of the first stage of labor, neonatal weight, WBC value (during the first stage of labor), and cervical lacerations were all independent risk factors for hemorrhage (P < 0.05). The areas under the ROC curves (AUC) for the derivation and validation groups were 0.817 and 0.821, respectively, indicating good discrimination. Two calibration curves showed that nomogram predictions and observed results were highly consistent (P = 0.105, P = 0.113). Conclusion: The developed individualized risk prediction nomogram can assist midwives in recognizing and diagnosing high-risk groups for PPH and initiating early warning to reduce PPH incidence.
Keywords: vaginal delivery, postpartum hemorrhage, risk factor, nomogram
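A sketch of the modelling pipeline described (multivariate logistic regression on the listed predictors, ROC/AUC assessment, bootstrap internal validation) on simulated data; the actual cohort, coefficients, and nomogram rendering are not reproduced.

```python
# Logistic regression + ROC AUC + simple bootstrap validation on simulated data
# shaped like the listed predictors; all values and effect sizes are assumptions.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.utils import resample

rng = np.random.default_rng(5)
n = 1340
X = np.column_stack([
    rng.integers(0, 2, n),            # history of uterine surgery
    rng.integers(0, 2, n),            # induction of labor
    rng.normal(480, 120, n),          # duration of first stage of labor (min)
    rng.normal(3300, 450, n),         # neonatal weight (g)
    rng.normal(11, 3, n),             # WBC during first stage (10^9/L)
    rng.integers(0, 2, n),            # cervical laceration
])
logit = -4.0 + 0.8 * X[:, 0] + 0.6 * X[:, 1] + 0.004 * (X[:, 2] - 480) \
        + 0.001 * (X[:, 3] - 3300) + 0.1 * (X[:, 4] - 11) + 1.2 * X[:, 5]
y = rng.random(n) < 1 / (1 + np.exp(-logit))          # simulated PPH outcome

model = LogisticRegression(max_iter=2000).fit(X, y)
print("apparent AUC:", round(roc_auc_score(y, model.predict_proba(X)[:, 1]), 3))

# Bootstrap internal validation (illustration only, no optimism correction)
aucs = []
for _ in range(200):
    Xb, yb = resample(X, y)
    m = LogisticRegression(max_iter=2000).fit(Xb, yb)
    aucs.append(roc_auc_score(y, m.predict_proba(X)[:, 1]))
print("bootstrap AUC mean:", round(np.mean(aucs), 3))
```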
Procedia PDF Downloads 77
13766 Socioterritorial Inequalities in a Region of Chile. Beyond the Geography
Authors: Javier Donoso-Bravo, Camila Cortés-Zambrano
Abstract:
In this paper, we analyze socioterritorial inequalities in the region of Valparaiso (Chile) using secondary data covering economic, social, educational, and environmental dimensions for the thirty-six municipalities of the region. We examined a wide-ranging set of secondary data from public sources on economic activities, poverty, employment, income, years of education, access to post-secondary education, green areas, access to potable water, and more. We found sharp socioterritorial inequalities, based especially on the economic performance of each territory. The analysis shows, on the one hand, a dual and disorganized development model in some territories with strong economic activity (especially finance, real estate, mining, and vineyards) but, at the same time, poor social indicators. On the other hand, most territories show a dispersed model with little dynamic economic activity and very poor social development. Finally, we discuss how socioterritorial inequalities in the region of Valparaiso reflect the degree of globalization of the economic activities carried out in each territory.
Keywords: socioterritorial inequalities, development model, Chile, secondary data, Region of Valparaiso
Procedia PDF Downloads 101
13765 Improvement of Process Competitiveness Using Intelligent Reference Models
Authors: Julio Macedo
Abstract:
Several methodologies are now available for conceiving improvements that make a process competitive, for example total quality management, process reengineering, Six Sigma, and the define-measure-analyze-improve-control (DMAIC) method. These improvements are of different natures and can be external to the process, which is represented by an optimization model or a discrete simulation model. In addition, there are several process stakeholders, each with different desired performance levels for the process. Hence, the methodologies above lack a tool to aid in conceiving the required improvements. To fill this void, we suggest the use of intelligent reference models. A reference model is a set of qualitative differential equations and an objective function that minimizes the gap between the current and the desired performance indexes of the process. The reference models are intelligent: when they receive the current state of the problematic process and the desired performance indexes, they generate the required improvements for that process. The reference models are fuzzy cognitive maps augmented with an objective function and trained on the improvements implemented by high-performance firms. Experiments with a group of students show that the reference models allow them to conceive more improvements than students who do not use these models.
Keywords: continuous improvement, fuzzy cognitive maps, process competitiveness, qualitative simulation, system dynamics
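A minimal fuzzy cognitive map iteration is sketched below: concept activations are propagated through a signed weight matrix and squashed until they stabilise, and the gap to the desired performance indexes is reported; the concepts, weights, and targets are assumptions, not the trained reference models from the paper.

```python
# Minimal fuzzy cognitive map (FCM) sketch: iterate concept activations through
# a signed influence matrix until they stabilise, then report the gap to the
# desired performance indexes. Concepts, weights, and targets are assumptions.
import numpy as np

concepts = ["cycle time", "defect rate", "training", "automation", "competitiveness"]
# W[i, j]: causal influence of concept i on concept j (assumed weights)
W = np.array([
    [ 0.0,  0.0,  0.0,  0.0, -0.6],
    [ 0.0,  0.0,  0.0,  0.0, -0.7],
    [ 0.0, -0.4,  0.0,  0.0,  0.3],
    [-0.5, -0.3,  0.0,  0.0,  0.4],
    [ 0.0,  0.0,  0.0,  0.0,  0.0],
])

def squash(x):
    return 1 / (1 + np.exp(-x))              # keep activations in (0, 1)

state = np.array([0.8, 0.6, 0.2, 0.1, 0.3])  # current process performance indexes
for _ in range(30):
    new_state = squash(state + state @ W)
    if np.max(np.abs(new_state - state)) < 1e-4:
        break
    state = new_state

desired = np.array([0.4, 0.2, 0.7, 0.7, 0.8])
gap = desired - state                         # objective: minimise this gap
for name, g in zip(concepts, gap):
    print(f"{name:16s} gap to desired: {g:+.2f}")
```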
Procedia PDF Downloads 87
13764 Development of a Mathematical Theoretical Model and Simulation of the Electromechanical System for Wave Energy Harvesting
Authors: P. Valdez, M. Pelissero, A. Haim, F. Muiño, F. Galia, R. Tula
Abstract:
As a result of studies performed on the wave energy resource worldwide, a research project was set up to harvest wave energy and convert it into electrical energy. Within this framework, a theoretical model of the electromechanical energy harvesting system, developed with MATLAB's Simulink software, is presented. This tool recreates the site conditions where the device will be installed and offers valuable information about the amount of energy that can be harnessed. This research provides a deeper understanding of the utilization of wave energy in order to improve the efficiency of a 1:1 scale prototype of the device.
Keywords: electromechanical device, modeling, renewable energy, sea wave energy, simulation
Procedia PDF Downloads 488
13763 A Fast, Reliable Technique for Face Recognition Based on Hidden Markov Model
Authors: Sameh Abaza, Mohamed Ibrahim, Tarek Mahmoud
Abstract:
With the development of digital image processing and its wide use in applications such as medical and security systems, there is strong demand for more accurate techniques that are reliable, fast, and robust. In the field of security in particular, speed is of the essence. In this paper, a pattern recognition technique based on the Hidden Markov Model (HMM), K-means, and the Sobel operator is developed. The proposed technique proves to be fast with respect to some other techniques investigated for comparison. Moreover, it is capable of recognizing the normal face (center part) as well as the face boundary.
Keywords: HMM, K-Means, Sobel, accuracy, face recognition
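A sketch of the classic HMM-based face recognition idea with stand-in pieces: Sobel-filtered horizontal strips of a face form an observation sequence, one Gaussian HMM per subject is trained with hmmlearn, and a test face is assigned to the model with the highest log-likelihood; the data are synthetic and the paper's K-means quantisation step is simplified away.

```python
# HMM face recognition sketch: per-subject GaussianHMMs over Sobel strip
# features. Faces here are synthetic textures, not a real face dataset.
import numpy as np
from scipy.ndimage import sobel
from hmmlearn import hmm

rng = np.random.default_rng(6)

def strip_features(face, n_strips=8):
    edges = np.hypot(sobel(face, axis=0), sobel(face, axis=1))   # Sobel magnitude
    strips = np.array_split(edges, n_strips, axis=0)
    return np.array([[s.mean(), s.std()] for s in strips])       # 2 features per strip

# Two synthetic "subjects" differing in texture scale, five 32x32 samples each
subjects = {s: [rng.normal(scale=s + 1.0, size=(32, 32)) for _ in range(5)]
            for s in (0, 1)}

models = {}
for s, faces in subjects.items():
    X = np.vstack([strip_features(f) for f in faces])
    lengths = [8] * len(faces)                    # each face is one 8-strip sequence
    models[s] = hmm.GaussianHMM(n_components=3, covariance_type="diag",
                                n_iter=50, random_state=0).fit(X, lengths)

test_face = rng.normal(scale=2.0, size=(32, 32))  # texture like subject 1
scores = {s: m.score(strip_features(test_face)) for s, m in models.items()}
print("predicted subject:", max(scores, key=scores.get), scores)
```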
Procedia PDF Downloads 332
13762 Transformer Fault Diagnostic Predicting Model Using Support Vector Machine with Gradient Descent Optimization
Authors: R. O. Osaseri, A. R. Usiobaifo
Abstract:
The power transformer, which is responsible for voltage transformation, is of great relevance in the power system, and oil-immersed transformers are widely used all over the world. Prompt and proper maintenance of the transformer is of utmost importance. The dissolved gas content in power transformer oil is of enormous importance in detecting incipient faults of the transformer. Accurate prediction of incipient faults from transformer oil is needed to facilitate prompt maintenance, reduce cost, and minimize error. Fault prediction and diagnosis have been the focus of many researchers, and many previous works have reported the use of artificial intelligence to predict incipient transformer failures. In this study, a machine learning technique was employed, using gradient descent algorithms and the Support Vector Machine (SVM), to predict incipient transformer faults. The method focuses on creating a system that improves its performance based on previous results and historical data. The system design has two phases: training and testing. The gradient descent algorithm is trained with a training dataset, while the learned model is applied to a set of new data. These two datasets are used to demonstrate the accuracy of the proposed model. A transformer fault diagnostic model based on the Support Vector Machine (SVM) and gradient descent algorithms is thus presented, with satisfactory diagnostic capability and a higher success rate in predicting incipient transformer failures than existing diagnostic methods.
Keywords: diagnostic model, gradient descent, machine learning, support vector machine (SVM), transformer fault
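As an illustration, scikit-learn's SGDClassifier with hinge loss is a gradient-descent-trained linear SVM; the sketch below applies it to simulated dissolved-gas features, which stand in for the paper's dataset and tuning.

```python
# Gradient-descent-trained linear SVM (SGDClassifier, hinge loss) on simulated
# dissolved-gas-analysis features; the data and labels are assumptions.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import SGDClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(7)
n = 600
# Typical DGA inputs: H2, CH4, C2H6, C2H4, C2H2 concentrations (ppm), simulated
X = rng.lognormal(mean=3.0, sigma=1.0, size=(n, 5))
y = (0.6 * np.log(X[:, 3]) + 0.8 * np.log(X[:, 4] + 1) + rng.normal(0, 0.5, n) > 3.2)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
svm_sgd = make_pipeline(
    StandardScaler(),
    SGDClassifier(loss="hinge", alpha=1e-4, max_iter=2000, random_state=0),
)
svm_sgd.fit(X_tr, y_tr)
print("test accuracy:", round(accuracy_score(y_te, svm_sgd.predict(X_te)), 3))
```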
Procedia PDF Downloads 322
13761 Effect of Damping on Performance of Magnetostrictive Vibration Energy Harvester
Authors: Mojtaba Ghodsi, Hamidreza Ziaifar, Morteza Mohammadzaheri, Payam Soltani
Abstract:
This article presents an analytical model to estimate the power harvested from a magnetostrictive cantilevered beam with tip excitation. Furthermore, the effects of internal and external damping on the harvested power are investigated. The magnetostrictive material in this harvester is Galfenol. In comparison to other popular smart materials like Terfenol-D, Galfenol has higher strength and machinability. In this article, a mechanical model of the Euler-Bernoulli beam is first employed to calculate the deflection of the harvester. Then, the magneto-mechanical equation of Galfenol is combined with Faraday's law to calculate the generated voltage of the magnetostrictive cantilevered beam harvester. Finally, the beam model is incorporated into this combination. The results show that a 30×8.5×1 mm Galfenol cantilever beam harvester with an 80-turn pickup coil can generate up to 3.7 mV and 9 mW. Furthermore, a sensitivity analysis performed with the Response Surface Method (RSM) shows that the harvested power is only sensitive to the internal damping coefficient.
Keywords: internal damping coefficient, external damping coefficient, Euler-Bernoulli, energy harvester, Galfenol, magnetostrictive, response surface method
Procedia PDF Downloads 113
13760 A Development Model of Factors Affecting Decision Making to Select Successor in Family Business of Thailand
Authors: Polvasut Mahaiamsiri, Piraphong Foosiri
Abstract:
The purpose of this research is to explore a model of the factors affecting the decision to select a successor in Thai family businesses. A Structural Equation Model (SEM) was created from relevant theories and research. The causal factors of succession plan, recruitment process, and strategic planning are then examined and analysed to determine whether they have direct or indirect effects on the decision to select a successor in a family business. Units of analysis are selected from family businesses, totalling 300 samples. The population sampled consists of current owners or CEOs drawn proportionally from six district areas in Thailand using multi-stage sampling. A set of questionnaires is used to collect data. Structural equation modelling (SEM) using the AMOS 21 program is conducted to test the hypotheses, and confirmatory factor analysis is performed, showing that these variables can be tested. The findings reveal that these factors are separate constructs that combine to determine the decision to select a successor.
Keywords: succession plan, family business, recruitment process, strategic planning, decision making to select successor
Procedia PDF Downloads 208
13759 Predicting Success and Failure in Drug Development Using Text Analysis
Authors: Zhi Hao Chow, Cian Mulligan, Jack Walsh, Antonio Garzon Vico, Dimitar Krastev
Abstract:
Drug development is resource-intensive, time-consuming, and increasingly expensive with each developmental stage. The success rates of drug development are also relatively low, and the resources committed are wasted with each failed candidate. As such, a reliable method of predicting the success of drug development is in demand. The hypothesis was that some failed drug candidates are pushed through developmental pipelines based on false confidence and may possess common linguistic features identifiable through sentiment analysis. Here, the concept of using text analysis to discover such features in research publications and investor reports as predictors of success was explored. RStudio was used to perform text mining and lexicon-based sentiment analysis to identify affective phrases and determine their frequency in each document, and SPSS was then used to determine the relationship between the defined variables and the accuracy of predicting outcomes. A total of 161 publications were collected and categorised into 4 groups: (i) cancer treatment, (ii) neurodegenerative disease treatment, (iii) vaccines, and (iv) others (all other drugs that do not fit into the first 3 categories). Text analysis was then performed on each document within each drug category using 2 separate lexicons (BING and AFINN) to determine the frequency of positive or negative phrases in each document. Relative positivity and negativity values were then calculated by dividing the frequency of phrases by the word count of each document. Regression analysis was then performed with SPSS statistical software on each dataset (values obtained using the BING or AFINN lexicon during text analysis), using a random selection of 61 documents to construct a model. The remaining documents were then used to determine the predictive power of the models. The model constructed from the BING lexicon predicts the outcome of drug performance in clinical trials with an overall accuracy of 65.3%. The AFINN model had a lower accuracy (62.5%) than the BING model and was not effective at predicting the failure of drugs in clinical trials. Overall, the study did not show significant efficacy of the model at predicting the outcomes of drugs in development, and many improvements may need to be made in later iterations of the model to sufficiently increase the accuracy.
Keywords: data analysis, drug development, sentiment analysis, text-mining
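The scoring step can be sketched as follows: count lexicon hits in a document and normalise by word count to get relative positivity and negativity; the tiny inline lexicon is a stand-in for the BING/AFINN lexicons used in R, and the example document is invented.

```python
# Lexicon-based sentiment scoring sketch: relative positivity/negativity as
# lexicon-hit frequency divided by document word count. The word lists below
# are tiny stand-ins, not the real BING or AFINN lexicons.
import re

positive = {"significant", "improved", "effective", "tolerated", "promising"}
negative = {"failed", "toxicity", "adverse", "discontinued", "insufficient"}

def relative_sentiment(text):
    words = re.findall(r"[a-z']+", text.lower())
    pos = sum(w in positive for w in words)
    neg = sum(w in negative for w in words)
    n = max(len(words), 1)
    return pos / n, neg / n

doc = ("The candidate showed promising efficacy and was well tolerated, although "
       "two patients discontinued due to adverse events.")
rel_pos, rel_neg = relative_sentiment(doc)
print(f"relative positivity = {rel_pos:.3f}, relative negativity = {rel_neg:.3f}")
```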
Procedia PDF Downloads 158
13758 Copper Content in Daily Food Rations Planned and Served to Students from Selected Military Academies and Soldiers Doing Compulsory Military Service in the Polish Army
Authors: J. Bertrandt, A. Kłos, R. Waszkowski, T. Nowicki, R. Pytlak, E. Stęzycka, A. Gazdzinska
Abstract:
The aim of the work was to estimate copper intake from the daily food rations used for the alimentation of students of military high schools and soldiers doing compulsory military service in the Polish Army. The average planned copper content in the daily food rations used for the alimentation of students and soldiers amounted to 2.49±0.35 mg and 2.44±0.25 mg, respectively. The copper content in the daily food rations given for consumption to students ranged from 1.81±0.14 mg to 2.58±0.44 mg, while the daily food rations served to soldiers delivered from 2.06±0.45 mg to 2.13±0.33 mg. The copper content in the rations planned for students' and soldiers' alimentation was within the limits of the norms obligatory in Poland. Daily food rations given for consumption, except for the rations served to students, were within the limits of the recommended norms, but the food rations actually eaten by the examined men did not cover the copper requirements.
Keywords: copper, daily food ration, military service, food security, nutrition
Procedia PDF Downloads 275
13757 Study of Seismic Damage of Reinforced Concrete Frames of Variable Height with Logistic Statistic Function Distribution
Authors: P. Zarfam, M. Mansouri Baghbaderani
Abstract:
In seismic design, the proper reaction to the earthquake and the correct and accurate prediction of its subsequent effects on the structure are critical. Choosing a proper probability distribution, one that gives a more realistic probability of the structure's damage rate, is essential in damage discussions. With the development of performance-based design, the analytical method of modal pushover, an inexpensive, effective, and quick way of estimating a structure's seismic response, is broadly used in engineering contexts. In this research, three concrete frames of 3, 6, and 13 stories are analysed with nonlinear modal pushover under 30 different earthquake records using OpenSEES software; the damage indexes of roof displacement and inter-story drift ratio are then calculated with respect to two parameters: peak ground acceleration and spectral acceleration. These indexes are used to establish damage relations with the log-normal distribution and the logistic distribution. Finally, the values of these relations are compared, and the effect of height on the damage relations is studied as well.
Keywords: modal pushover analysis, concrete structure, seismic damage, log-normal distribution, logistic distribution
Procedia PDF Downloads 246
13756 Strabismus Detection Using Eye Alignment Stability
Authors: Anoop T. R., Otman Basir, Robert F. Hess, Ben Thompson
Abstract:
Strabismus refers to a misalignment of the eyes. Early detection and treatment of strabismus in childhood can prevent the development of permanent vision loss due to abnormal development of visual brain areas. Currently, many children with strabismus remain undiagnosed until school entry because current automated screening methods have limited success in the preschool age range. A method for strabismus detection using eye alignment stability (EAS) is proposed. This method starts with face detection, followed by facial landmark detection, eye region segmentation, eye gaze extraction, and eye alignment stability estimation. Binarization and morphological operations are performed for segmenting the pupil region from the eye. After finding the EAS, its absolute value is used to differentiate the strabismic eye from the non-strabismic eye. If the value of the eye alignment stability is greater than a particular threshold, then the eyes are misaligned, and if its value is less than the threshold, the eyes are aligned. The method was tested on 175 strabismic and non-strabismic images obtained from Kaggle and Google Photos. The strabismic eye is taken as a positive class, and the non-strabismic eye is taken as a negative class. The test produced a true positive rate of 100% and a false positive rate of 7.69%.
Keywords: strabismus, face detection, facial landmarks, eye segmentation, eye gaze, binarization
Procedia PDF Downloads 77
13755 The Pioneering Model in Teaching Arabic as a Mother Tongue through Modern Innovative Strategies
Authors: Rima Abu Jaber Bransi, Rawya Jarjoura Burbara
Abstract:
This study deals with two pioneering approaches to teaching Arabic as a mother tongue: first, the computerization of literary and functional texts in the mother tongue; second, a pioneering model for teaching writing skills through computerization. The significance of the study lies in its treatment of a serious problem faced in the era of technology: the widening gap between pupils and their mother tongue. The innovation of the study is that it introduces modern methods and tools and a pioneering instructional model that turns mother tongue teaching into an effective, meaningful, interesting, and motivating experience. In view of Arabic diglossia (standard Arabic and spoken Arabic), which poses a serious problem for pupils in understanding unfamiliar words, and in order to bridge the gap between pupils and their mother tongue, we resorted to computerized techniques: we took texts from the pre-Islamic period (Jahiliyya), starting with the Mu'allaqa of Imru' al-Qais, and other selected functional texts, and computerized them for teaching in an interesting way that saves time and effort, develops higher-order thinking strategies, broadens pupils' literary taste, and gives the text added value that neither the book, the blackboard, the teacher, nor worksheets provide. On the other hand, we developed a pioneering computerized model that aims to develop the pupil's ability to think, to nourish his imagination with the elements of growth, invention, and connection, to motivate him to be creative, and to raise the level of his scores and scholastic achievement. The model consists of four basic teaching stages in the following order: 1. the preparatory stage, 2. the reading comprehension stage, 3. the writing stage, 4. the evaluation stage. Our lecture will give a detailed description of the model with illustrations and samples from the units we built, highlighting aspects of the uniqueness and innovation specific to this model and the different integrated tools and techniques we developed. One of the most significant conclusions of this research is that teaching languages through new computerized strategies is very likely to move Arabic-speaking pupils out of the circle of passive reception into active and serious action and interaction. The study also emphasizes the argument that the computerized model of teaching can change the role of the pupil's mind from being a short-term store of knowledge into a partner in producing knowledge and storing it in a coherent way that prevents forgetting and keeps it in memory for a long period of time. Consequently, learners also become partners in evaluation by expressing their views, giving their notes and observations, and applying the method of peer teaching and learning.
Keywords: classical poetry, computerization, diglossia, writing skill
Procedia PDF Downloads 225
13754 Potential Contribution of Combined High-Resolution and Fluorescence Remote Sensing to Coastal Ecosystem Service Assessments
Authors: Yaner Yan, Ning Li, Yajun Qiao, Shuqing An
Abstract:
Although most studies have focused on assessing and mapping terrestrial ecosystem services, there is still a knowledge gap on coastal ecosystem services and an urgent need to assess them. Lau (2013) clearly defined five types of coastal ecosystem services: carbon sequestration, shoreline protection, fish nursery, biodiversity, and water quality. High-resolution remote sensing can provide more direct, spatially explicit estimates of biophysical parameters, such as species distributions relating to the biodiversity service, while fluorescence information derived from remote sensing relates directly to photosynthesis, helping in the estimation of carbon sequestration and of the response to environmental changes in coastal wetlands. Here, we review the capabilities of high-resolution and fluorescence remote sensing for describing biodiversity, vegetation condition, and ecological processes, and highlight how these products may contribute to coastal ecosystem service assessment. In so doing, we anticipate rapid progress in combining high-resolution and fluorescence remote sensing to estimate the spatial pattern of coastal ecosystem services.
Keywords: ecosystem services, high resolution, remote sensing, chlorophyll fluorescence
Procedia PDF Downloads 507
13753 Performance Analysis of Elliptic Curve Cryptography Using Onion Routing to Enhance the Privacy and Anonymity in Grid Computing
Authors: H. Parveen Begam, M. A. Maluk Mohamed
Abstract:
Grid computing is an environment that allows the sharing and coordinated use of diverse resources in dynamic, heterogeneous, and distributed environments using Virtual Organizations (VO). Security is a critical issue due to the open nature of the wireless channels in grid computing, which requires three fundamental services: authentication, authorization, and encryption. Privacy and anonymity are considered important factors when communicating over a publicly spanned network like the web. To ensure a high level of security, we explored an extension of onion routing, used with dynamic token exchange along with protection of the privacy and anonymity of the individual identity. To improve the performance of encrypting the layers, elliptic curve cryptography is used. Compared to traditional cryptosystems like RSA (Rivest-Shamir-Adleman), ECC (Elliptic Curve Cryptosystem) offers equivalent security with smaller key sizes, which results in faster computation, lower power consumption, and memory and bandwidth savings. This paper presents an estimation of the performance improvement of onion routing using ECC, as well as a comparison of the performance levels of RSA and ECC.
Keywords: grid computing, privacy, anonymity, onion routing, ECC, RSA
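Conceptually, onion routing wraps a message in one encryption layer per relay, and each relay peels exactly one layer; in the scheme discussed, the per-hop keys would come from ECC key exchanges (e.g. ECDH), which are omitted in the sketch below, where random AES-GCM keys stand in.

```python
# Onion-layered encryption sketch: one symmetric layer per relay. The per-hop
# keys would normally be derived from ECC (e.g. ECDH) handshakes; random keys
# stand in here, so this only illustrates the layering, not the key exchange.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

hops = ["entry", "middle", "exit"]
keys = {h: AESGCM.generate_key(bit_length=128) for h in hops}   # stand-in per-hop keys

def build_onion(message: bytes) -> bytes:
    data = message
    for hop in reversed(hops):                 # innermost layer for the exit node
        nonce = os.urandom(12)
        data = nonce + AESGCM(keys[hop]).encrypt(nonce, data, None)
    return data

def peel(hop: str, data: bytes) -> bytes:
    nonce, ciphertext = data[:12], data[12:]
    return AESGCM(keys[hop]).decrypt(nonce, ciphertext, None)

onion = build_onion(b"grid job request")
for hop in hops:                               # each relay removes one layer in order
    onion = peel(hop, onion)
print(onion)                                   # b'grid job request'
```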
Procedia PDF Downloads 398
13752 Factors Influencing the Adoption of Social Media as a Medium of Public Service Broadcasting
Authors: Seyed Mohammadbagher Jafari, Izmeera Shiham, Masoud Arianfar
Abstract:
The increased use of social media for different purposes makes it important to develop an understanding of users and their attitudes toward these sites, and, moreover, of the uses of such sites in a broader perspective such as broadcasting. This quantitative study addressed the problem of factors influencing the adoption of social media as a medium of public service broadcasting in the Republic of Maldives. These powerful and increasingly usable tools, accompanied by large public social media datasets, are ushering in a golden age of social science by empowering researchers to measure social behavior on a scale never before possible. The study was conducted by exploring social responses to the use of social media. The research model was developed based on previous models such as TAM and DOI combined with a trust model. It evaluates the influence of perceived ease of use, perceived usefulness, trust, complexity, compatibility, and relative advantage on the adoption of social media. The model was tested on a sample of 365 Maldivian people using a questionnaire survey. The results showed that perceived usefulness, trust, relative advantage, and complexity strongly influence the adoption of social media.
Keywords: adoption, broadcasting, Maldives, social media
Procedia PDF Downloads 483
13751 Age–Related Changes of the Sella Turcica Morphometry in Adults Older Than 20-25 Years
Authors: Yu. I. Pigolkin, M. A. Garcia Corro
Abstract:
Age determination of unknown dead bodies in forensic personal identification is a complicated process that involves the application of numerous methods and techniques. Skeletal remains are less exposed to the influence of environmental factors. In order to enhance the accuracy of forensic age estimation, additional properties of bones correlating with age need to be revealed. Material and Methods: Dimensional examination of the sella turcica was carried out on cadavers with the cranium opened with a circular vibrating saw. The sample consisted of a total of 90 Russian subjects, ranging in age from two months to 87 years. Results: A tendency of dimensional variation throughout life was detected. No gender differences were observed in the morphometry of the sella turcica. The combined use of sella turcica depth and length values revealed the possibility of assigning an examined sample to a certain age period. Conclusions: Based on the results of existing methods of age determination, the morphometry of the sella turcica can be an additional characteristic, supplementing the values obtained and accordingly increasing the accuracy of forensic biological age diagnosis.
Keywords: age-related changes in bone structures, forensic personal identification, sella turcica morphometry, body identification
Procedia PDF Downloads 275
13750 A Sparse Representation Speech Denoising Method Based on Adapted Stopping Residue Error
Authors: Qianhua He, Weili Zhou, Aiwu Chen
Abstract:
A sparse representation speech denoising method based on an adapted stopping residue error is presented in this paper. Firstly, the cross-correlation between the clean speech spectrum and the noise spectrum is analyzed, and an estimation method is proposed. In the denoising method, an over-complete dictionary of the clean speech power spectrum is learned with the K-singular value decomposition (K-SVD) algorithm. In the sparse representation stage, the stopping residue error is adaptively set according to the estimated cross-correlation and the adjusted noise spectrum, and the orthogonal matching pursuit (OMP) approach is applied to reconstruct the clean speech spectrum from the noisy speech. Finally, the clean speech is re-synthesised via the inverse Fourier transform from the reconstructed speech spectrum and the noisy speech phase. The experimental results show that the proposed method outperforms conventional methods in terms of subjective and objective measures.
Keywords: speech denoising, sparse representation, k-singular value decomposition, orthogonal matching pursuit
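A minimal sketch of OMP with an adaptive stopping criterion on the residual norm, using a random dictionary as a stand-in for a K-SVD-learned dictionary of clean-speech spectra; the threshold rule shown is an assumption, not the paper's estimator.

```python
# OMP with an adaptive stopping residue: stop adding atoms once the residual
# norm falls below a noise-dependent threshold. Dictionary and threshold rule
# are illustrative stand-ins for the K-SVD dictionary and the paper's estimator.
import numpy as np

rng = np.random.default_rng(8)
n_atoms, dim = 128, 64
D = rng.normal(size=(dim, n_atoms))
D /= np.linalg.norm(D, axis=0)                   # unit-norm dictionary atoms

x_true = np.zeros(n_atoms)
x_true[rng.choice(n_atoms, 5, replace=False)] = rng.normal(0, 2, 5)
noise = rng.normal(0, 0.05, dim)
y = D @ x_true + noise                           # "noisy speech spectrum" stand-in

stop_residue = 1.1 * np.linalg.norm(noise)       # adaptive threshold from noise estimate

support, residual, coef = [], y.copy(), np.zeros(0)
while np.linalg.norm(residual) > stop_residue and len(support) < dim:
    k = int(np.argmax(np.abs(D.T @ residual)))   # atom most correlated with residual
    if k not in support:
        support.append(k)
    coef, *_ = np.linalg.lstsq(D[:, support], y, rcond=None)
    residual = y - D[:, support] @ coef

x_hat = np.zeros(n_atoms)
x_hat[support] = coef
print("selected atoms:", sorted(support))
print("reconstruction error:", round(np.linalg.norm(D @ x_hat - D @ x_true), 4))
```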
Procedia PDF Downloads 499