Search results for: harmony search algorithms
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 3820

2800 Implications of Learning Resource Centre in a Web Environment

Authors: Darshana Lal, Sonu Rana

Abstract:

Learning Resource Centres (LRCs) acquire different kinds of documents, such as books, journals, theses, dissertations, standards, and databases, in both print and electronic form. This article deals with the different types of sources available in an LRC. It also discusses the concept of the web as a tool and as a multimedia system, and the different interfaces available on the web. The reasons for establishing an LRC are highlighted, along with its assignments. Different features of LRCs, such as self-learning and group learning, are described, together with the group of activities they support, such as reading, learning, and education. The use of the LRC by students and faculty is presented, and the article concludes with its benefits.

Keywords: internet, search engine, resource centre, OPAC, self-learning, group learning

Procedia PDF Downloads 364
2799 Comparative Evaluation of the Effectiveness of Different Mindfulness-Based Interventions on Medically Unexplained Symptoms: A Systematic Review

Authors: R. R. Billones, N. Lukkahatai, L. N. Saligan

Abstract:

Mindfulness-based interventions (MBIs) have been used for medically unexplained symptoms (MUS). This systematic review describes the literature investigating the general effect of MBIs on MUS and identifies the effects of specific MBIs on specific MUS conditions. The Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines and the modified Oxford quality scoring system (JADAD) were applied to the review, yielding an initial 1,556 articles. The search engines included PubMed, ScienceDirect, Web of Science, Scopus, EMBASE, and PsycINFO, using the search terms: mindfulness, or meditation, or mindful, or MBCT, or MBSR, and medically unexplained symptoms, or MUS, or fibromyalgia, or FMS. A total of 24 articles were included in the final systematic review. MBIs showed large effects on socialization skills in chronic fatigue syndrome (d=0.65), anger in fibromyalgia (d=0.61), improvement of somatic symptoms (d=1.6) and sleep (d=1.12) in painful conditions, physical health in chronic back pain (d=0.51), and disease intensity in irritable bowel disease/syndrome (d=1.13). A manualized MBI that applies the four fundamental elements present in all types of interventions was critical to efficacy. These elements were psycho-education sessions designed to foster a better understanding of the medical symptoms, the practice of awareness, the non-judgmental observance of the experience in the moment, and compassion toward one's self. Evaluating the effectiveness of different mindfulness interventions requires attention to the gaps identified in home-based practice monitoring, competency training of mindfulness teachers, and the sound psychometric properties of measures of mindfulness practice.

Keywords: mindfulness-based interventions, medically unexplained symptoms, mindfulness-based cognitive therapy, mindfulness-based stress reduction, fibromyalgia, irritable bowel syndrome

Procedia PDF Downloads 129
2798 The Use of Software and Internet Search Engines to Develop the Encoding and Decoding Skills of a Dyslexic Learner: A Case Study

Authors: Rabih Joseph Nabhan

Abstract:

This case study explores the impact of two major computer software programs, Learn to Speak English and Learn English Spelling and Pronunciation, and of Internet search engines such as Google, on mending the decoding and spelling deficiency of Simon X, a dyslexic student. Improvement in decoding and spelling may result in better reading comprehension and composition writing. Suitable computer programs and Internet materials can help regain the missing awareness and consequently restore his self-confidence and self-esteem. In addition, this study provides a systematic plan comprising a set of activities (four computer programs and Internet materials) that address the problem from the lowest to the highest levels of phonemic and phonological awareness. Four methods of data collection (accounts, observations, published tests, and interviews) create the triangulation needed to collect data validly and reliably before, during, and after the plan. The data collected were analyzed quantitatively, qualitatively, or through a combination of both, and tables and figures are utilized to provide a clear and uncomplicated illustration of some of the data. The improvement that occurred in decoding, spelling, reading comprehension, and composition writing is demonstrated through authentic materials produced by the student under study: a comparison of two sample passages written by the learner before and after the plan, a genuine computer chat conversation, and the scores of the academic year that followed the execution of the plan. Based on these results, the researcher recommends further studies on other Lebanese dyslexic learners using the computer to mend their language problems, in order to design a more reliable software program that can address this disability more efficiently and successfully.

Keywords: analysis, awareness, dyslexic, software

Procedia PDF Downloads 205
2797 The Benefits of End-To-End Integrated Planning from the Mine to Client Supply for Minimizing Penalties

Authors: G. Martino, F. Silva, E. Marchal

Abstract:

The control over delivered iron ore blend characteristics is one of the most important aspects of the mining business. The iron ore price is a function of its composition, which is the outcome of the beneficiation process, so end-to-end integrated planning of mine operations can reduce the risk of penalties on the iron ore price. In a standard iron mining company, the production chain is composed of mining, ore beneficiation, and client supply. When mine planning and client supply decisions are made without coordination, the beneficiation plant struggles to deliver the best blend possible. Technological improvements in several fields have allowed bridging the gap between departments and boosting integrated decision-making processes. Clusterization and classification algorithms applied to historical production data generate reasonable predictions of the quality and volume of iron ore produced for each pile of run-of-mine (ROM) ore processed. Mathematical modeling can use those deterministic relations to propose iron ore blends that better fit specifications within a delivery schedule. Additionally, a model capable of representing the whole production chain can clearly compare the overall impact of different decisions in the process. This study shows how flexibilization, combined with a planning optimization model linking the mine and the ore beneficiation processes, can reduce the risk of out-of-specification deliveries. The model's capabilities are illustrated on a hypothetical iron ore mine with a magnetic separation process. Finally, this study shows ways of reducing costs or increasing profits by optimizing process indicators across the production chain and integrating the different planning stages with sales decisions.
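As an illustrative sketch only (the paper's actual model is a full production-chain formulation), the blend-selection idea can be reduced to choosing a subset of ROM piles whose tonnage-weighted grades meet a client specification. All pile data, grades, and specification limits below are invented for the example.

```python
from itertools import combinations

# Hypothetical ROM piles: (pile id, tonnes, Fe grade %, silica %)
piles = [
    ("P1", 10_000, 64.5, 3.1),
    ("P2", 8_000, 61.2, 5.4),
    ("P3", 12_000, 66.0, 2.2),
    ("P4", 9_000, 59.8, 6.0),
    ("P5", 11_000, 63.4, 4.0),
]

# Hypothetical client spec: Fe >= 63.0 %, silica <= 4.5 %
FE_MIN, SIO2_MAX = 63.0, 4.5

def blend_quality(subset):
    """Tonnage-weighted average grades of a blend of piles."""
    tonnes = sum(p[1] for p in subset)
    fe = sum(p[1] * p[2] for p in subset) / tonnes
    sio2 = sum(p[1] * p[3] for p in subset) / tonnes
    return tonnes, fe, sio2

# Enumerate blends of 2-4 piles and keep those inside spec,
# preferring the largest deliverable tonnage (a crude proxy for revenue).
best = None
for r in range(2, 5):
    for subset in combinations(piles, r):
        tonnes, fe, sio2 = blend_quality(subset)
        if fe >= FE_MIN and sio2 <= SIO2_MAX:
            if best is None or tonnes > best[0]:
                best = (tonnes, fe, sio2, [p[0] for p in subset])

print(best)
```

A real planner would replace this brute-force enumeration with the paper's mathematical programming model, but the feasibility check against the blend specification is the same idea.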

Keywords: clusterization and classification algorithms, integrated planning, mathematical modeling, optimization, penalty minimization

Procedia PDF Downloads 113
2796 Peril's Environment of Energetic Infrastructure Complex System, Modelling by the Crisis Situation Algorithms

Authors: Jiří F. Urbánek, Alena Oulehlová, Hana Malachová, Jiří J. Urbánek Jr.

Abstract:

Crisis situation investigation and modelling are introduced and carried out within the complex system of energetic critical infrastructure operating in perilous environments. Every crisis situation and peril has its origin in the occurrence of an emergency or crisis event, and both require the assessment of critical/crisis interfaces. An emergency event may be expected, in which case crisis scenarios can be pre-prepared by the pertinent organizational crisis management authorities for coping with it, or it may be unexpected, without a pre-prepared scenario; both, however, need operational coping by means of crisis management. The operation, forms, characteristics, behaviour, and utilization of crisis management have various qualities, depending on the actual perils facing the critical infrastructure organization and on its prevention and training processes. The aim is always better security and continuity of the organization, and achieving it requires finding and investigating the critical/crisis zones and functions in models of the critical infrastructure organization operating in the pertinent peril environment. Our DYVELOP (Dynamic Vector Logistics of Processes) method is available for this purpose. It is necessary to derive and create an identification algorithm for critical/crisis interfaces; the locations of these interfaces are the flags of a crisis situation in models of the critical infrastructure organization. The model of a crisis situation is then displayed for a real Czech energetic critical infrastructure entity in a real peril environment. These efficient measures are necessary for infrastructure protection. They will be derived for peril mitigation, crisis situation coping, and for the environmentally friendly survival, continuity, and sustainable development of the organization.

Keywords: algorithms, energetic infrastructure complex system, modelling, peril's environment

Procedia PDF Downloads 388
2795 Numerical Iteration Method to Find New Formulas for Nonlinear Equations

Authors: Kholod Mohammad Abualnaja

Abstract:

A new algorithm is presented to derive new iterative methods for solving nonlinear equations F(x)=0 by using the variational iteration method. The efficiency of the considered method is illustrated by an example. The results show that the proposed iteration technique, without linearization or small perturbation, is very effective and convenient.
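The paper's own formulas are not reproduced here, but the best-known scheme the variational iteration method recovers is Newton's method, obtained by choosing the Lagrange multiplier λ = -1/F'(xₙ) in the correction functional. A minimal sketch, using a standard benchmark equation chosen for illustration:

```python
def vim_newton(F, dF, x0, tol=1e-12, max_iter=50):
    """Newton-type iteration, which the variational iteration method
    recovers with Lagrange multiplier lambda = -1/F'(x_n):
        x_{n+1} = x_n - F(x_n) / F'(x_n)
    """
    x = x0
    for _ in range(max_iter):
        step = F(x) / dF(x)
        x -= step
        if abs(step) < tol:
            break
    return x

# Example: solve x**3 + 4*x**2 - 10 = 0, a common test equation
root = vim_newton(lambda x: x**3 + 4*x**2 - 10,
                  lambda x: 3*x**2 + 8*x,
                  x0=1.5)
print(round(root, 6))  # → 1.36523
```

Other choices of the multiplier in the correction functional yield higher-order schemes, which is the kind of derivation the abstract describes.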

Keywords: variational iteration method, nonlinear equations, Lagrange multiplier, algorithms

Procedia PDF Downloads 524
2794 Association Rules Mining Task Using Metaheuristics: Review

Authors: Abir Derouiche, Abdesslem Layeb

Abstract:

Association rule mining (ARM) is one of the most popular data mining tasks, and it is widely used in various areas. The search for association rules is an NP-complete problem, which is why metaheuristics have been widely used to solve it. The present paper formulates ARM as an optimization problem and surveys the approaches proposed in the literature based on metaheuristics.
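As a toy illustration of ARM cast as an optimization problem (not taken from the survey), a candidate rule A ⇒ C can be scored by a weighted sum of its support and confidence, and a metaheuristic then searches the rule space for high-scoring rules. The transactions and weights below are invented, and random restarts stand in for a real metaheuristic such as a genetic algorithm:

```python
import random

# Toy transaction database (hypothetical market-basket data)
transactions = [
    {"bread", "milk"},
    {"bread", "butter"},
    {"milk", "butter", "bread"},
    {"milk", "beer"},
    {"bread", "milk", "butter"},
]
items = sorted({i for t in transactions for i in t})

def support(itemset):
    """Fraction of transactions containing the itemset."""
    return sum(itemset <= t for t in transactions) / len(transactions)

def fitness(antecedent, consequent):
    """Weighted support/confidence objective, as in metaheuristic ARM."""
    sup_all = support(antecedent | consequent)
    sup_ant = support(antecedent)
    if sup_ant == 0:
        return 0.0
    confidence = sup_all / sup_ant
    return 0.5 * sup_all + 0.5 * confidence

def random_rule():
    """Sample a single-item antecedent and a distinct consequent."""
    ant = {random.choice(items)}
    cons = {random.choice([i for i in items if i not in ant])}
    return ant, cons

random.seed(0)
best_rule, best_fit = None, -1.0
for _ in range(200):          # random restarts as a trivial "metaheuristic"
    ant, cons = random_rule()
    f = fitness(ant, cons)
    if f > best_fit:
        best_rule, best_fit = (ant, cons), f
print(best_rule, round(best_fit, 3))
```

A genuine metaheuristic would add neighborhood moves, crossover, or pheromone updates on top of this fitness function; the rule encoding and the objective are the parts the surveyed approaches share.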

Keywords: optimization, metaheuristics, data mining, association rules mining

Procedia PDF Downloads 146
2793 Development of Academic Software for Medial Axis Determination of Porous Media from High-Resolution X-Ray Microtomography Data

Authors: S. Jurado, E. Pazmino

Abstract:

Determination of the medial axis of a porous media sample is a non-trivial problem of interest for several disciplines, e.g., hydrology, fluid dynamics, contaminant transport, filtration, and oil extraction. However, the computational tools available to researchers are limited and restricted. The primary aim of this work was to develop a series of algorithms to extract porosity, medial axis structure, and pore-throat size distributions from porous media domains. A complementary objective was to provide the algorithms as free computational software available to the academic community of researchers and students interested in 3D data processing. The burn algorithm was tested on porous media data obtained from High-Resolution X-Ray Microtomography (HRXMT) and on idealized computer-generated domains. The real data and idealized domains were discretized into voxel domains of 550³ elements and binarized to denote solid and void regions in order to determine porosity. Subsequently, the algorithm identifies the layer of void voxels next to the solid boundaries. An iterative process removes or 'burns' void voxels layer by layer until all the void space is characterized. Multiple strategies were tested to optimize execution time and computer memory use, i.e., segmentation of the overall domain into subdomains, vectorization of operations, and extraction of single burn layer data during the iterative process. The medial axis determination was conducted by identifying regions where burnt layers collide. The final medial axis structure was refined to avoid concave-grain effects and utilized to determine the pore-throat size distribution. A graphical user interface was developed to encompass all these algorithms, including the generation of idealized porous media domains. The software accepts HRXMT data as input, calculates porosity, medial axis, and pore-throat size distribution, and provides output in tabular and graphical formats.
Preliminary tests of the software developed during this study achieved medial axis, pore-throat size distribution, and porosity determination of 100³, 320³, and 550³ voxel porous media domains in 2, 22, and 45 minutes, respectively, on a personal computer (Intel i7 processor, 16 GB RAM). These results indicate that the software is a practical and accessible tool for postprocessing HRXMT data in the academic community.
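The layer-by-layer burn can be sketched in two dimensions as a multi-source breadth-first search seeded at the void voxels touching solid: the burn number of a voxel is the layer in which it is consumed, and void voxels with no higher-burn neighbour approximate the medial axis (before the concave-grain refinement the abstract describes). The grid below is an invented toy domain, not HRXMT data:

```python
from collections import deque

# Binarized 2D domain: 1 = void (pore), 0 = solid (hypothetical toy sample)
grid = [
    [0, 0, 0, 0, 0, 0, 0],
    [0, 1, 1, 1, 1, 1, 0],
    [0, 1, 1, 1, 1, 1, 0],
    [0, 1, 1, 1, 1, 1, 0],
    [0, 0, 0, 0, 0, 0, 0],
]
rows, cols = len(grid), len(grid[0])

# Multi-source BFS: burn void cells layer by layer, starting from the
# void cells adjacent to solid. burn[r][c] = layer number (1 = first layer).
burn = [[0] * cols for _ in range(rows)]
queue = deque()
for r in range(rows):
    for c in range(cols):
        if grid[r][c] == 1 and any(
            grid[nr][nc] == 0
            for nr, nc in ((r-1, c), (r+1, c), (r, c-1), (r, c+1))
            if 0 <= nr < rows and 0 <= nc < cols
        ):
            burn[r][c] = 1
            queue.append((r, c))
while queue:
    r, c = queue.popleft()
    for nr, nc in ((r-1, c), (r+1, c), (r, c-1), (r, c+1)):
        if 0 <= nr < rows and 0 <= nc < cols \
                and grid[nr][nc] == 1 and burn[nr][nc] == 0:
            burn[nr][nc] = burn[r][c] + 1
            queue.append((nr, nc))

# Porosity = void fraction; medial axis ~ cells with no higher-burn neighbour
# (unrefined: corners survive, which the real software's refinement removes).
porosity = sum(v for row in grid for v in row) / (rows * cols)
medial = [(r, c) for r in range(rows) for c in range(cols)
          if burn[r][c] and not any(
              0 <= nr < rows and 0 <= nc < cols and burn[nr][nc] > burn[r][c]
              for nr, nc in ((r-1, c), (r+1, c), (r, c-1), (r, c+1)))]
print(porosity, medial)
```

The 3D version replaces the four neighbours with six (or more) voxel neighbours; segmentation into subdomains and vectorization, as in the abstract, are then what make 550³ domains tractable.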

Keywords: medial axis, pore-throat distribution, porosity, porous media

Procedia PDF Downloads 101
2792 Observing Upin and Ipin Animation Roles in Early Childhood Education

Authors: Juhanita Jiman

Abstract:

Malaysia is a unique country with a multifaceted society, rich in beautiful cultural values. It has been a long assimilation process for Malaysia to forge its national identity, and the Malaysian government has long worked hard to keep its people together in harmony. Cultural identity is identified as the 'container' that brings Malaysians together, and the uniqueness of Malaysian cultures can be exploited for the benefit of the nation. However, this unique culture is somehow threatened by imported foreign values; if not closely monitored, these foreign influences can do more damage than good. This paper aims to study elements of the Upin and Ipin animation series and to investigate how the series could help educate local children in good morals and behaviour without being too serious or sententious. Upin and Ipin was chosen as a case for investigating the effectiveness of animation as a communication medium for promoting positive values among pre-school children. A purposive sampling method was employed to determine the sample; hence, pre-school children from the Putrajaya Presint 9(2) school were chosen to take part in this study. The findings of this study offer positive suggestions on how animation programmes shown on TV can play significant roles in children's social development and inculcate good moral behaviour as well as social skills among children and the people around them.

Keywords: animation characters, children informal education, foreign influences, moral values

Procedia PDF Downloads 163
2791 A Graph Library Development Based on the Service-Oriented Architecture: Used for Representation of the Biological Systems in the Computer Algorithms

Authors: Mehrshad Khosraviani, Sepehr Najjarpour

Abstract:

Considering the use of graph-based approaches in systems and synthetic biology, and the various types of graphs they employ, a comprehensive graph library based on the three-tier architecture (3TA) was previously introduced for full representation of biological systems. Despite having proposed a 3TA-based graph library, the following three reasons motivated us to redesign it on the basis of the service-oriented architecture (SOA): (1) Maintaining the accuracy of the data related to an input graph (including its edges, vertices, topology, etc.) without involving the end user: with 3TA, the library files are available to the end users and may be used incorrectly, and consequently invalid graph data can be fed to the computer algorithms. With the SOA, by contrast, the graph registration operation is specified as a service through encapsulation of the library files; all the control operations needed to register valid data become the responsibility of the services. (2) Partitioning the library product into different parts: with 3TA, the library was provided as a whole, whereas here the product can be divided into smaller ones, such as an AND/OR graph drawing service, each provided individually. As a result, the end user can select any part of the library product, instead of all features, to add to a project. (3) Reducing complexity: with 3TA, several additional libraries had to be added to connect to the database, whereas in the SOA-based graph library, the provision of the needed library resources is entrusted to the services themselves. The end user of the graph library is therefore not exposed to its complexity.
In the end, in order to make the library easier to control within the system and to restrict the end user from accessing the files, the service-oriented architecture (SOA) was preferred over the three-tier architecture (3TA), and the previously proposed graph library was redeveloped on that basis.
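Reason (1) above, letting the service rather than the end user guarantee that only valid graph data reach the algorithms, can be sketched as a small encapsulated registration service. The class and the biological vertex names are invented for illustration and are not the library's actual API:

```python
class GraphRegistrationService:
    """Sketch of the 'graph registration' operation exposed as a service:
    the service, not the end user, validates edges against registered
    vertices before the graph is handed to downstream algorithms."""

    def __init__(self):
        self._vertices = set()
        self._edges = set()

    def register_vertex(self, v):
        self._vertices.add(v)

    def register_edge(self, u, v):
        # Control operation: reject edges that reference unknown vertices,
        # so invalid graph data never reaches the computer algorithms.
        if u not in self._vertices or v not in self._vertices:
            raise ValueError(f"edge ({u}, {v}) refers to an unregistered vertex")
        self._edges.add((u, v))

    def topology(self):
        # Read-only view; the internal library files stay encapsulated.
        return frozenset(self._vertices), frozenset(self._edges)

svc = GraphRegistrationService()
for v in ("gene_A", "gene_B", "promoter_P"):    # hypothetical biological nodes
    svc.register_vertex(v)
svc.register_edge("promoter_P", "gene_A")
try:
    svc.register_edge("gene_A", "unknown")       # rejected by the service
except ValueError as e:
    print("rejected:", e)
print(svc.topology())
```

In the SOA setting, each such operation would be published as a separately consumable service (as with the AND/OR graph drawing service mentioned above), rather than as methods of one local class.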

Keywords: bio-design automation, biological system, graph library, service-oriented architecture, systems and synthetic biology

Procedia PDF Downloads 295
2790 Geoeducation Strategies for Teaching Natural Hazards in Schools

Authors: Carlos Alberto Ríos Reyes, Andrés Felipe Mejía Durán, Oscar Mauricio Castellanos Alarcón

Abstract:

It is undoubtedly of great importance to make known that planet Earth is an entity in constant change and transformation; processes such as construction and destruction are part of the evolution of the territory. Geoeducation workshops represent a significant contribution to the search for educational projects focused on teaching relevant geoscience topics, making natural threats known in schools through recreational and didactic activities. This initiative represents an educational alternative that must be developed with the participation of primary and secondary schools, universities, and local communities. The methodology is based on several phases, which include: a diagnosis to identify the best method for teaching basic concepts and to establish a starting point for the topics to be taught, as well as to identify areas and concepts that need to be reinforced and/or deepened; the design of activities that involve all students regardless of their ability or level; the use of accessible materials and experimentation to support clear and concise explanations for all students; the adaptation of the teaching-learning process to individual needs; sensitization about natural threats; and evaluation and feedback. The project is expected to offer a series of activities and materials as a significant contribution to the search for educational projects focused on teaching relevant geoscientific topics such as the natural threats associated with earthquakes, volcanic eruptions, floods, landslides, etc. The major findings of this study are the pedagogical strategies that primary and secondary school teachers can appropriate to face the challenge of transferring geological knowledge and advising decision-makers and citizens on the importance of the geosciences for daily life. We conclude that knowledge of the natural threats to our planet is very important for helping to mitigate their risk.

Keywords: workshops, geoeducation, curriculum, geosciences, natural threats

Procedia PDF Downloads 55
2789 A National Systematic Review on Determining Prevalence of Mobbing Exposure in Turkish Nurses

Authors: Betül Sönmez, Aytolan Yıldırım

Abstract:

Objective: This systematic review aims to methodically analyze studies regarding the prevalence of mobbing behavior, the individuals performing this behavior, and the effects of mobbing on Turkish nurses. Background: Worldwide reports of mobbing cases have increased in past years, a trend also observable in Turkey. It has been demonstrated that among healthcare workers, mobbing is significantly widespread in nurses, and the number of studies carried out in this regard has also increased. Method: The main criterion for choosing articles in this systematic review was nurses located in Turkey, regardless of any specific date. In November 2014, a search using the keywords 'mobbing, bullying, psychological terror/violence, emotional violence, nurses, healthcare workers, Turkey' in PubMed, ScienceDirect, EBSCOhost, the National Thesis Centre database, and the Google search engine led to 71 studies in this field; 33 studies did not meet the inclusion criteria specified for this study. Results: The findings were obtained from the results of 38 studies carried out over the past 13 years in Turkey, on a large sample consisting of 8,877 nurses. Analysis of the incidence of mobbing behavior revealed a broad spectrum, ranging from no or slight experience to 100% exposure. The most frequently observed mobbing behaviors include attacks on personality, blocking of communication, and attacks on professional and social reputation. Victims mostly experienced mobbing from their managers, and the most common consequences of these actions were psychological. Conclusions: The results of studies with various scales indicate exposure of nurses to similar mobbing behavior. The high frequency of exposure to mobbing behavior in such a large sample highlights the importance of considering this issue in terms of the individual and institutional consequences that adversely affect the performance of nurses.

Keywords: mobbing, bullying, workplace violence, nurses, Turkey

Procedia PDF Downloads 261
2788 Machine Learning in Patent Law: How Genetic Breeding Algorithms Challenge Modern Patent Law Regimes

Authors: Stefan Papastefanou

Abstract:

Artificial intelligence (AI) is an interdisciplinary field of computer science with the aim of creating intelligent machine behavior. Early approaches to AI were configured to operate in very constrained environments where the behavior of the AI system was determined in advance by formal rules. Knowledge was presented as a set of rules that allowed the AI system to determine the results for specific problems: a structure of if-else rules that could be traversed to find a solution to a particular problem or question. However, such rule-based systems have typically not been able to generalize beyond the knowledge provided. All over the world, and especially in IT-heavy jurisdictions such as the United States, the European Union, Singapore, and China, machine learning has developed into an immense asset, and its applications are becoming more and more significant. It has to be examined how the products of machine learning models can and should be protected by IP law, and for the purpose of this paper by patent law specifically, since it is the IP law regime closest to technical inventions and computing methods in technical applications. Genetic breeding models are currently less popular than recursive neural networks and deep learning, but this approach can be more easily described by reference to the evolution of natural organisms, and with increasing computational power, the genetic breeding method, as a subset of the evolutionary algorithm models, is expected to regain popularity. The research method focuses on the patentability (according to the world's most significant patent law regimes, such as those of China, Singapore, the European Union, and the United States) of AI inventions and machine learning. Questions of the technical nature of the problem to be solved, of the inventive step as such, and of the state of the art and the associated obviousness of the solution arise in current patenting processes.
Most importantly, the key focus of this paper is the problem of patenting inventions that are themselves developed through machine learning. Under the current legal situation in most patent law regimes, the inventor of a patent application must be a natural person or a group of persons. In order to be considered an 'inventor', a person must actually have developed part of the inventive concept. The mere application of machine learning or an AI algorithm to a particular problem should not be construed as the algorithm contributing a part of the inventive concept. However, when machine learning or the AI algorithm has contributed to a part of the inventive concept, there is currently a lack of clarity regarding the ownership of artificially created inventions. Since not only all European patent law regimes but also the Chinese and Singaporean patent law approaches use identical terms, this paper ultimately offers a comparative analysis of the most relevant patent law regimes.
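To make concrete what a 'genetic breeding' model is, the sketch below evolves bitstrings toward a trivial target (the classic OneMax problem) with selection, crossover, and mutation. It is a generic illustration of evolutionary algorithms, not a model from the paper, and all parameters are invented:

```python
import random

random.seed(1)

# Minimal genetic-breeding sketch: evolve bitstrings toward all ones.
# Fitness = number of ones (the classic OneMax toy problem).
GENOME_LEN, POP_SIZE, GENERATIONS = 20, 30, 60

def fitness(g):
    return sum(g)

def crossover(a, b):
    """Single-point crossover of two parent genomes."""
    cut = random.randrange(1, GENOME_LEN)
    return a[:cut] + b[cut:]

def mutate(g, rate=0.02):
    """Flip each bit independently with a small probability."""
    return [bit ^ (random.random() < rate) for bit in g]

pop = [[random.randint(0, 1) for _ in range(GENOME_LEN)]
       for _ in range(POP_SIZE)]
for _ in range(GENERATIONS):
    pop.sort(key=fitness, reverse=True)
    parents = pop[: POP_SIZE // 2]          # selection: keep the fitter half
    children = [mutate(crossover(random.choice(parents),
                                 random.choice(parents)))
                for _ in range(POP_SIZE - len(parents))]
    pop = parents + children

best = max(pop, key=fitness)
print(fitness(best))
```

The legal point is visible even in this toy: the solution emerges from selection pressure rather than from a natural person specifying it, which is precisely where the inventorship question the paper examines arises.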

Keywords: algorithms, inventor, genetic breeding models, machine learning, patentability

Procedia PDF Downloads 99
2787 A Systematic Review on Prevalence, Serotypes and Antibiotic Resistance of Salmonella in Ethiopia

Authors: Atsebaha Gebrekidan Kahsay, Tsehaye Asmelash, Enquebaher Kassaye

Abstract:

Background: Salmonella remains a global public health problem, with a significant burden in sub-Saharan African countries. S. Typhi and S. Paratyphi are the human-restricted causes of typhoid and paratyphoid fever, whereas S. Enteritidis and S. Typhimurium are the causative agents of invasive nontyphoidal disease among humans, with animals as their reservoir. The antibiotic resistance of Salmonella is another public health threat around the globe. To provide full information about human and animal salmonellosis, we conducted a systematic review of the prevalence, serotypes, and antibiotic resistance of Salmonella in Ethiopia. Methods: This systematic review used the Google Scholar and PubMed search engines to find articles from Ethiopia that were published in English in peer-reviewed international journals from 2010 to 2022. We used keywords to identify the intended research articles and a Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) checklist to apply the inclusion and exclusion criteria. Frequencies and percentages were analyzed using Microsoft Excel. Results: Two hundred seven published articles were retrieved, and 43 were selected for the systematic review: human (28) and animal (15). The prevalence of Salmonella in humans and animals was 434 (5.2%) and 641 (10.1%), respectively. Fourteen serotypes were identified from animals, with S. Typhimurium among the top five. Among the ciprofloxacin-resistant isolates in human studies, the highest proportion reported was 16.7%, whereas for ceftriaxone, resistance of up to 100% was reported. Conclusions: The prevalence of Salmonella among diarrheic patients and food handlers (5.2%) was lower than the prevalence in food animals (10.1%). We did not find serotypes of Salmonella reported in the human studies, although fourteen serotypes were reported in the food-animal studies, with S. Typhimurium among the top five. Salmonella species from some human studies revealed non-susceptibility to ceftriaxone.
We recommend further study of invasive nontyphoidal Salmonella and its predisposing factors among humans and animals in Ethiopia.

Keywords: antibiotic resistance, prevalence, systematic review, serotypes, Salmonella, Ethiopia

Procedia PDF Downloads 64
2786 Predicting Football Player Performance: Integrating Data Visualization and Machine Learning

Authors: Saahith M. S., Sivakami R.

Abstract:

In the realm of football analytics, particularly focusing on predicting football player performance, the ability to forecast player success accurately is of paramount importance for teams, managers, and fans. This study introduces an elaborate examination of predicting football player performance through the integration of data visualization methods and machine learning algorithms. The research entails the compilation of an extensive dataset comprising player attributes, conducting data preprocessing, feature selection, model selection, and model training to construct predictive models. The analysis within this study will involve delving into feature significance using methodologies like Select Best and Recursive Feature Elimination (RFE) to pinpoint pertinent attributes for predicting player performance. Various machine learning algorithms, including Random Forest, Decision Tree, Linear Regression, Support Vector Regression (SVR), and Artificial Neural Networks (ANN), will be explored to develop predictive models. The evaluation of each model's performance utilizing metrics such as Mean Squared Error (MSE) and R-squared will be executed to gauge their efficacy in predicting player performance. Furthermore, this investigation will encompass a top player analysis to recognize the top-performing players based on the anticipated overall performance scores. Nationality analysis will entail scrutinizing the player distribution based on nationality and investigating potential correlations between nationality and player performance. Positional analysis will concentrate on examining the player distribution across various positions and assessing the average performance of players in each position. Age analysis will evaluate the influence of age on player performance and identify any discernible trends or patterns associated with player age groups. 
The primary objective is to predict a football player's overall performance accurately based on their individual attributes, leveraging data-driven insights to enrich the comprehension of player success on the field. By amalgamating data visualization and machine learning methodologies, the aim is to furnish valuable tools for teams, managers, and fans to effectively analyze and forecast player performance. This research contributes to the progression of sports analytics by showcasing the potential of machine learning in predicting football player performance and offering actionable insights for diverse stakeholders in the football industry.
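As a minimal illustration of the evaluation metrics named above, the sketch below fits an ordinary least squares line to invented player attributes and computes MSE and R². It is a single-feature stand-in for the study's models (which include Random Forest, SVR, and ANNs); all data are fabricated for the example:

```python
# Toy sketch: predict an "overall" score from a skill attribute with
# ordinary least squares, then report MSE and R-squared,
# mirroring the evaluation metrics used in the study. Data are invented.
ages =    [21, 24, 27, 30, 33]
skill =   [70, 75, 82, 80, 76]
overall = [68, 74, 83, 81, 75]

n = len(overall)

def mean(xs):
    return sum(xs) / len(xs)

# Fit overall ≈ b0 + b1 * skill (single-feature OLS in closed form)
sx, sy = mean(skill), mean(overall)
b1 = sum((x - sx) * (y - sy) for x, y in zip(skill, overall)) / \
     sum((x - sx) ** 2 for x in skill)
b0 = sy - b1 * sx

pred = [b0 + b1 * x for x in skill]
mse = sum((y - p) ** 2 for y, p in zip(overall, pred)) / n
ss_res = sum((y - p) ** 2 for y, p in zip(overall, pred))
ss_tot = sum((y - sy) ** 2 for y in overall)
r2 = 1 - ss_res / ss_tot
print(round(b1, 3), round(mse, 3), round(r2, 3))
```

A full pipeline would add the feature-selection step (e.g., recursive feature elimination over many attributes) before fitting, and compare these metrics across the candidate models.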

Keywords: football analytics, player performance prediction, data visualization, machine learning algorithms, random forest, decision tree, linear regression, support vector regression, artificial neural networks, model evaluation, top player analysis, nationality analysis, positional analysis

Procedia PDF Downloads 24
2785 The Location-Routing Problem with Pickup Facilities and Heterogeneous Demand: Formulation and Heuristics Approach

Authors: Mao Zhaofang, Xu Yida, Fang Kan, Fu Enyuan, Zhao Zhao

Abstract:

Nowadays, last-mile distribution plays an increasingly important role in the whole industrial chain delivery link and accounts for a large proportion of the whole distribution process cost. Promoting the upgrading of logistics networks and improving the layout of final distribution points has become one of the trends in the development of modern logistics. Due to the discrete and heterogeneous needs and spatial distribution of customer demand, which will lead to a higher delivery failure rate and lower vehicle utilization, last-mile delivery has become a time-consuming and uncertain process. As a result, courier companies have introduced a range of innovative parcel storage facilities, including pick-up points and lockers. The introduction of pick-up points and lockers has not only improved the users’ experience but has also helped logistics and courier companies achieve large-scale economy. Against the backdrop of the COVID-19 of the previous period, contactless delivery has become a new hotspot, which has also created new opportunities for the development of collection services. Therefore, a key issue for logistics companies is how to design/redesign their last-mile distribution network systems to create integrated logistics and distribution networks that consider pick-up points and lockers. This paper focuses on the introduction of self-pickup facilities in new logistics and distribution scenarios and the heterogeneous demands of customers. In this paper, we consider two types of demand, including ordinary products and refrigerated products, as well as corresponding transportation vehicles. We consider the constraints associated with self-pickup points and lockers and then address the location-routing problem with self-pickup facilities and heterogeneous demands (LRP-PFHD). 
To solve this challenging problem, we propose a mixed-integer linear programming (MILP) model that minimizes the total cost, comprising the facility opening cost, the variable transport cost, and the fixed transport cost. Given the NP-hardness of the problem, we propose a hybrid adaptive large-neighbourhood search algorithm to solve LRP-PFHD. We evaluate the effectiveness and efficiency of the proposed algorithm on instances generated from benchmark instances. The results demonstrate that the hybrid adaptive large-neighbourhood search algorithm is more efficient than MILP solvers such as Gurobi for LRP-PFHD, especially on large-scale instances. In addition, we conduct a comprehensive analysis of important parameters (e.g., facility opening cost and transportation cost) to explore their impact on the results, and we suggest helpful managerial insights for courier companies.
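The destroy-and-repair loop at the core of an adaptive large-neighbourhood search can be sketched as follows. This is a minimal illustrative skeleton, not the paper's hybrid algorithm: the function names, the roulette-wheel operator selection, and the exponential weight-update rule are all assumptions.

```python
import random

def alns_minimize(initial, cost, destroy_ops, repair_ops,
                  iters=200, reaction=0.2, seed=0):
    """Adaptive large-neighbourhood search skeleton (minimization)."""
    rng = random.Random(seed)
    best = current = initial
    weights = {op: 1.0 for op in destroy_ops}
    for _ in range(iters):
        # Roulette-wheel selection of a destroy operator by adaptive weight.
        op = rng.choices(list(weights), weights=list(weights.values()))[0]
        partial = op(current, rng)
        candidate = rng.choice(repair_ops)(partial, rng)
        score = 0.0
        if cost(candidate) < cost(current):      # accept improving moves
            current, score = candidate, 1.0
            if cost(candidate) < cost(best):
                best, score = candidate, 2.0     # reward a new global best
        # Exponential smoothing of the operator weight (the "adaptive" part).
        weights[op] = (1 - reaction) * weights[op] + reaction * score
    return best
```

In the paper's setting, destroy operators would remove customers or close pickup facilities, and repair operators would reinsert them at least-cost positions while respecting locker capacity and refrigeration constraints.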

Keywords: city logistics, last-mile delivery, location-routing, adaptive large neighborhood search

Procedia PDF Downloads 62
2784 A Critical Review and Bibliometric Analysis on Measures of Achievement Motivation

Authors: Kanupriya Rawat, Aleksandra Błachnio, Paweł Izdebski

Abstract:

Achievement motivation, which drives a person to strive for success, is an important construct in sports psychology. This systematic review analyzes the methods of measuring achievement motivation used in studies published over the past four decades and seeks to determine which measure is the most prevalent and the most effective, by thoroughly examining the measures used in each study and by evaluating the most highly cited achievement motivation measures in sport. Because achievement motivation is a latent construct, thorough measurement is necessary, and a critical evaluation of measurement tools is therefore required. The literature search was conducted in the following databases: EBSCO, MEDLINE, APA PsycARTICLES, Academic Search Ultimate, Open Dissertations, ERIC, ScienceDirect, Web of Science, and Wiley Online Library. A total of 26 articles met the inclusion criteria and were selected. The review found that the Achievement Goal Questionnaire-Sport (AGQ-Sport) and the Task and Ego Orientation in Sport Questionnaire (TEOSQ) were used in most of the research; the AGQ-Sport has the second-highest average weighted impact factor and is the most relevant to research articles in the sport psychology discipline. The TEOSQ is highly popular in cross-cultural adaptation but has the second-lowest average impact factor among the scales, owing to the lower impact factors of most of the journals in which it appears. All measures of achievement motivation have Cronbach's alpha values above .70, which is acceptable. The advantages and limitations of each measurement tool are discussed, and the distinction between implicit and explicit measures of achievement motivation is explained.
Overall, implicit and explicit measures rest on different conceptualizations of achievement motivation and are applicable at either the contextual or the situational level. Although the measures differ in their development, reliability, and use, conceptualization and degree of applicability are perhaps the most crucial factors for researchers choosing a questionnaire.

Keywords: achievement motivation, task and ego orientation, sports psychology, measures of achievement motivation

Procedia PDF Downloads 81
2783 Bioinformatics Identification of Rare Codon Clusters in Proteins Structure of HBV

Authors: Abdorrasoul Malekpour, Mohammad Ghorbani, Mojtaba Mortazavi, Mohammadreza Fattahi, Mohammad Hassan Meshkibaf, Ali Fakhrzad, Saeid Salehi, Saeideh Zahedi, Amir Ahmadimoghaddam, Parviz Farzadnia, Mohammadreza Hajyani Asl

Abstract:

Hepatitis B, an infectious disease, has eight main genotypes (A-H). The aim of this study is to bioinformatically identify rare codon clusters (RCCs) in the protein structures of HBV. To detect the protein family accession numbers (Pfam) of HBV proteins, the UniProt database and the Pfam search tool were used. The Pfam IDs obtained were analyzed in the Sherlocc program, and RCCs in HBV proteins were detected. Furthermore, the structures of the TrEMBL-entry proteins were studied in the PDB database, and the 3D structures of the HBV proteins and the locations of the RCCs were visualized and examined using the Swiss PDB Viewer software. The Pfam search tool found nine significant hits and no insignificant hits in three frames. The Sherlocc results for the Pfams studied show that the program did not identify RCCs in the external core antigen (PF08290) or the truncated HBeAg protein (PF08290). By contrast, RCCs were identified in the hepatitis core antigen (PF00906), large envelope protein S (PF00695), X protein (PF00739), the DNA polymerase (viral) N-terminal domain (PF00242), and protein P (PF00336). In the HBV genome, seven RCCs were identified, located in the hepatitis core antigen, large envelope protein S, and DNA polymerase proteins; the protein structures of the TrEMBL-entry sequences reported in the Sherlocc outputs are not complete. Based on the locations of the RCCs in the structures of the HBV proteins, we suggest that these RCCs are important in the HBV life cycle. We hope that this study provides a new and deep perspective for protein research and for drug design in the treatment of HBV.
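As a toy illustration of the kind of scan such tools perform, a rare codon cluster can be flagged wherever a sliding window over the coding sequence is dense in rare codons. The window size, threshold, and rare-codon set below are hypothetical stand-ins, not Sherlocc's actual parameters.

```python
def rare_codon_clusters(seq, rare, window=5, min_rare=3):
    """Flag windows of a coding sequence that are dense in rare codons.

    `seq` is an in-frame DNA string, `rare` a set of rare codons; a cluster
    is reported at every codon index where `window` consecutive codons
    contain at least `min_rare` rare ones (all thresholds illustrative).
    """
    # Split the sequence into in-frame codons.
    codons = [seq[i:i + 3] for i in range(0, len(seq) - 2, 3)]
    flags = [c in rare for c in codons]
    clusters = []
    for i in range(len(codons) - window + 1):
        if sum(flags[i:i + window]) >= min_rare:
            clusters.append(i)  # codon index where a dense window starts
    return clusters
```

A real analysis would derive the rare-codon set from host codon-usage tables rather than hard-coding it.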

Keywords: rare codon clusters, hepatitis B virus, bioinformatic study, infectious disease

Procedia PDF Downloads 470
2782 Experimenting the Influence of Input Modality on Involvement Load Hypothesis

Authors: Mohammad Hassanzadeh

Abstract:

As far as incidental vocabulary learning is concerned, the basic contention of the Involvement Load Hypothesis (ILH) is that retention of unfamiliar words is generally conditional upon the degree of involvement in processing them. This study examined input modality and incidental vocabulary uptake in a task-induced setting in which three variously loaded task types (marginal glosses, fill-in task, and sentence writing) were alternately assigned to one group of students at Allameh Tabataba'i University (n=21) during six classroom sessions. One round of exposure comprised audiovisual material (TV talk shows), while the second round consisted of textual material with approximately similar subject matter (reading texts). In both conditions, however, the tasks were equivalent to one another. The study thus pursued two objectives: first, to establish a litmus test for the ILH and its proposed components of 'need', 'search', and 'evaluation'; and second, to shed light on whether audiovisual input is superior to written input when tasks are incorporated. At the end of each treatment session, an active-recall vocabulary test was administered to measure incidental gains. A one-way analysis of variance revealed that the audiovisual intervention yielded higher gains than the written version even when differing tasks were included. Meanwhile, the third task (sentence writing) turned out to be the most efficient in tapping learners' active recall of the target vocabulary items. In addition to shedding light on the superiority of audiovisual input over written input when other circumstances are held relatively constant, this study, for the most part, supported the underlying tenets of the ILH.

Keywords: evaluation, incidental vocabulary learning, input mode, involvement load hypothesis, need, search

Procedia PDF Downloads 262
2781 An Energy-Balanced Clustering Method on Wireless Sensor Networks

Authors: Yu-Ting Tsai, Chiun-Chieh Hsu, Yu-Chun Chu

Abstract:

In recent years, owing to the development of wireless network technology, many researchers have devoted themselves to the study of wireless sensor networks. Applications of wireless sensor networks mainly use sensor nodes to collect required information and send it back to the users. Since the sensed area is often difficult to reach, there are many restrictions on the design of sensor nodes, the most important being their limited energy. Because of this limitation, researchers have proposed a number of ways to reduce energy consumption and balance the load of sensor nodes in order to increase network lifetime. In this paper, we propose the Energy-Balanced Clustering method with Auxiliary Members on Wireless Sensor Networks (EBCAM), based on cluster routing. The main purpose is to balance energy consumption over the sensed area and to even out the distribution of dead nodes, thereby avoiding the excessive energy consumption caused by increasing transmission distance. In addition, we use the residual energy and average energy consumption of the nodes within a cluster to choose the cluster heads, use multi-hop transmission to deliver the data, and dynamically adjust the transmission radius according to load conditions. We also use auxiliary cluster members to change the delivery path according to the residual energy of the cluster head, in order to relieve its load. Finally, we compare the proposed method with related algorithms in simulated experiments and analyze the results, which reveal that the proposed method outperforms the other algorithms in the number of rounds sustained and in average energy consumption.
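A cluster-head selection rule of the kind described, combining residual energy with average consumption, can be sketched as follows. This is an illustrative guess at one plausible rule, not the EBCAM algorithm itself, and the node field names are invented for the example.

```python
def choose_cluster_head(cluster):
    """Pick a cluster head from a list of node dicts.

    Hypothetical rule inspired by the abstract: nodes whose residual
    energy is at least the cluster average are eligible, and among those
    the node with the lowest average per-round consumption is chosen.
    """
    avg_residual = sum(n["residual"] for n in cluster) / len(cluster)
    eligible = [n for n in cluster if n["residual"] >= avg_residual]
    # Fall back to the whole cluster if rounding leaves nobody eligible.
    eligible = eligible or cluster
    return min(eligible, key=lambda n: n["avg_consumption"])
```

In a full protocol this choice would be re-run each round, so leadership rotates away from nodes whose residual energy has dropped below the cluster average.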

Keywords: auxiliary nodes, cluster, load balance, routing algorithm, wireless sensor network

Procedia PDF Downloads 264
2780 An Explanatory Study Approach Using Artificial Intelligence to Forecast Solar Energy Outcome

Authors: Agada N. Ihuoma, Nagata Yasunori

Abstract:

Artificial intelligence (AI) techniques play a crucial role in predicting the expected energy outcome and in the performance analysis, modeling, and control of renewable energy. Renewable energy is becoming more popular for economic and environmental reasons. In the face of growing global energy consumption and the continued depletion of most fossil fuels, the world faces the challenge of meeting ever-increasing energy demands. Incorporating artificial intelligence to predict solar radiation outcomes from intermittent sunlight is therefore crucial to balancing the supply of and demand for energy on loads, predicting the performance and outcome of solar energy, enhancing production planning and energy management, and ensuring proper sizing of parameters when generating clean energy. However, one of the major problems of forecasting lies in the algorithms used to control, model, and predict the performance of energy systems, which are complicated and involve large computing power, differential equations, and time series. In addition, unreliable (poor-quality) solar radiation data for a geographical location, or series that are too short, can be a bottleneck. To overcome these problems, this study employs Anaconda Navigator (Jupyter Notebook) for machine learning, which can combine large amounts of data with fast, iterative processing and intelligent algorithms, allowing the software to learn automatically from patterns or features to predict the performance and outcome of solar energy. This in turn enables the balancing of supply and demand on loads as well as enhanced production planning and energy management.
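The workflow the keywords point to, linear regression with backward elimination, can be sketched in plain Python as follows. This is an illustrative stand-in for what such a notebook would typically do with scikit-learn or statsmodels: the pure-Python normal-equations solver and the SSE-increase elimination rule are assumptions, not the study's actual procedure.

```python
def ols_fit(X, y):
    """Least-squares coefficients via the normal equations (X'X)b = X'y,
    solved by Gaussian elimination with partial pivoting (stdlib only)."""
    k = len(X[0])
    # Build the augmented system [X'X | X'y].
    A = [[sum(X[r][i] * X[r][j] for r in range(len(X))) for j in range(k)]
         + [sum(X[r][i] * y[r] for r in range(len(X)))] for i in range(k)]
    for col in range(k):                      # forward elimination
        piv = max(range(col, k), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        for r in range(col + 1, k):
            f = A[r][col] / A[col][col]
            A[r] = [a - f * b for a, b in zip(A[r], A[col])]
    beta = [0.0] * k
    for i in reversed(range(k)):              # back substitution
        beta[i] = (A[i][k] - sum(A[i][j] * beta[j]
                                 for j in range(i + 1, k))) / A[i][i]
    return beta

def sse(X, y, beta):
    """Sum of squared residuals for a fitted model."""
    return sum((yi - sum(b * xi for b, xi in zip(beta, row))) ** 2
               for row, yi in zip(X, y))

def backward_eliminate(X, y, tol=1e-6):
    """Repeatedly drop the feature whose removal raises SSE the least,
    stopping once every removal would raise SSE by more than `tol`."""
    cols = list(range(len(X[0])))
    while len(cols) > 1:
        Xc = [[r[c] for c in cols] for r in X]
        base = sse(Xc, y, ols_fit(Xc, y))
        trials = []
        for c in cols:
            kept = [k2 for k2 in cols if k2 != c]
            Xr = [[r[k2] for k2 in kept] for r in X]
            trials.append((sse(Xr, y, ols_fit(Xr, y)) - base, c))
        increase, victim = min(trials)
        if increase > tol:
            break
        cols.remove(victim)
    return cols
```

In practice one would eliminate by p-value or an information criterion rather than a raw SSE tolerance, but the loop structure is the same.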

Keywords: artificial Intelligence, backward elimination, linear regression, solar energy

Procedia PDF Downloads 149
2779 Multiscale Hub: An Open-Source Framework for Practical Atomistic-To-Continuum Coupling

Authors: Masoud Safdari, Jacob Fish

Abstract:

Despite the vast amount of existing theoretical knowledge, the implementation of a universal multiscale modeling, analysis, and simulation software framework remains challenging. Existing multiscale software and solutions are often domain-specific and closed-source, and they demand a high level of experience and skill in both multiscale analysis and programming. Furthermore, tools currently available for atomistic-to-continuum (AtC) multiscaling are developed under assumptions such as users' access to high-performance computing facilities. These issues, along with many other challenges, have reduced the adoption of multiscale methods in academia and especially in industry. In the current work, we introduce Multiscale Hub (MsHub), an effort towards making AtC more accessible through cloud services. As a joint effort between academia and industry, MsHub provides a universal web-enabled framework for practical multiscaling. Developed on top of the widely used scientific programming language Python, the package currently provides an open-source, comprehensive, easy-to-use framework for AtC coupling. MsHub offers an easy-to-use interface to prominent molecular dynamics and multiphysics continuum mechanics packages such as LAMMPS and MFEM (a free, lightweight, scalable C++ library for finite element methods). In this work, we first report on the design philosophy of MsHub and the challenges and issues faced in its implementation. MsHub takes advantage of a comprehensive set of tools and algorithms developed for AtC that can be used for a variety of governing physics. We then briefly report key AtC algorithms implemented in MsHub. Finally, we conclude with a few examples illustrating the capabilities of the package and its future directions.

Keywords: atomistic, continuum, coupling, multiscale

Procedia PDF Downloads 163
2778 Artificial Law: Legal AI Systems and the Need to Satisfy Principles of Justice, Equality and the Protection of Human Rights

Authors: Begum Koru, Isik Aybay, Demet Celik Ulusoy

Abstract:

The discipline of law is quite complex and has its own terminology. Apart from written legal rules, there is also living law, which refers to legal practice. Basic legal rules aim at the happiness of individuals in social life and have different characteristics in different branches, such as public or private law. On the other hand, law is a national phenomenon: the law of one nation and the legal system applied on the territory of another may be completely different, and people who are experts in a particular field of law in one country may have insufficient expertise in the law of another. Today, in addition to the local nature of law, international and even supranational legal rules are applied in order to protect basic human values and ensure the protection of human rights around the world. Systems that offer algorithmic solutions to legal problems using artificial intelligence (AI) tools may well produce very meaningful results in terms of human rights. However, the algorithms should not be developed by computer experts alone; they also need the contribution of people who are familiar with the law, values, judicial decisions, and even the social and political culture of the society for which they will provide solutions. Otherwise, even if an algorithm works perfectly, it may not be compatible with the values of the society in which it is applied. The latest developments involving the use of AI techniques in legal systems indicate that artificial law will emerge as a new field within the discipline of law. AI systems are already being applied in the field of law, with examples such as prediction of judicial decisions, text summarization, decision support systems, and classification of documents. Algorithms for legal systems employing AI tools, especially for predicting judicial decisions and for decision support, have the capacity to produce automatic decisions in place of judges.
When the judge is removed from this equation, law made autonomously by an intelligent algorithm emerges, whether the domain is national or international law. The aim of this work is to make a general analysis of this new topic. Such an analysis needs both a literature survey and perspectives from computer experts and lawyers. In some societies, the use of prediction or decision support systems may be useful for integrating international human rights safeguards. In this case, artificial law can serve to produce more comprehensive and human rights-protective results than written or living law. In non-democratic countries, it may even be argued that direct decisions and artificial-intelligence-made law would be more protective than a mere decision 'support' system. Since the values of law are directed towards human happiness and well-being, AI algorithms should always be capable of serving this purpose and should be based on the rule of law, the principles of justice and equality, and the protection of human rights.

Keywords: AI and law, artificial law, protection of human rights, AI tools for legal systems

Procedia PDF Downloads 59
2777 Study of the Biochemical Properties of a Milk-Clotting Protease Extracted from Sunflower Cake: Manufacturing Trial of Uncooked Pressed-Dough Cheeses and Analysis of Sensory Properties

Authors: Kahlouche Amal, Touzene F. Zohra, Betatache Fatiha, Nouani Abdelouahab

Abstract:

The growth of world cheese production in recent decades, together with producers' demand for inexpensive coagulants, has intensified the search for substitutes for rennet. This invites exploration of plant biodiversity, an inexpensive source of many natural metabolites praised by scientists today (thistle, fig-tree latex, cardoon, melon seeds). Indeed, substitutes of vegetable origin have attracted considerable interest. The objective of this study is to demonstrate the possibility of extracting a milk-clotting protease from sunflower cake, an available raw material and a potential source of rennet substitutes. To this end, the proteolytic activity of the crude extracts was determined, the enzymatic preparations were purified and freed of their coloring pigments, and the coagulation properties were characterized through the study of several factors (temperature, pH, CaCl2 concentration), all of which contribute to adding value to milk, particularly the milk produced by the small ruminants of Algerian dairy farms. Coagulant extracts of vegetable origin have already made it possible to develop traditional cheeses, of which the Iberian Peninsula is the main promoter. Here, a manufacturing trial of an uncooked pressed-dough cheese was carried out at semi-pilot scale using the enzymatic extract of sunflower (Helianthus annuus), which gave satisfactory results both in terms of yield and at the sensory level; statistically, no significant difference was found between the cheeses studied. These results confirm the possibility of using this coagulant as a substitute for commercial rennet on an industrial scale.

Keywords: characterization, cheese, rennet, sunflower

Procedia PDF Downloads 336
2776 Sociocultural Foundations of Psychological Well-Being among Ethiopian Adults

Authors: Kassahun Tilahun

Abstract:

Most of the available studies on adult psychological well-being have centered on Western countries. However, psychological well-being does not have the same meaning across the world: Euro-American and African conceptions and experiences of psychological well-being differ systematically. As a result, questions such as how people living in developing African countries like Ethiopia report their psychological well-being, and what the context-specific prominent determinants of their psychological well-being might be, need definitive answers. This study was therefore aimed at developing a new theory that would address these sociocultural dimensions of psychological well-being. Data were obtained through interviews and an open-ended questionnaire. A total of 438 adults working in governmental and non-governmental organizations situated in Addis Ababa participated in the study. An appropriate qualitative method of data analysis, thematic content analysis, was employed. The thematic analysis involved a type of abductive analysis, driven both by theoretical interest and by the nature of the data. Reliability and credibility issues were addressed appropriately. The findings identified five major categories of themes viewed as essential in determining the conceptions and experiences of psychological well-being of Ethiopian adults: sociocultural harmony, social cohesion, security, competence and accomplishment, and the self. The rationale for including these themes is discussed in detail, and appropriate positive psychology interventions are proposed. Researchers are also encouraged to expand this qualitative research and, in turn, develop a suitable instrument tapping the psychological well-being of adults with different sociocultural orientations.

Keywords: sociocultural, psychological well-being, Ethiopia, adults

Procedia PDF Downloads 536
2775 Signs, Signals and Syndromes: Algorithmic Surveillance and Global Health Security in the 21st Century

Authors: Stephen L. Roberts

Abstract:

This article offers a critical analysis of the rise of syndromic surveillance systems for the advanced detection of pandemic threats within contemporary global health security frameworks. The article traces the iterative evolution and ascendancy of three such novel syndromic surveillance systems for the strengthening of health security initiatives over the past two decades: 1) the Program for Monitoring Emerging Diseases (ProMED-mail); 2) the Global Public Health Intelligence Network (GPHIN); and 3) HealthMap. The article demonstrates how each newly introduced syndromic surveillance system has become increasingly oriented towards the integration of digital algorithms into core surveillance capacities, in order to continually harness, and forecast from, ever-expanding streams of digital, open-source data potentially indicative of forthcoming pandemic threats. The article argues that the increased centrality of the algorithm within these next-generation syndromic surveillance systems produces a new and distinct form of infectious disease surveillance for the governing of emergent pathogenic contingencies. Conceptually, the article shows how the rise of this algorithmic mode of infectious disease surveillance produces divergences in the governmental rationalities of global health security, leading to the rise of an algorithmic governmentality within contemporary contexts of Big Data and these surveillance systems. Empirically, the article demonstrates how this new form of algorithmic infectious disease surveillance has been rapidly integrated into diplomatic, legal, and political frameworks to strengthen the practice of global health security, producing subtle yet distinct shifts in the outbreak notification and reporting transparency of states, which are increasingly scrutinized by the algorithmic gaze of syndromic surveillance.

Keywords: algorithms, global health, pandemic, surveillance

Procedia PDF Downloads 166
2774 Acculturation Impact on Mental Health Among Arab Americans

Authors: Sally Kafelghazal

Abstract:

Introduction: Arab Americans, who include immigrants, refugees, and U.S.-born persons of Middle Eastern or North African descent, may experience significant difficulties during acculturation to Western society. Influential stressors include relocation, loss of social support, language barriers, and economic factors, all of which can affect mental health. There is limited research investigating the effects of acculturation on the mental health of the Arab American population. Objectives: The purpose of this study is to identify ways in which acculturation affects the mental health of Arab Americans, specifically the development of depression and anxiety. Method: A literature search was conducted using PubMed and PsycArticles (ProQuest) with the following search terms: “Arab Americans,” “Arabs,” “mental health,” “depression,” “anxiety,” “acculturation.” Thirty-nine articles were identified, of which nine specifically investigated the relationship between acculturation and mental health in Arab Americans; three of the nine focused exclusively on depression. Results: Several risk factors were identified that contribute to poor mental health associated with acculturation, including immigrant or refugee status, experiences of discrimination, and religious ideology. Protective factors include greater levels of acculturation, being U.S.-born, and greater heritage identity. Higher rates of mental health disorders, particularly depression, were identified in Arab Americans compared to normative samples; none of the articles specifically addressed anxiety. Conclusion: The current research findings support a potential association between the process of acculturation and higher levels of mental health disorders in Arab Americans. However, the diversity of the Arab American population makes it difficult to draw consistent conclusions.
Further research needs to be conducted to assess which subgroups of the Arab American population are at highest risk of developing new, or exacerbating existing, mental health disorders, so that more effective interventions can be devised.

Keywords: Arab Americans, Arabs, mental health, anxiety, depression, acculturation

Procedia PDF Downloads 67
2773 Regret-Regression for Multi-Armed Bandit Problem

Authors: Deyadeen Ali Alshibani

Abstract:

In the literature, the multi-armed bandit problem is framed as a statistical decision model of an agent trying to optimize its decisions while improving its information at the same time. Several algorithmic models, and applications thereof, have been proposed for this problem. In this paper, we evaluate regret-regression by comparing it with the Q-learning method. A simulation on the determination of an optimal treatment regime is presented in detail.
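The quantity the compared methods are judged on, cumulative regret, can be illustrated with a small simulation. Epsilon-greedy here is only a stand-in baseline (the paper compares regret-regression with Q-learning), and the Bernoulli arms, horizon, and epsilon value are assumptions for the sketch.

```python
import random

def eps_greedy_regret(means, eps=0.1, horizon=2000, seed=0):
    """Simulate epsilon-greedy on Bernoulli arms and return cumulative
    pseudo-regret: the sum over rounds of (best mean - chosen arm's mean)."""
    rng = random.Random(seed)
    counts = [0] * len(means)
    values = [0.0] * len(means)      # running empirical mean per arm
    best = max(means)
    regret = 0.0
    for _ in range(horizon):
        if rng.random() < eps or 0 in counts:
            # Explore: pull each arm once first, then pick at random.
            arm = counts.index(0) if 0 in counts else rng.randrange(len(means))
        else:
            # Exploit: pull the arm with the highest empirical mean.
            arm = max(range(len(means)), key=values.__getitem__)
        reward = 1.0 if rng.random() < means[arm] else 0.0
        counts[arm] += 1
        values[arm] += (reward - values[arm]) / counts[arm]
        regret += best - means[arm]
    return regret
```

A regret-regression or Q-learning agent would be dropped into the same loop in place of the epsilon-greedy choice, which is what makes cumulative regret a common yardstick for such comparisons.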

Keywords: optimal, bandit problem, optimization, dynamic programming

Procedia PDF Downloads 439
2772 Efficacy of Celecoxib Adjunct Treatment on Bipolar Disorder: Systematic Review and Meta-Analysis

Authors: Daniela V. Bavaresco, Tamy Colonetti, Antonio Jose Grande, Francesc Colom, Joao Quevedo, Samira S. Valvassori, Maria Ines da Rosa

Abstract:

Objective: We performed a systematic review and meta-analysis to evaluate the potential effect of adjunctive treatment with the cyclo-oxygenase (Cox)-2 inhibitor Celecoxib in bipolar disorder (BD), based on randomized controlled trials. Method: A search of the electronic databases MEDLINE, EMBASE, Scopus, the Cochrane Central Register of Controlled Trials (CENTRAL), BioMed Central, Web of Science, IBECS, LILACS, PsycINFO (American Psychological Association), congress abstracts, and grey literature (Google Scholar and the British Library) was conducted for studies published from January 1990 to February 2018. A search strategy was developed using the terms 'bipolar disorder', 'bipolar mania', 'bipolar depression', 'bipolar mixed', 'bipolar euthymic', 'Celecoxib', 'cyclooxygenase-2 inhibitors', and 'Cox-2 inhibitors' as text words and Medical Subject Headings (i.e., MeSH and EMTREE). The therapeutic effects of adjunctive treatment with Celecoxib were analyzed, and it was possible to carry out a meta-analysis of three of the studies included in the systematic review. The meta-analysis was performed on the final scores of the Young Mania Rating Scale (YMRS) at the end of the randomized controlled trials (RCTs). Results: Three primary studies were included in the systematic review, with a total of 121 patients. The meta-analysis showed a significant effect on YMRS scores in patients with BD who received adjuvant Celecoxib compared with placebo: the weighted mean difference was 5.54 (95% CI = 3.26-7.82; p < 0.001; I² = 0%). Conclusion: The systematic review suggests that adjuvant treatment with Celecoxib improves the response to the main treatments in patients with BD compared with adjuvant placebo.
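The inverse-variance pooling behind a fixed-effect weighted mean difference (appropriate here, given I² = 0%) can be sketched as follows. The study values in the example are made up for illustration, not the trial data from this review.

```python
def fixed_effect_wmd(studies):
    """Inverse-variance fixed-effect pooling of mean differences.

    Each study is a (mean_difference, standard_error) pair; returns the
    pooled weighted mean difference and its 95% confidence interval.
    """
    # Each study is weighted by the inverse of its variance.
    weights = [1.0 / se ** 2 for _, se in studies]
    pooled = sum(w * md for (md, _), w in zip(studies, weights)) / sum(weights)
    se_pooled = (1.0 / sum(weights)) ** 0.5
    return pooled, (pooled - 1.96 * se_pooled, pooled + 1.96 * se_pooled)
```

With heterogeneity (I² well above 0%), a random-effects model such as DerSimonian-Laird, which widens the weights by a between-study variance term, would be preferred.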

Keywords: bipolar disorder, Cox-2 inhibitors, Celecoxib, systematic review, meta-analysis

Procedia PDF Downloads 474
2771 Through Additive Manufacturing: A New Perspective for the Mass Production of Made in Italy Products

Authors: Elisabetta Cianfanelli, Paolo Pupparo, Maria Claudia Coppola

Abstract:

Recent evolutions in innovation processes and in the intrinsic tendencies of the product development process lead to new considerations on the design flow. The instability and complexity that describe contemporary life define new problems in the production of products, stimulating at the same time the adoption of new solutions across the entire design process. The advent of additive manufacturing, but also of IoT and AI technologies, continuously puts us in front of new paradigms regarding design as a social activity. From the point of view of application, these technologies raise a whole series of problems and considerations immanent to design thinking. Addressing these problems may require some initial intuition and the use of provisional sets of rules or plausible strategies, i.e., heuristic reasoning. At the same time, however, the evolution of digital technology and the computational speed of new design tools describe a new and contrasting design framework in which to operate. It is therefore interesting to understand the opportunities and boundaries of the new human-algorithm relationship. This contribution investigates the human-algorithm relationship starting from the state of the art of the Made in Italy model: the best-known fields of application are described, followed by a focus on specific cases in which the mutual relationship between human and AI becomes a new driving force of innovation for entire production chains. On the other hand, the use of algorithms could absorb many design phases, such as the definition of shape, dimensions, proportions, materials, static verifications, and simulations. Operating in this context therefore becomes a strategic action, capable of defining fundamental choices for the design of product systems in the near future. If there is a human-algorithm combination within a new integrated system, quantitative values can be controlled in relation to qualitative and material values.
The trajectory described thus becomes a new design horizon in which to operate, and within it, it is interesting to highlight the good practices that already exist. In this context, the designer developing new forms can experiment with ways still unexpressed in the project and can define a new synthesis and simplification of algorithms, so that each artifact bears a signature defining all its parts, emotional and structural. This signature of the designer, a combination of values and design culture, will be internal to the algorithms and able to relate to digital technologies, creating a generative dialogue for design purposes. The envisaged result indicates a new vision of digital technologies, no longer understood only as custodians of vast quantities of information, but also as valid integrated tools in close relationship with the design culture.

Keywords: decision making, design heuristics, product design, product design process, design paradigms

Procedia PDF Downloads 102