Search results for: speeded up robust features
1635 Niche Authorities and Social Activism: Interrogating the Activities of Selected Bloggers in Ghana
Authors: Akosua Asantewaa Anane
Abstract:
Social media and its networking sites have become beneficial to society. With the advent of Web 2.0, many people are becoming technologically savvy and attracted to internet-based activities. With the click of a button, users now share more information on topics, events and issues than ever before. A new phenomenon in the Ghanaian journalism sphere is the advent of blogging and citizen journalism, some of whose practitioners have become niche authorities. Niche authorities have emerged through the habitual and persistent curation of news on specific topics, resulting in the steady growth and emergence of valuable contributions to news sharing. Few studies have been conducted on niche authorities and their role in social activism in Ghana. This study, anchored on Cialdini's Six Principles of Persuasion (reciprocation, consistency, social proof, liking, authority and scarcity), explores the features of niche authorities, their areas of expertise, as well as their authoritative voices in the curation of news stories. Using qualitative content analysis, cyber ethnography and thematic analysis of purposively sampled social media posts of five niche authorities, the study interrogates how these niche authorities employ the six principles of persuasion on their platforms to spark conversations on development, social inclusion and gender-based issues in the country. The study discusses how niche authorities deploy the principles in social activism and further recommends nurturing and mentoring communication strategies to progressively guide the youth to become future niche authorities in news curation and news sharing.
Keywords: social activism, Cialdini's six principles of persuasion, news curation, niche authorities
1634 Modelling and Numerical Analysis of Thermal Non-Destructive Testing on Complex Structure
Authors: Y. L. Hor, H. S. Chu, V. P. Bui
Abstract:
Composite material is widely used to replace conventional material, especially in the aerospace industry, to reduce the weight of devices. It is formed by combining reinforcing materials via adhesive bonding to produce a bulk material with altered macroscopic properties. In bulk composites, degradation may occur at the microscopic scale, within each individual reinforced fiber layer or, especially, in its matrix layer, in the form of delamination, inclusions, disbonds, voids, cracks, and porosity. In this paper, we focus on the detection of defects in the matrix layer, where the composite plies are in contact but coupled only through a weak bond. In practice, such adhesive defects are tested through various nondestructive methods. Among them, pulsed phase thermography (PPT) has shown some advantages, providing improved sensitivity, large-area coverage, and high-speed testing. The aim of this work is to develop an efficient numerical model to study the application of PPT to the nondestructive inspection of weak bonding in composite material. The resulting thermal evolution field comprises internal reflections between the interfaces of defects and the specimen, and the key features of the defects present in the material can be obtained from the investigation of the thermal evolution of the field distribution. Computational simulation of such inspections has allowed the techniques to be improved and applied to various inspections, such as materials with high thermal conductivity and more complex structures.
Keywords: pulsed phase thermography, weak bond, composite, CFRP, computational modelling, optimization
1633 Automatic Aggregation and Embedding of Microservices for Optimized Deployments
Authors: Pablo Chico De Guzman, Cesar Sanchez
Abstract:
Microservices are a software development methodology in which applications are built by composing a set of independently deployable, small, modular services. Each service runs as a unique process and is instantiated and deployed on one or more machines (we assume that different microservices are deployed onto different machines). Microservices are becoming the de facto standard for developing distributed cloud applications due to their reduced release cycles. In principle, the responsibility of a microservice can be as simple as implementing a single function, which can lead to the following issues: - Resource fragmentation due to the virtual machine boundary. - Poor communication performance between microservices. Two composition techniques can be used to optimize resource fragmentation and communication performance: aggregation and embedding of microservices. Aggregation allows the deployment of a set of microservices on the same machine using a proxy server. Aggregation helps to reduce resource fragmentation and is particularly useful when the aggregated services have a similar scalability behavior. Embedding deals with communication performance by deploying on the same virtual machine those microservices that require a communication channel (localhost bandwidth is reported to be about 40 times faster than cloud vendor local networks, and it offers better reliability). Embedding can also reduce dependencies on load balancer services, since the communication takes place on a single virtual machine. For example, assume that microservice A has two instances, a1 and a2, and it communicates with microservice B, which also has two instances, b1 and b2. One embedding deploys a1 and b1 on machine m1, and a2 and b2 on a different machine m2. This deployment configuration allows each pair (a1-b1), (a2-b2) to communicate using the localhost interface without the need for a load balancer between microservices A and B. Aggregation and embedding techniques are complex, since different microservices might have incompatible runtime dependencies which forbid them from being installed on the same machine. There is also a security concern, since the attack surface between microservices can be larger. Luckily, container technology allows several processes to run on the same machine in an isolated manner, solving the incompatibility of runtime dependencies and the previous security concern, thus greatly simplifying aggregation/embedding implementations by just deploying a microservice container on the same machine as the aggregated/embedded microservice container. Therefore, a wide variety of deployment configurations can be described by combining aggregation and embedding to create an efficient and robust microservice architecture. This paper presents a formal method that receives a declarative definition of a microservice architecture and proposes different optimized deployment configurations by aggregating/embedding microservices. The first prototype is based on i2kit, a deployment tool also submitted to ICWS 2018. The proposed prototype optimizes the following parameters: network/system performance, resource usage, resource costs and failure tolerance.
Keywords: aggregation, deployment, embedding, resource allocation
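To make the embedding idea concrete, here is a minimal Python sketch (all names hypothetical; this is not the i2kit prototype) that pairs up the instances of two communicating microservices so that each pair is co-located on one machine and talks over localhost:

```python
from itertools import zip_longest

def embed(service_a, service_b):
    """Pair instances of two communicating microservices so that each
    pair is co-located on one machine and communicates over localhost."""
    plan = {}
    for i, (a, b) in enumerate(zip_longest(service_a, service_b), start=1):
        machine = f"m{i}"
        plan[machine] = [inst for inst in (a, b) if inst is not None]
    return plan

# Example from the abstract: A = {a1, a2} communicates with B = {b1, b2}.
print(embed(["a1", "a2"], ["b1", "b2"]))
# {'m1': ['a1', 'b1'], 'm2': ['a2', 'b2']} -- no load balancer is needed
# between A and B, since each pair talks via the localhost interface.
```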
1632 Wireworms under the Sword of Damocles: Attraction to Maize Root Volatiles
Authors: Diana La Forgia, Jean Baptiste Thibord, François Verheggen
Abstract:
Volatile organic compounds (VOCs) are one of the many defenses used by plants in their eternal fight against pests. Their main role is to attract the natural enemies of herbivores. On the other hand, they can be used by those same herbivores to locate plants while foraging. In an attempt to fill a gap of knowledge in a complex web of interactions, we focused on wireworms (Coleoptera: Elateridae). Wireworms, the root-feeding larvae of click beetles, are among the most widespread pests of valuable crops such as maize and potatoes, causing important economic damage. Little is known about the root compounds that play a role in the attraction of the larvae. In order to learn more about these compounds, we compared four different maize varieties (Zea mays mays) that are known to have different levels of attraction, from weak to strong, for wireworms in fields. We tested the attraction of larvae under laboratory conditions in dual-choice olfactometer assays in which they were offered all possible combinations of the four maize varieties. Concurrently, we collected the VOCs of each variety over 24 h using a push-pull system. The collected samples were then analyzed by gas chromatography coupled with mass spectrometry (GC-MS) to identify their molecular profiles. The choice of the larvae depended on the offered combination, and some varieties were preferred over others. Differences were also observed in the quantitative and qualitative emissions of the volatile profiles of the maize varieties. Our aim is to develop traps based on VOCs from maize roots to open a new frontier in wireworm management.
Keywords: integrated pest management, maize roots, plant defense, volatile organic compounds, wireworms
1631 Corporate Governance and Disclosure Quality: Taxonomy of Tunisian Listed Firms Using the Decision Tree Method Based Approach
Authors: Wided Khiari, Adel Karaa
Abstract:
This study aims to establish a typology of Tunisian listed firms according to their corporate governance characteristics and disclosure quality. The paper uses disclosure scores to examine the corporate governance practices of Tunisian listed firms. A content analysis of 46 Tunisian listed firms from 2001 to 2010 was carried out, and a disclosure index was developed to determine the level of disclosure of the companies. Disclosure quality is assessed through the quantity and also through the nature (type) of information disclosed. Applying the decision tree method, the obtained tree diagrams provide a way to read off the governance characteristics associated with a particular firm's level of disclosure. The results show that the corporate governance characteristics that achieve good disclosure quality are not unique for all firms. These structures do not necessarily follow all of the recommendations of best practices, but they converge towards the best combination. Indeed, in practice, there are companies which have a good quality of disclosure but are not well governed; however, we expect that by improving their governance systems, their level of disclosure could improve. These findings show, in a general way, a convergence towards the standards of corporate governance, with a few exceptions related to the specificity of Tunisian listed firms, and show the need for the adoption of a code for each context. These findings shed light on the corporate governance features that enhance incentives for good disclosure. They allow identifying, for each firm and at any date, the corporate governance determinants of disclosure quality. More specifically, and all else being equal, the obtained tree provides a decision rule with which a company's level of disclosure can be predicted from certain characteristics of the governance strategy it has adopted.
Keywords: corporate governance, disclosure, decision tree, economics
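As an illustration of the decision tree method described above, the following scikit-learn sketch uses hypothetical governance features (board size, CEO duality, board independence, audit committee presence) and a two-class disclosure label; the study's actual index and variables are not reproduced here:

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier, export_text

# Hypothetical governance features per firm-year: board size, CEO duality
# (0/1), board independence ratio, audit committee presence (0/1).
X = np.array([[9, 0, 0.6, 1], [5, 1, 0.2, 0], [11, 0, 0.7, 1],
              [6, 1, 0.3, 0], [8, 0, 0.5, 1], [4, 1, 0.1, 0]])
y = ["high", "low", "high", "low", "high", "low"]  # disclosure quality class

tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)
# The printed rules act as the "decision rule" mentioned in the abstract:
# given a firm's governance characteristics, read off its expected
# level of disclosure.
print(export_text(tree, feature_names=[
    "board_size", "ceo_duality", "board_independence", "audit_committee"]))
```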
1630 Enhancing Athlete Training Using Real-Time Pose Estimation with Neural Networks
Authors: Jeh Patel, Chandrahas Paidi, Ahmed Hambaba
Abstract:
Traditional methods for analyzing athlete movement often lack the detail and immediacy required for optimal training. This project aims to address this limitation by developing a real-time human pose estimation system specifically designed to enhance athlete training across various sports. The system leverages the power of convolutional neural networks (CNNs) to provide a comprehensive and immediate analysis of an athlete's movement patterns during training sessions. The core architecture utilizes dilated convolutions to capture crucial long-range dependencies within video frames, combined with a robust encoder-decoder architecture to further refine pose estimation accuracy. This capability is essential for precise joint localization across the diverse range of athletic poses encountered in different sports. Furthermore, by quantifying movement efficiency, power output, and range of motion, the system provides data-driven insights that can be used to optimize training programs. Pose estimation data analysis can also be used to develop personalized training plans that target specific weaknesses identified in an athlete's movement patterns. To overcome the limitations posed by outdoor environments, the project employs strategies such as multi-camera configurations or depth sensing techniques, which can enhance pose estimation accuracy in challenging lighting and occlusion scenarios. A dataset was collected from the Martin Luther King labs at San Jose State University. The system is evaluated through a series of tests that measure its efficiency and accuracy in real-world scenarios. Results indicate a high level of precision in recognizing different poses, substantiating the potential of this technology in practical applications. Challenges such as enhancing the system's ability to operate in varied environmental conditions and further expanding the training dataset were identified and discussed. Future work will refine the model's adaptability and incorporate haptic feedback to enhance the interactivity and richness of the user experience. This project demonstrates the feasibility of an advanced pose detection model and lays the groundwork for future innovations in assistive enhancement technologies.
Keywords: computer vision, deep learning, human pose estimation, U-NET, CNN
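A minimal PyTorch sketch of the kind of architecture described, a small encoder-decoder with dilated convolutions that outputs one heatmap per joint, is shown below; the layer sizes and joint count are illustrative assumptions, not the authors' network:

```python
import torch
import torch.nn as nn

class PoseNet(nn.Module):
    """Toy encoder-decoder with dilated convolutions that outputs one
    heatmap per body joint (sizes illustrative, not the paper's model)."""
    def __init__(self, n_joints=17):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
        )
        # Dilated convolutions enlarge the receptive field to capture
        # long-range dependencies without further downsampling.
        self.dilated = nn.Sequential(
            nn.Conv2d(64, 64, 3, padding=2, dilation=2), nn.ReLU(),
            nn.Conv2d(64, 64, 3, padding=4, dilation=4), nn.ReLU(),
        )
        self.decoder = nn.Sequential(
            nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(32, n_joints, 4, stride=2, padding=1),
        )

    def forward(self, x):
        return self.decoder(self.dilated(self.encoder(x)))

heatmaps = PoseNet()(torch.randn(1, 3, 256, 256))
print(heatmaps.shape)  # torch.Size([1, 17, 256, 256]); the argmax of each
# heatmap gives the estimated (x, y) location of one joint.
```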
1629 An Evolutionary Approach for QAOA for Max-Cut
Authors: Francesca Schiavello
Abstract:
This work aims to create a hybrid algorithm combining the Quantum Approximate Optimization Algorithm (QAOA) with an Evolutionary Algorithm (EA) in place of traditional gradient-based optimization processes. QAOAs were first introduced in 2014, when their algorithm performed better than the best known classical algorithm of the time for Max-Cut graphs. Whilst classical algorithms have improved since then and have returned to being faster and more efficient, this was a huge milestone for quantum computing, and that work is often used as a benchmarking tool and a foundation for exploring variants of QAOAs. This, alongside other famous algorithms like Grover's or Shor's, highlights to the world the potential that quantum computing holds. It also presents the possibility of a real quantum advantage where, if the hardware continues to improve, this could constitute a revolutionary era. Given that the hardware is not there yet, many scientists are working on the software side of things in the hope of future progress. Some of the major limitations holding back quantum computing are the quality of qubits and the noisy interference they generate when creating solutions, the barren plateaus that effectively hinder the optimization search in the latent space, and the limited number of available qubits, which restricts the scale of the problems that can be solved. These three issues are intertwined and are part of the motivation for using EAs in this work. Firstly, EAs are not based on gradient or linear optimization methods for the search in the latent space, and because of their freedom from gradients, they should suffer less from barren plateaus. Secondly, given that this algorithm performs a search in the solution space through a population of solutions, it can also be parallelized to speed up the search and optimization. The evaluation of the cost function, as in many other algorithms, is notoriously slow, and the ability to parallelize it can drastically improve the competitiveness of QAOAs with respect to purely classical algorithms. Thirdly, because of the nature and structure of EAs, solutions can be carried forward in time, making them more robust to noise and uncertainty. Preliminary results show that the EA attached to QAOA can perform on par with the traditional QAOA using a COBYLA optimizer, which is a linear-approximation-based method, and in some instances it can even find a better max cut. Whilst the final objective of the work is to create an algorithm that can consistently beat the original QAOA, or its variants, through either speedups or solution quality, this initial result is promising and shows the potential of EAs in this field. Further tests need to be performed on an array of different graphs, with the parallelization aspect of the work commencing in October 2023 and tests on real hardware scheduled for early 2024.
Keywords: evolutionary algorithm, max cut, parallel simulation, quantum optimization
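The following Python sketch illustrates the gradient-free evolutionary loop over the QAOA angles; the cost function here is a stand-in toy landscape, whereas the real algorithm would evaluate the Max-Cut expectation on a quantum simulator or device:

```python
import numpy as np

rng = np.random.default_rng(0)
P = 3  # QAOA depth: 2*P angles (gamma_1..P, beta_1..P)

def cost(angles):
    """Stand-in for the QAOA cost (negative expected cut value). In
    practice this would run the parameterized circuit on a simulator or
    QPU; this toy landscape is purely for illustration."""
    return -np.sum(np.sin(angles) ** 2)

def evolve(pop_size=20, generations=100, sigma=0.3):
    # Gradient-free (mu + lambda) evolution over the 2*P angle vector.
    pop = rng.uniform(0, np.pi, size=(pop_size, 2 * P))
    for _ in range(generations):
        children = pop + rng.normal(0, sigma, pop.shape)   # mutation
        both = np.vstack([pop, children])
        fitness = np.array([cost(t) for t in both])
        pop = both[np.argsort(fitness)[:pop_size]]         # survivor selection
    return pop[0]

best = evolve()
print(best, cost(best))  # the per-individual cost evaluations in each
# generation are independent, so they parallelize naturally.
```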
1628 The Debureaucratization Strategy for the Portuguese Health Service through Effective Communication
Authors: Fernando Araujo, Sandra Cardoso, Fátima Fonseca, Sandra Cavaca
Abstract:
A debureaucratization strategy for the Portuguese Health Service was adopted by the Executive Board of the SNS, in close articulation with the Shared Services of the Ministry of Health. Two of the main dimensions focused on sick leaves (SL), which turn primary health care (PHC) units into administrative institutions and limit access for patients. The self-declaration of illness (SDI) project, through the National Health Service Contact Centre (SNS24), began on May 1, 2023, and has already resulted in the issuance of more than 300,000 SDIs without the need to allocate resources from the National Health Service (NHS). This political decision allows each citizen, a maximum of 2 times per year and 3 days each time, if ill, to report their health condition in a dematerialized way, on their own responsibility, and thereby justify their absence from work, although under Portuguese law no salary is paid for these first three days. With this digital approach, it is now feasible to obtain an SL without going to a PHC unit and occupying its time solely for that purpose. Through this measure, bureaucracy has been reduced, and the system has been refocused on users, improving the lives of citizens and reducing the administrative burden on PHC, which now has more consultation time for the users who need it. The second initiative, which began on March 1, 2024, allows SLs to be issued in the emergency departments (ED) of public hospitals and in the health institutions of the social and private sectors. This project allows a user who has suffered an acute urgent illness and has been observed in the ED of a public hospital, or in a private or social entity, to no longer need to go to PHC solely to apply for the respective SL. Since March 1, 54,453 SLs have been issued: 242 in private or social sector institutions, 6,918 in public hospitals (of which 134 were in the ED) and 47,292 in PHC. This approach has proven to be technically robust, allows immediate resolution of problems and differentiates the work of doctors. However, it is important to continue to improve the proper functioning of the ED, preventing non-urgent users from going there only to obtain an SL. Thus, in order to make better use of existing resources, this extension of issuance was operationalized in a balanced way, allowing SLs to be issued in hospital EDs only to critically ill patients or patients referred by INEM, SNS24, or PHC. In both cases, an intense public campaign was implemented to explain how the measures work and their benefits for patients. In satisfaction surveys, more than 95% of patients and doctors were satisfied with the solutions, asking for extensions to other areas. The administrative simplification agenda of the NHS continues its effective development. For the success of this debureaucratization agenda, the key factors are effective communication and the ability to reach patients and health professionals in order to increase health literacy and the correct use of the NHS.
Keywords: debureaucratization strategy, self-declaration of illness, sick leaves, SNS24
1627 Composite Approach to Extremism and Terrorism Web Content Classification
Authors: Kolade Olawande Owoeye, George Weir
Abstract:
Terrorism and extremism activities on the internet are becoming some of the most significant threats to national security because of their potential dangers. In response to this challenge, law enforcement and security authorities are actively implementing comprehensive measures to counter the use of the internet for terrorism. These measures require intelligence gathering via the internet, including real-time monitoring of potential websites used by extremist groups for recruitment and information dissemination, among other operations. However, with billions of active webpages, real-time monitoring of all webpages becomes almost impossible. To narrow down the search domain, efficient webpage classification techniques are needed. This research proposes a new approach, tagged the SentiPosit-based method, which combines features of the Posit-based method and the SentiStrength-based method for the classification of terrorism and extremism webpages. The experiment was carried out on 7,500 webpages obtained through the TENE web crawler by the International Cyber Crime Research Centre (ICCRC). The webpages were manually grouped into three classes, 'pro-extremist', 'anti-extremist' and 'neutral', with 2,500 webpages in each category. A supervised learning algorithm was then applied to the labelled dataset in order to build the model. The results obtained were compared with existing classification methods using prediction accuracy and runtime. It was observed that our proposed hybrid approach produced better classification accuracy than existing approaches within a reasonable runtime.
Keywords: sentiposit, classification, extremism, terrorism
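The SentiPosit feature extraction itself is not specified in the abstract; as a generic stand-in, this scikit-learn sketch shows the shape of the supervised three-class webpage classification step on a toy corpus:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Tiny illustrative corpus; the study used 7,500 manually labelled webpages.
pages = ["join our struggle against the state",
         "we condemn all violent extremism",
         "local weather forecast for the weekend"]
labels = ["pro-extremist", "anti-extremist", "neutral"]

# Generic text features standing in for the Posit/SentiStrength features.
model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)),
                      LogisticRegression(max_iter=1000))
model.fit(pages, labels)
print(model.predict(["a call to take up arms"]))
```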
1626 Development and Validation of a Green Analytical Method for the Analysis of Daptomycin Injectable by Fourier-Transform Infrared Spectroscopy (FTIR)
Authors: Eliane G. Tótoli, Hérida Regina N. Salgado
Abstract:
Daptomycin is an important antimicrobial agent used in clinical practice nowadays, since it is very active against some Gram-positive bacteria that pose particular challenges for medicine, such as methicillin-resistant Staphylococcus aureus (MRSA) and vancomycin-resistant Enterococci (VRE). The importance of environmental preservation has received special attention in recent years. Considering the evident need to protect the natural environment and the introduction of strict quality requirements regarding analytical procedures used in pharmaceutical analysis, industries must seek environmentally friendly alternatives for the analytical methods and other processes that they follow in their routine. In view of these factors, green analytical chemistry is prevalent and encouraged nowadays. In this context, infrared spectroscopy stands out. It is a method that does not use organic solvents and, although it is formally accepted for the identification of individual compounds, it also allows the quantitation of substances. Considering that there are few green analytical methods described in the literature for the analysis of daptomycin, the aim of this work was the development and validation of a green analytical method for the quantification of this drug in lyophilized powder for injectable solution by Fourier-transform infrared spectroscopy (FT-IR). Method: Translucent potassium bromide pellets containing predetermined amounts of the drug were prepared and subjected to spectrophotometric analysis in the mid-infrared region. After obtaining the infrared spectrum, and with the assistance of the IR Solution software, quantitative analysis was carried out in the spectral region between 1575 and 1700 cm-1, corresponding to a carbonyl band of the daptomycin molecule, whose height was analyzed in terms of absorbance. The method was validated according to ICH guidelines regarding linearity, precision (repeatability and intermediate precision), accuracy and robustness. Results and discussion: The method proved to be linear (r = 0.9999), precise (RSD% < 2.0), accurate and robust over a concentration range from 0.2 to 0.6 mg/pellet. In addition, this technique does not use organic solvents, which is one great advantage over the most common analytical methods. This fact contributes to minimizing the generation of organic solvent waste by the industry and thereby reduces the impact of its activities on the environment. Conclusion: The validated method proved to be adequate to quantify daptomycin in lyophilized powder for injectable solution and can be used for its routine analysis in quality control. In addition, the proposed method is environmentally friendly, which is in line with the global trend.
Keywords: daptomycin, Fourier-transform infrared spectroscopy, green analytical chemistry, quality control, spectrometry in IR region
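A calibration workflow of the kind validated here can be sketched in a few lines of Python; the amounts and absorbances below are made up for illustration, not the study's data:

```python
import numpy as np
from scipy import stats

# Hypothetical calibration data: drug amount per KBr pellet (mg) vs the
# height (absorbance) of the carbonyl band in the 1575-1700 cm-1 region.
amount = np.array([0.2, 0.3, 0.4, 0.5, 0.6])          # mg/pellet
absorbance = np.array([0.101, 0.152, 0.198, 0.251, 0.303])

fit = stats.linregress(amount, absorbance)
print(f"r = {fit.rvalue:.4f}")                         # linearity check

# Quantify an unknown sample from its measured band height.
unknown_abs = 0.225
print(f"{(unknown_abs - fit.intercept) / fit.slope:.3f} mg/pellet")
```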
1625 Highly Glazed Office Spaces: Simulated Visual Comfort vs Real User Experiences
Authors: Zahra Hamedani, Ebrahim Solgi, Henry Skates, Gillian Isoardi
Abstract:
Daylighting plays a pivotal role in promoting productivity and user satisfaction in office spaces. There is an ongoing trend of designing office buildings with a high proportion of glazing, which increases the risk of visual discomfort. Providing a more realistic lighting analysis can be of high value at the early stages of building design, when necessary changes can be made at very low cost. This holistic approach can be achieved by incorporating subjective evaluation and user behaviour into computer simulation to provide a comprehensive lighting analysis. In this research, a detailed computer simulation model was built using Radiance and Daysim. Afterwards, this model was validated by measurements and user feedback. The case study building is the school of science at Griffith University, Gold Coast, Queensland, which features highly glazed office spaces. In this paper, the visual comfort predicted by the model is compared with a preliminary survey of the building users to evaluate how user behaviour, such as desk position, orientation selection, and user movement caused by daylight changes and other visual variations, can inform perceptions of visual comfort. This work supports preliminary design analysis of visual comfort, incorporating the effects of gaze shift patterns and views, with the goal of designing effective layouts for office spaces.
Keywords: lighting simulation, office buildings, user behaviour, validation, visual comfort
1624 Hand Gesture Recognition for Sign Language: A New Higher Order Fuzzy HMM Approach
Authors: Saad M. Darwish, Magda M. Madbouly, Murad B. Khorsheed
Abstract:
Sign languages (SL) are the most accomplished forms of gestural communication. Their automatic analysis is therefore a real challenge, one that extends to their lexical and syntactic levels of organization. Hidden Markov models (HMMs) have been used prominently and successfully in speech recognition and, more recently, in handwriting recognition. Consequently, they seem ideal for the visual recognition of complex, structured hand gestures such as are found in sign language. In this paper, several results concerning static hand gesture recognition using an algorithm based on Type-2 Fuzzy HMMs (T2FHMM) are presented. The features used as observables in the training as well as in the recognition phases are based on Singular Value Decomposition (SVD). SVD extends eigendecomposition to non-square matrices and is used to reduce multi-attribute hand gesture data to feature vectors. SVD optimally exposes the geometric structure of a matrix. In our approach, we replace the basic HMM arithmetic operators with adequate Type-2 fuzzy operators, which permits us to relax the additive constraint of probability measures. Therefore, T2FHMMs are able to handle both the random and fuzzy uncertainties that exist universally in sequential data. Experimental results show that T2FHMMs can effectively handle noise and dialect uncertainties in hand signals and achieve better classification performance than classical HMMs. The recognition rate of the proposed system is 100% for uniform hand images and 86.21% for cluttered hand images.
Keywords: hand gesture recognition, hand detection, type-2 fuzzy logic, hidden Markov model
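A minimal sketch of SVD-based feature extraction, assuming the features are the leading singular values of the segmented gesture image (the paper's exact construction may differ):

```python
import numpy as np

def svd_features(image, k=10):
    """Reduce a grayscale hand-gesture image to a k-dimensional feature
    vector of its largest singular values -- a common SVD feature scheme;
    this is an illustrative assumption, not the paper's exact recipe."""
    s = np.linalg.svd(image, compute_uv=False)  # singular values, descending
    return s[:k] / np.linalg.norm(s)            # normalized feature vector

gesture = np.random.rand(64, 64)  # stand-in for a segmented hand image
obs = svd_features(gesture)
print(obs.shape)  # (10,) -- used as the observation vector fed to the
# (type-2 fuzzy) HMM in place of the raw image.
```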
1623 Use of Front-Face Fluorescence Spectroscopy and Multiway Analysis for the Prediction of Olive Oil Quality Features
Authors: Omar Dib, Rita Yaacoub, Luc Eveleigh, Nathalie Locquet, Hussein Dib, Ali Bassal, Christophe B. Y. Cordella
Abstract:
The potential of front-face fluorescence coupled with chemometric techniques, namely parallel factor analysis (PARAFAC) and multiple linear regression (MLR), as a rapid analysis tool to characterize Lebanese virgin olive oils was investigated. Fluorescence fingerprints were acquired directly on 102 Lebanese virgin olive oil samples in the range of 280-540 nm in excitation and 280-700 nm in emission. A PARAFAC model with seven components was considered optimal, with a residual of 99.64% and a core consistency value of 78.65. The model revealed seven main fluorescence profiles in olive oil, mainly associated with tocopherols, polyphenols, chlorophyllic compounds and oxidation/hydrolysis products. Twenty-three MLR models based on PARAFAC scores were generated, the majority of which showed a good correlation coefficient (R > 0.7 for 12 predicted variables), and thus satisfactory prediction performance. Acid value, peroxide value, and Delta K yielded the models with the best predictions, with R values of 0.89, 0.84 and 0.81, respectively. Among fatty acids, linoleic and oleic acids were also highly predicted, with R values of 0.8 and 0.76, respectively. The factors contributing to the models' construction were related to common fluorophores found in olive oil, mainly chlorophyll, polyphenols, and oxidation products. This study demonstrates the interest of front-face fluorescence as a promising tool for the quality control of Lebanese virgin olive oils.
Keywords: front-face fluorescence, Lebanese virgin olive oils, multiple linear regression, PARAFAC analysis
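A sketch of the PARAFAC-plus-MLR pipeline using the tensorly and scikit-learn libraries is shown below, with a synthetic stand-in for the excitation-emission data cube; the array sizes and the predicted variable are illustrative assumptions:

```python
import numpy as np
import tensorly as tl
from tensorly.decomposition import parafac
from sklearn.linear_model import LinearRegression

# Synthetic stand-in for the fluorescence data cube:
# 102 samples x excitation wavelengths x emission wavelengths.
eem = tl.tensor(np.random.rand(102, 27, 43))

# Seven-component PARAFAC, as retained in the study.
cp = parafac(eem, rank=7)
scores = tl.to_numpy(cp.factors[0])      # per-sample component scores

# MLR from PARAFAC scores to a quality feature (here: fake acid values).
acid_value = np.random.rand(102)
mlr = LinearRegression().fit(scores, acid_value)
print(f"R^2 = {mlr.score(scores, acid_value):.2f}")
```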
1622 Tolerance of Ambiguity in Relation to Listening Performance across Learners of Various Linguistic Backgrounds
Authors: Amin Kaveh Boukani
Abstract:
Foreign language learning is not straightforward and can be affected by numerous factors, among which personality features like tolerance of ambiguity (TA) are well known and important. Yet such characteristics can themselves be affected by other factors, like learning additional languages. The current investigation thus opted to explore the possible effect of linguistic background (being bilingual or trilingual) on the tolerance of ambiguity of Iranian EFL learners. Furthermore, the possible mediating effect of TA on multilingual learners' language performance (listening comprehension in this study) was examined. This research involved 68 EFL learners (32 bilinguals, 29 trilinguals) with an age range of 19-29, pursuing their degrees in the Department of English Language and Literature of Urmia University. A set of questionnaires, including tolerance of ambiguity (Herman et al., 2010) and linguistic background information (Modirkhameneh, 2005), as well as the IELTS listening comprehension test, were used for data collection. The results of a set of independent-samples t-tests and mediation analysis (Hayes, 2022) showed that (1) linguistic background (being bilingual or trilingual) had a significant direct effect on EFL learners' TA, (2) linguistic background had a significant direct influence on listening comprehension, (3) TA had a substantial direct influence on listening comprehension, and (4) TA considerably mediated the influence of linguistic background on listening comprehension. These results suggest that multilingualism may be considered an advantageous asset for EFL learners and should be a prioritized characteristic in EFL instruction in multilingual contexts. Further pedagogical implications and suggestions for research are proposed in light of effective EFL instruction in multilingual contexts.
Keywords: tolerance of ambiguity, listening comprehension, multilingualism, bilingual, trilingual
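A simple-mediation sketch in the spirit of the Hayes approach, using two OLS regressions on synthetic data (statsmodels); the indirect effect is the product of paths a and b, and a real analysis would add bootstrapped confidence intervals:

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 68
multilingual = rng.integers(0, 2, n)          # 0 = bilingual, 1 = trilingual
ta = 2 * multilingual + rng.normal(0, 1, n)   # tolerance of ambiguity (fake)
listening = 1.5 * ta + multilingual + rng.normal(0, 1, n)

# Path a: linguistic background -> TA
a = sm.OLS(ta, sm.add_constant(multilingual)).fit().params[1]
# Path b (and direct effect c'): TA and background -> listening
m2 = sm.OLS(listening,
            sm.add_constant(np.column_stack([ta, multilingual]))).fit()
b, direct = m2.params[1], m2.params[2]
print(f"indirect effect a*b = {a * b:.2f}, direct effect c' = {direct:.2f}")
```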
1621 An Automatic Bayesian Classification System for File Format Selection
Authors: Roman Graf, Sergiu Gordea, Heather M. Ryan
Abstract:
This paper presents an approach to the classification of unstructured format descriptions for the identification of file formats. The main contribution of this work is the employment of data mining techniques to support file format selection using just the unstructured text description that comprises the most important format features for a particular organisation. Subsequently, the file format identification method employs a file format classifier and associated configurations to provide digital preservation experts with an estimate of the required file format. Our goal is to make use of a format specification knowledge base aggregated from different Web sources in order to select a file format for a particular institution. Using the naive Bayes method, the decision support system recommends to an expert the file format for their institution. The proposed methods facilitate file format selection and improve the quality of the digital preservation process. The presented approach is meant to facilitate decision making for the preservation of digital content in libraries and archives using domain expert knowledge and specifications of file formats. To facilitate decision making, the aggregated information about the file formats is presented as a file format vocabulary that comprises the most common terms characteristic of all researched formats. The goal is to suggest a particular file format based on this vocabulary for analysis by an expert. A sample file format calculation and the calculation results, including probabilities, are presented in the evaluation section.
Keywords: data mining, digital libraries, digital preservation, file format
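The naive Bayes step can be sketched with scikit-learn as follows; the descriptions and format labels are toy stand-ins for the aggregated knowledge base:

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

# Toy format descriptions (the real knowledge base is aggregated from
# multiple Web sources); the labels are the target file formats.
descriptions = [
    "lossless raster image wide support open specification",
    "lossy raster image photographic compression ubiquitous",
    "page layout fixed document print archival iso standard",
]
formats = ["PNG", "JPEG", "PDF"]

clf = make_pipeline(CountVectorizer(), MultinomialNB())
clf.fit(descriptions, formats)

# An institution's requirements, phrased in the vocabulary's terms.
need = ["archival fixed layout document standard"]
print(clf.predict(need), clf.predict_proba(need).round(2))
```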
1620 The Role of Libraries in the Context of Indian Knowledge Based Society
Authors: Sanjeev Sharma
Abstract:
We are living in the information age. Information is important not only to individuals but also to researchers, scientists, academicians and all others working in their respective fields. The 21st century, also known as the electronic era, has brought several changes to the way libraries work. In the present scenario, the acquisition of information resources and the implementation of new strategies have brought a revolution in libraries' structures and principles. In the digital era, the role of the library has become more important, as new information arrives every minute. The knowledge society wants to access information at its desk. Libraries manage electronic services and web-based information sources continuously and in a democratic way. The basic objective of every library is to save the user's time, which depends on the quality and user-orientation of its services. With the advancement of information and communication technology, libraries should pay closer attention to the development trends of the information society, which would help them adjust their development strategies to the information needs of the knowledge society. The knowledge-based society demands a redefinition of the position and objectives of all institutions that work with information, knowledge, and culture. The situation in the era of digital India is changing at a fast pace. Everyone wants information 24x7, and libraries have been recognized as one of the key elements of open access to information, which is crucial not only to the individual but also to a democratic, knowledge-based information society. Libraries are especially important nowadays, as the whole concept of education increasingly focuses on independent e-learning. The citizens of India must be able to find and use the relevant information. Here libraries enter the stage: the essential function of libraries is to acquire, organize, store and preserve publicly available material, irrespective of the print or non-print form in which it is packaged, in such a way that, when it is needed, it can be found and put to use.
Keywords: knowledge, society, libraries, culture
1619 Analysis of the Role of Creative Tourism in Sustainable Tourism Development Case Study: Isfahan City
Authors: Saman Shafei
Abstract:
Tourism has grown for several reasons, with the main objective of producing economic benefits, including foreign exchange earnings, income generation, employment, rising government revenues, and contributions to the financing of tourism infrastructure that also serves public consumption. Although today the benefits of the tourism industry are not overlooked by anyone, the expansion and development of tourism services and products can make a destination competitive, and in this competition those who bring creativity and diversity are ahead of other competitors. Developing creative tourism as third-generation tourism can help to attract visitors, increase and diversify demand, reach new markets and boost growth. Creative tourism is a journey aimed at achieving a brand-new experience; it involves collaborative learning of the arts, cultural heritage, or specific features of a place, and provides meaningful contact with the inhabitants of the tourism destination, who are the creators of the living culture of that place. The present study aims to identify and introduce the capabilities of the city of Isfahan in Iran for the development of creative tourism and the role of creative tourism for the destination and the local community of this city. The research method is descriptive-analytical, based on fieldwork, with interviews and questionnaires applied to obtain the research findings. The results indicate that the city of Isfahan has the potential to develop creative tourism in the fields of traditional handicrafts and traditional foods, and that developing this kind of tourism will lead to the development of sustainable tourism in this destination and bring numerous benefits to the local community.
Keywords: creative tourism, tourism, Isfahan city, sustainable tourism development
1618 The Formulation of R&D Strategy for Biofuel Technology: A Case Study of the Aviation Industry in Iran
Authors: Maryam Amiri, Ali Rajabzade, Gholam Reza Goudarzi, Reza Heidari
Abstract:
Technology and the environment are changing fast; therefore, companies and industries have a strong tendency to engage in R&D activities in order to participate actively in the market and achieve a competitive advantage. The aviation industry and its subdivisions employ high-level technology and play a special role in the economic and social development of countries. Thus, to acquire new technologies and compete with the aviation industries of other countries, capability in R&D is required, and an appropriate R&D strategy supports the attainment of state-of-the-art technologies. Biofuel technology is one of the newest technologies to attract worldwide discussion in the aviation industry. The purpose of this research has been the formulation of an R&D strategy for biofuel technology in the aviation industry of Iran. After reviewing the theoretical foundations of R&D methods and strategies, we classified R&D strategies into four main categories: internal R&D, collaborative R&D, outsourcing R&D and in-house R&D. Following this review, a model for formulating an R&D strategy with the aim of developing biofuel technology in the aviation industry in Iran was offered. With regard to the requirements and characteristics of industry and technology in the model, we presented an integrated approach to R&D. Based on decision-making techniques and the analysis of structured expert opinion, four R&D strategies for different scenarios, aimed at developing biofuel technology in the aviation industry in Iran, were recommended. In this research, based on the common features of the R&D implementation process, a logical classification of these methods is presented as R&D strategies. R&D strategies and their characteristics were then developed according to the experts. In the end, we introduced a model to consider the role of the aviation industry and biofuel technology in R&D strategies. Lastly, for the conditions and various scenarios of the aviation industry, we formulated a specific R&D strategy.
Keywords: aviation industry, biofuel technology, R&D, R&D strategy
1617 Disease Level Assessment in Wheat Plots Using a Residual Deep Learning Algorithm
Authors: Felipe A. Guth, Shane Ward, Kevin McDonnell
Abstract:
The assessment of disease levels in crop fields is an important and time-consuming task that generally relies on the expert knowledge of trained individuals. Image classification for agricultural problems has historically been based on classical machine learning strategies that make use of hand-engineered features on top of a classification algorithm. This approach tends not to produce results with high accuracy and generalization when the elements being classified exhibit significant variability. The advent of deep convolutional neural networks has revolutionized the field of machine learning, especially in computer vision tasks. These networks have a great capacity for learning and have been applied successfully to image classification and object detection tasks in recent years. The objective of this work was to propose a new method based on deep convolutional neural networks for the task of disease level monitoring. Common RGB images of winter wheat were obtained during a growing season. Five categories of disease level were defined, in collaboration with agronomists, for the algorithm to classify. Disease level assessments performed by experts provided ground truth data for the disease scores of the same winter wheat plots where the RGB images were acquired. The system had an overall accuracy of 84% in discriminating the disease level classes.
Keywords: crop disease assessment, deep learning, precision agriculture, residual neural networks
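A minimal PyTorch sketch of a residual network for the five disease-level classes is given below; the depth and channel counts are illustrative assumptions, not the authors' architecture:

```python
import torch
import torch.nn as nn

class ResidualBlock(nn.Module):
    """Basic residual block: the skip connection adds the input back to
    the convolutional path, easing optimization of deep networks."""
    def __init__(self, channels):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv2d(channels, channels, 3, padding=1),
            nn.BatchNorm2d(channels), nn.ReLU(),
            nn.Conv2d(channels, channels, 3, padding=1),
            nn.BatchNorm2d(channels))

    def forward(self, x):
        return torch.relu(self.conv(x) + x)

# Five output classes, one per disease level defined with the agronomists.
net = nn.Sequential(
    nn.Conv2d(3, 32, 7, stride=2, padding=3), nn.ReLU(),
    ResidualBlock(32), ResidualBlock(32),
    nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(32, 5))
print(net(torch.randn(1, 3, 224, 224)).shape)  # torch.Size([1, 5])
```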
1616 Model-Driven and Data-Driven Approaches for Crop Yield Prediction: Analysis and Comparison
Authors: Xiangtuo Chen, Paul-Henry Cournéde
Abstract:
Crop yield prediction is a paramount issue in agriculture. The main idea of this paper is to find an efficient way to predict the yield of corn based on meteorological records. The prediction models used in this paper can be classified into model-driven approaches and data-driven approaches, according to their different modeling methodologies. The model-driven approaches are based on crop mechanistic modeling; they describe crop growth in interaction with the environment as dynamical systems. But the calibration of such a dynamical system presents much difficulty, because it turns out to be a multidimensional non-convex optimization problem. An original contribution of this paper is to propose a statistical methodology, Multi-Scenarios Parameters Estimation (MSPE), for the parametrization of potentially complex mechanistic models from a new type of dataset (climatic data, final yield in many situations). It is tested with CORNFLO, a crop model for maize growth. On the other hand, the data-driven approach to yield prediction is free of the complex biophysical process but has some strict requirements regarding the dataset. A second contribution of the paper is the comparison of these model-driven methods with classical data-driven methods. For this purpose, we consider two classes of regression methods: methods derived from linear regression (Ridge and Lasso Regression, Principal Components Regression or Partial Least Squares Regression) and machine learning methods (Random Forest, k-Nearest Neighbor, Artificial Neural Network and SVM regression). The dataset consists of 720 records of corn yield at county scale, provided by the United States Department of Agriculture (USDA), and the associated climatic data. A 5-fold cross-validation process and two accuracy metrics, root mean square error of prediction (RMSEP) and mean absolute error of prediction (MAEP), were used to evaluate the crop prediction capacity. The results show that among the data-driven approaches, Random Forest is the most robust and generally achieves the best prediction error (MAEP 4.27%). It also outperforms our model-driven approach (MAEP 6.11%). However, the method for calibrating the mechanistic model from easily accessible datasets offers several side perspectives: the mechanistic model can potentially help to underline the stresses suffered by the crop or to identify the biological parameters of interest for breeding purposes. For this reason, an interesting perspective is to combine these two types of approaches.
Keywords: crop yield prediction, crop model, sensitivity analysis, parameter estimation, particle swarm optimization, random forest
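The Random Forest evaluation described above can be sketched with scikit-learn as follows, using synthetic stand-in data and one plausible definition of MAEP as a percentage of mean yield:

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import KFold, cross_val_predict

# Stand-in for the USDA dataset: 720 county-scale records, each with a
# vector of aggregated climatic predictors and the observed corn yield.
rng = np.random.default_rng(0)
X = rng.normal(size=(720, 12))                  # climatic features (fake)
y = 100 + X[:, 0] * 5 + rng.normal(0, 3, 720)   # yield (synthetic)

pred = cross_val_predict(
    RandomForestRegressor(n_estimators=300, random_state=0),
    X, y, cv=KFold(5, shuffle=True, random_state=0))  # 5-fold CV

rmsep = np.sqrt(np.mean((y - pred) ** 2))
maep = np.mean(np.abs(y - pred)) / np.mean(y) * 100   # expressed in %
print(f"RMSEP = {rmsep:.2f}, MAEP = {maep:.2f}%")
```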
1615 The Survey Research and Evaluation of Green Residential Building Based on the Improved Group Analytical Hierarchy Process Method in Yinchuan
Abstract:
Due to the economic downturn and the deterioration of the living environment, the development of residential buildings, as high-energy-consuming buildings, is gradually changing from "extensive" construction to green building in China. The evaluation system for green building is continuously improving, but current evaluation work has the following problems: (1) there are differences between the cost of the actual investment and the purchasing power of residents, and the construction targets of green residential buildings are single-objective, lacking multi-objective performance development; (2) green building evaluation lacks regional characteristics and cannot reflect the demands of residents in different regions; (3) in the process of determining the criteria weights, the experts' judgment matrices often fail to meet the consistency requirement. Therefore, to solve these problems, questionnaires about green residential building were distributed in the Ningxia area, and the questionnaire results provide feedback on the purchasing power of residents and their acceptance of green building costs. Secondly, taking into account the geographical features of Ningxia minority areas, the evaluation criteria system for green residential building was constructed. Finally, using the improved group AHP method and the grey clustering method, the criteria weights were determined, and a real case located in the Xing Qing district, Ningxia, was evaluated. The conclusion is that the professional evaluation of this project and its good social recognition are basically in agreement.
Keywords: evaluation, green residential building, grey clustering method, group AHP
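The AHP weighting step can be sketched in numpy via the principal-eigenvector method with Saaty's consistency check; the pairwise-comparison matrix and criteria below are hypothetical:

```python
import numpy as np

# Hypothetical 3x3 pairwise-comparison matrix from one expert
# (criteria: energy saving, cost, indoor comfort).
A = np.array([[1.0, 3.0, 2.0],
              [1/3, 1.0, 1/2],
              [1/2, 2.0, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                                   # priority weights

n = A.shape[0]
ci = (eigvals.real[k] - n) / (n - 1)           # consistency index
ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]            # Saaty's random index
print(f"weights = {w.round(3)}, CR = {ci / ri:.3f}")  # CR < 0.1 => consistent
```

In a group AHP setting, inconsistent expert matrices (CR ≥ 0.1) would be revised or down-weighted before the individual priority vectors are aggregated.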
1614 Torn Between the Lines of Border: The Pakhtuns of Pakistan and Afghanistan in Search of Identity
Authors: Priyanka Dutta Chowdhury
Abstract:
A globalized, connected world, calling loudly for a composite culture, has still not been able to erase the pain of a desired nationalism based on cultural identity. In the South Asian region, the arbitrary drawing of boundaries without taking the ethnic aspect into consideration has always challenged the very basis of the existence of certain groups. The urge to reunify with fellow brothers on both sides of the border has repeatedly produced chaos and schism in the countries of this region. Sometimes this became a tool to bargain with the state and find a favorable position in the power structure on the basis of cultural identity. In Pakistan and Afghanistan, the Pakhtuns, who are divided across the border of the two countries, have posed various challenges from the inception of Pakistan and hampered the growth of a consolidated nation. The Pakhtuns or Pashtuns of both Pakistan and Afghanistan have a strong cultural affinity which blurs their physical separation and calls for a nationalism based on this ethnic affiliation. Both sides wanted to create Pakhtunistan, unifying all the Pakhtuns of the region. For long, this group has refused to accept the Durand Line separating the two countries. This was an area of concern especially for the Pakhtuns of Pakistan, torn between the decision either to join Afghanistan, create a nation of their own, or be a part of Pakistan. This ethnic issue became a bone of contention between the two countries. Later, though well absorbed and recognized in their respective countries, they have fought for their identity and claimed a dominant position in the politics of the nations. Because of the porous borders, an influx of refugees was often seen, especially during the Afghan wars, and many extremist groups were born from them, especially the Taliban. In the recent string of events, when the Taliban, who are mostly ethnically Pakhtun, came to power in Afghanistan, a wave of sympathy arose in Pakistan. This gave a strengthened position to the religious Pakhtuns across the border. It is to be noted here that a fragmentation of Pakhtun identity between the religious and the secular is clearly visible, each voicing its claim to a place in the political hierarchy of the country with a vision distinct from the other, especially in Pakistan. In this context, the paper tries to evaluate the reasons for this cultural turmoil between the countries and this ethnic group. It also aims to analyze how identity politics still holds its relevance in the contemporary world. Additionally, the recent trend of fragmented identity points towards the instrumentalization of this ethnic group, which is engaged in a bargaining process with the state for a robust position in the power structure. In the end, the paper aims to deduce from the theoretical conditions of identity politics whether this is a primordial or a situational tool for gaining visibility in the power structure of the contemporary world.
Keywords: cultural identity, identity politics, instrumentalization of identity, Pakhtuns, power structure
1613 A Study and Design of a Scarf Collection Applying Vietnamese Traditional Patterns by Using a Printing Method on Fabric
Authors: Mai Anh Pham Ho
Abstract:
Scarves today are a symbol of fashion, decorating us, making our lives more beautiful and bringing new features to our living spaces. They also express cultural identity through traditional patterns, which make it easy to introduce the image of Vietnam to other nations all over the world. Therefore, the purpose of this research is to classify Vietnamese traditional patterns according to era and dynasty. Vietnamese traditional patterns through the dynasties of Vietnamese history are classified into five groups: geometric patterns, natural patterns, animal patterns, floral patterns, and character patterns, spanning the prehistoric times, the Bronze and Iron Age, the Chinese domination, the Ngo-Dinh-TienLe-Ly-Tran-Ho dynasties, and the LeSo-Mac-LeTrinh-TaySon-Nguyen dynasties. Besides, there are some special kinds of Vietnamese traditional patterns, such as the buffalo, the lotus, the bronze drum, the Phuc Loc Tho characters, and so on. Extensive research was conducted on modernizing a scarf collection that applies Vietnamese traditional patterns, with current fashion trends used in creating the works. The concept, target, image map, lifestyle map, motifs, colours, and the arrangement and completion of patterns on the scarf were established. The scarf collection was designed and developed with the Adobe Illustrator program, with three colourways for each scarf. Upon completion of the research, digital printing technology was chosen for the scarf collection. Vietnamese traditional patterns were researched deeply and widely with the purpose of establishing a basic background for Vietnamese culture, in order to identify the Vietnamese national personality as well as to establish and preserve the cultural heritage.
Keywords: scarf collection, Vietnamese traditional patterns, printing methods, fabric design
1612 Planning and Urban Climate Change Adaptation: Italian Literature Review
Authors: Mara Balestrieri
Abstract:
Climate change has long been a focus of attention because of the growing impact of extreme weather events and global warming in many areas of the planet and the evidence of economic, social, and environmental damage caused by global warming. Nowadays, climate change is recognized as a critical global problem. Several initiatives have been undertaken over time to enrich the long theoretical debate and field experience in order to reduce CO2 emissions and contain climate alteration. However, the awareness that climate change is already taking place has led to a growing demand for adaptation. It is certainly a matter of anticipating the negative effects of climate change but, at the same time, of implementing appropriate actions to prevent climate-change-related damage, minimize the problems that may result, and also seize any opportunities that may arise. Consequently, adaptation has become a core element of climate policy and research. However, attention to this issue has not developed uniformly across countries; some countries are further ahead than others. This paper examines the literature on climate change adaptation developed in Italy up to 2018, considering the urban dimension, to provide a framework for it and to identify its main topics and features. The papers were selected from Scopus and analyzed through a matrix that we propose. The results demonstrate that climate change adaptation studies have attracted increasing attention from the Italian scientific community in recent years, although Italian scientific production is still quantitatively lower than that of other countries, and it describes strengths and weaknesses in line with the international panorama with respect to objectives, sectors, and problems.
Keywords: adaptation, bibliometric literature, climate change, urban studies
1611 DNA Barcoding Application in the Study of Ichthyo-Biodiversity in Rivers of Pakistan
Authors: Asma Karim
Abstract:
Fish taxonomy plays a fundamental role in the study of biodiversity. However, traditional methods of fish taxonomy rely on morphological features, which can lead to confusion due to great similarities between closely related species. To overcome this limitation, modern taxonomy employs DNA barcoding as a species identification method. This involves using a short, standardized mitochondrial DNA region as a barcode, specifically a 658 base pair fragment near the 5′ end of the mitochondrial cytochrome c oxidase subunit 1 (CO1) gene, exploiting the diversity in this region for the identification of species. To test the effectiveness and reliability of DNA barcoding, 25 fish specimens from nine fish species found in various rivers of Pakistan were identified morphologically using a dichotomous key at the start of the study. The nine freshwater fish species comprised Mystus cavasius, Mystus bleekeri, Osteobrama cotio, Labeo rohita, Labeo calbasu, Labeo gonius, Cyprinus carpio, Catla catla and Cirrhinus mrigala from different rivers of Pakistan. DNA was extracted from one of the pectoral fins, and a partial sequence of the CO1 gene was amplified using the conventional PCR method. Analysis of the barcodes confirmed that the genetically identified fishes were the same as those identified morphologically at the beginning of the study. The sequences were also analyzed for biodiversity and phylogenetic studies. Based on the results of the study, it can be concluded that DNA barcoding is an effective and reliable method for studying biodiversity and conducting phylogenetic analysis of different fish species in Pakistan.
Keywords: DNA barcoding, fresh water fishes, taxonomy, biodiversity, Pakistan
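The core comparison behind barcoding, sequence identity between CO1 fragments, can be sketched in a few lines of Python; the 20-bp sequences below are toy stand-ins for the ~658-bp barcodes:

```python
def percent_identity(seq1, seq2):
    """Naive identity between two aligned CO1 barcode fragments
    (assumes equal length and no indels -- a simplification)."""
    matches = sum(a == b for a, b in zip(seq1, seq2))
    return 100 * matches / len(seq1)

# Toy 20-bp stand-ins for the ~658-bp CO1 barcodes of two specimens.
specimen_1 = "ATGGCACTCCTACGAGCTGA"
specimen_2 = "ATGGCGCTCCTACGAGCCGA"
print(f"{percent_identity(specimen_1, specimen_2):.1f}% identical")
# High CO1 identity suggests conspecificity; divergence beyond the
# "barcoding gap" separates species.
```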
1610 Characteristics and Flight Test Analysis of a Fixed-Wing UAV with Hover Capability
Authors: Ferit Çakıcı, M. Kemal Leblebicioğlu
Abstract:
In this study, the characteristics and flight test analysis of a fixed-wing unmanned aerial vehicle (UAV) with hover capability are presented. The base platform is a conventional airplane with throttle, aileron, elevator and rudder control surfaces, which inherently allows level flight. This aircraft is then mechanically modified by the integration of vertical propellers, as in multirotors, to provide hover capability. The aircraft is modeled using basic aerodynamic principles, and linear models are constructed utilizing small perturbation theory about trim conditions. Flight characteristics are analyzed using the state-space approach of linear control theory. Distinctive features of the aircraft are discussed on the basis of the analysis results, in comparison with conventional aircraft platform types. A hybrid control system is proposed in order to exploit these unique flight characteristics. The main approach includes the design of different controllers for different modes of operation and a hand-over logic that makes flight in an enlarged flight envelope viable. Simulation tests performed on the mathematical models verify the asserted algorithms. Flight tests conducted in the real world revealed the applicability of the proposed methods in exploiting the fixed-wing and rotary-wing characteristics of the aircraft, which provide agility, survivability and functionality.
Keywords: flight test, flight characteristics, hybrid aircraft, unmanned aerial vehicle
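A linearized model of the kind described can be expressed as a state-space system; the sketch below uses scipy with illustrative (not flight-identified) short-period longitudinal matrices:

```python
import numpy as np
from scipy.signal import StateSpace, step

# Illustrative small-perturbation short-period model about a trim point:
# states [alpha (angle of attack), q (pitch rate)], input: elevator
# deflection, output: alpha. The numbers are hypothetical.
A = np.array([[-1.2, 1.0],
              [-5.0, -2.5]])
B = np.array([[0.0], [-12.0]])
C = np.array([[1.0, 0.0]])
D = np.array([[0.0]])

sys = StateSpace(A, B, C, D)
t, y = step(sys)          # response to a unit elevator step
print(y[-1])              # steady-state alpha perturbation
```

A hover-mode model would use different states and effectors (vertical propeller thrusts), and the hand-over logic would switch controllers between the two linearized plants.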
Procedia PDF Downloads 327
1609 Structure Clustering for Milestoning Applications of Complex Conformational Transitions
Authors: Amani Tahat, Serdal Kirmizialtin
Abstract:
Trajectory fragment methods such as Markov State Models (MSM), Milestoning (MS), and Transition Path Sampling are the prime choices for extending the timescale of all-atom Molecular Dynamics (MD) simulations. In these approaches, a set of structures that covers the accessible phase space has to be chosen a priori using cluster analysis. Structural clustering serves to partition the conformational state space into natural subgroups based on similarity, an essential statistical methodology for analyzing the large sets of empirical data produced by MD simulations. The local transition kernel among these clusters is later used to connect the metastable states, via a Markovian kinetic model in MSM and a non-Markovian model in MS. The choice of clustering approach in constructing this kernel is crucial, since the high dimensionality of biomolecular structures can easily confound the identification of clusters when traditional hierarchical clustering is used. Of particular interest, in the case of MS, where the milestones are very close to each other, accurate determination of the milestone identity of a trajectory becomes a challenging issue. In this work, we present two cluster analysis methods applied to the cis–trans isomerism of the dinucleotide AA. The choice of nucleic acids over the more commonly studied proteins is twofold: (i) the energy landscape is rugged, so the transitions are more complex, enabling a more realistic model of conformational transitions; (ii) the conformational space of nucleic acids is high dimensional, and a diverse set of internal coordinates is necessary to describe their metastable states, posing a challenge for studying conformational transitions. We therefore need improved clustering methods that identify the AA structure in its metastable states robustly over a wide range of noisy data conditions. The single-linkage hierarchical clustering available in the GROMACS MD package is the first clustering methodology applied to our data; a Self-Organizing Map (SOM) neural network, also known as a Kohonen network, is the second. The performance of the neural network and of the hierarchical clustering method is compared by computing the mean first passage times for the cis–trans conformational rates. Our hope is that this study provides insight into the complexities of, and the need for, determining the appropriate clustering algorithm for kinetic analysis. Our results can improve the effectiveness of decisions based on clustering noisy empirical data when studying conformational transitions in biomolecules.
Keywords: milestoning, self organizing map, single linkage, structure clustering
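The two clustering routes can be sketched as follows on a generic feature matrix X (frames by internal coordinates); the synthetic data, cluster count, and map size are placeholders, and the third-party minisom package stands in for whatever SOM implementation was actually used.

```python
import numpy as np
from scipy.spatial.distance import pdist
from scipy.cluster.hierarchy import linkage, fcluster
from minisom import MiniSom   # third-party Kohonen-network implementation

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 12))           # stand-in internal coordinates

# Route 1: single-linkage hierarchical clustering (as in the GROMACS tool)
Z = linkage(pdist(X), method="single")
labels_hier = fcluster(Z, t=8, criterion="maxclust")

# Route 2: a self-organizing map; each frame is assigned to its winning node
som = MiniSom(4, 2, X.shape[1], sigma=1.0, learning_rate=0.5, random_seed=0)
som.train_random(X, 5000)
labels_som = [som.winner(x) for x in X]
```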
Procedia PDF Downloads 222
1608 Technology Road Mapping in the Fourth Industrial Revolution: A Comprehensive Analysis and Strategic Framework
Authors: Abdul Rahman Hamdan
Abstract:
The Fourth Industrial Revolution (4IR) has brought unprecedented technological advancements that have disrupted many industries worldwide. To keep up with this rapid disruption, technology road mapping has emerged as one of the critical tools for organizations to leverage: it can guide companies to become more adaptable, to anticipate future transformation and innovation, and to avoid becoming redundant or irrelevant amid rapid technological change. This research paper provides a comprehensive analysis of technology road mapping within the context of the 4IR. Its objectives are to provide companies with practical insights and a strategic framework for technology road mapping so that they can navigate the fast-changing nature of the 4IR. The study also contributes to the understanding and practice of technology road mapping in the 4IR and, at the same time, provides organizations with the necessary tools and critical insight to navigate the 4IR transformation. Based on a literature review and case studies, the study analyzes key principles, methodologies, and best practices in technology road mapping and integrates them with the unique characteristics and challenges of the 4IR. The paper gives the background of the Fourth Industrial Revolution, explores the disruptive potential of 4IR technologies, and highlights the critical need for the strategic planning and foresight that road mapping provides in order to remain competitive and relevant in the 4IR era. It also highlights the importance of technology road mapping as a proactive approach to aligning an organization's objectives and resources with its technology and product development in a fast-evolving technological landscape. The paper covers the theoretical foundations of technology road mapping, examines various methodological approaches, and identifies the participants in the process, such as external experts, stakeholders, collaborative platforms, and cross-functional teams, needed to ensure an integrated and robust technology roadmap for the organization. Moreover, the study presents a comprehensive framework for technology road mapping in the 4IR, incorporating key elements and processes such as technology assessment, competitive intelligence, risk analysis, and resource allocation, and covering implementation from strategic planning, goal setting, and technology scanning to roadmap visualization, implementation planning, monitoring, and evaluation. In addition, the study addresses the challenges and limitations of technology road mapping in the 4IR, including gap analysis. The study concludes by proposing a set of practical recommendations for organizations that intend to leverage technology road mapping as a strategic tool for driving innovation and remaining competitive in the current and future ecosystem.
Keywords: technology management, technology road mapping, technology transfer, technology planning
Procedia PDF Downloads 67
1607 Decision-Making Under Uncertainty in Obsessive-Compulsive Disorder
Authors: Helen Pushkarskaya, David Tolin, Lital Ruderman, Ariel Kirshenbaum, J. MacLaren Kelly, Christopher Pittenger, Ifat Levy
Abstract:
Obsessive-Compulsive Disorder (OCD) produces profound morbidity. Difficulties with decision-making and intolerance of uncertainty are prominent clinical features of OCD. The nature and etiology of these deficits are poorly understood. We used a well-validated choice task, grounded in behavioral economic theory, to investigate differences in valuation and value-based choice during decision-making under uncertainty in 20 unmedicated participants with OCD and 20 matched healthy controls. Participants’ choices were used to assess individual decision-making characteristics. Compared to controls, individuals with OCD were less consistent in their choices and less able to identify options that were unambiguously preferable. These differences correlated with symptom severity. OCD participants did not differ from controls in how they valued uncertain options when outcome probabilities were known (risk) but were more likely than controls to avoid uncertain options when these probabilities were imprecisely specified (ambiguity). These results suggest that the underlying neural mechanisms of valuation and value-based choice during decision-making are abnormal in OCD. Individuals with OCD show elevated intolerance of uncertainty, but only when outcome probabilities are themselves uncertain. Future research focused on the neural valuation network, which is implicated in value-based computations, may provide new neurocognitive insights into the pathophysiology of OCD. Deficits in decision-making processes may represent a target for therapeutic intervention.
Keywords: obsessive compulsive disorder, decision-making, uncertainty intolerance, risk aversion, ambiguity aversion, valuation
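One common way to formalize the separate risk and ambiguity attitudes measured by such tasks is a subjective-value model with a risk parameter and an ambiguity parameter, fitted to choices via a softmax rule. The sketch below uses one parameterization from the behavioral-economics literature; it is illustrative and not necessarily the exact model fitted in this study, and the trial data are simulated.

```python
import numpy as np
from scipy.optimize import minimize

def subjective_value(v, p, A, alpha, beta):
    """Value v won with probability p; A in [0, 1] is the ambiguity level
    (how imprecisely p is specified); beta > 0 indicates ambiguity aversion."""
    return (p - beta * A / 2.0) * v ** alpha

def neg_log_likelihood(params, trials, choices):
    alpha, beta, temp = params
    sv_lottery = subjective_value(trials[:, 0], trials[:, 1], trials[:, 2], alpha, beta)
    sv_sure = trials[:, 3] ** alpha        # subjective value of the sure amount
    p_lottery = 1.0 / (1.0 + np.exp(-temp * (sv_lottery - sv_sure)))
    p = np.where(choices == 1, p_lottery, 1.0 - p_lottery)
    return -np.sum(np.log(np.clip(p, 1e-9, 1.0)))

# Columns: lottery value, win probability, ambiguity level, sure amount (simulated)
trials = np.array([[50, 0.5, 0.0, 20], [50, 0.5, 0.74, 20], [50, 0.25, 0.0, 10]] * 40, float)
choices = (np.random.default_rng(0).random(len(trials)) < 0.5).astype(int)
fit = minimize(neg_log_likelihood, x0=[0.8, 0.3, 1.0], args=(trials, choices),
               bounds=[(0.2, 2.0), (-1.0, 1.0), (0.01, 10.0)])
print("alpha (risk), beta (ambiguity), temperature:", fit.x)
```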
Procedia PDF Downloads 614
1606 A Large Ion Collider Experiment (ALICE) Diffractive Detector Control System for RUN-II at the Large Hadron Collider
Authors: J. C. Cabanillas-Noris, M. I. Martínez-Hernández, I. León-Monzón
Abstract:
The selection of diffractive events in the ALICE experiment during the first data-taking period (RUN-I) of the Large Hadron Collider (LHC) was limited by the range over which rapidity gaps could be observed. Better measurements would be possible by expanding the range in which the production of particles can be detected. For this purpose, the ALICE Diffractive (AD0) detector has been installed and commissioned for the second phase (RUN-II). Any new detector should be able to take data synchronously with all other detectors and be operated through the ALICE central systems. One of the key elements that must be developed for the AD0 detector is the Detector Control System (DCS). The DCS must be designed to operate the detector safely and correctly. Furthermore, it must provide optimum operating conditions for the acquisition and storage of physics data and ensure that these are of the highest quality. The operation of AD0 involves the configuration of about 200 parameters, from electronics settings and power supply levels to the archiving of operating-condition data and the generation of safety alerts. It also includes the automation of procedures to get the AD0 detector ready for data taking under the appropriate conditions for the different run types in ALICE. The performance of the AD0 detector depends on a number of parameters, such as the nominal voltage of each photomultiplier tube (PMT), the threshold levels for accepting or rejecting incoming pulses, the definition of triggers, etc. All these parameters define the efficiency of AD0 and have to be monitored and controlled through the AD0 DCS. Finally, the AD0 DCS provides the operator with multiple interfaces to execute these tasks, realized as operating panels and scripts running in the background. These features are implemented on a SCADA software platform as a distributed control system integrated into the global control system of the ALICE experiment.
Keywords: AD0, ALICE, DCS, LHC
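The flavor of the monitoring and alert-generation tasks can be conveyed with a small stand-in: nominal settings per PMT channel, periodic readback, and a safety alert on deviation. The real AD0 DCS runs on a commercial SCADA platform within the ALICE control framework; the channel names, voltages, and limits below are entirely hypothetical.

```python
from dataclasses import dataclass
import random

@dataclass
class Channel:
    name: str
    nominal_hv: float     # nominal PMT high voltage [V]
    hv_tolerance: float   # allowed deviation before an alert [V]

CHANNELS = [Channel(f"AD0/PMT{i:02d}", 1500.0, 25.0) for i in range(8)]

def read_hv(ch: Channel) -> float:
    """Placeholder for the hardware readback."""
    return ch.nominal_hv + random.gauss(0.0, 12.0)

def monitor_once():
    for ch in CHANNELS:
        hv = read_hv(ch)
        if abs(hv - ch.nominal_hv) > ch.hv_tolerance:
            print(f"ALERT {ch.name}: HV {hv:.1f} V outside "
                  f"{ch.nominal_hv} +/- {ch.hv_tolerance} V")

for _ in range(3):   # a few monitoring cycles for demonstration
    monitor_once()
```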
Procedia PDF Downloads 304