Search results for: laser line detection
432 Motivational Profiles of the Entrepreneurial Career in Spanish Businessmen
Authors: Magdalena Suárez-Ortega, M. Fe. Sánchez-García
Abstract:
This paper focuses on the analysis of the motivations that lead people to undertake and consolidate their business. It is addressed from the framework of planned behavior theory, which recognizes the importance of the social environment and cultural values, both in the decision to undertake business and in business consolidation. Similarly, it is also based on theories of career development, which emphasize the importance of career management competencies and their connections to other vital aspects of people, including their roles within their families and other personal activities. This connects directly with the impact of entrepreneurship on the career and the professional-personal project of each individual. This study is part of the project titled Career Design and Talent Management (Ministry of Economy and Competitiveness of Spain, State Plan 2013-2016 Excellence Ref. EDU2013-45704-P). The aim of the study is to identify and describe entrepreneurial competencies and motivational profiles in a sample of 248 Spanish entrepreneurs, considering the consolidated profile and the profile in transition (n = 248). In order to obtain the information, the Questionnaire of Motivation and Conditioners of the Entrepreneurial Career (MCEC) was applied. It consists of 67 items and includes four scales (E1-Conflicts in conciliation, E2-Satisfaction in the career path, E3-Motivations to undertake, E4-Guidance Needs). Cluster analysis (a mixed method combining k-means clustering with a hierarchical method) was carried out, characterizing the group profiles according to the categorical variables (chi-square, p = 0.05) and the quantitative variables (ANOVA). The results have allowed us to characterize three motivational profiles according to motivation, the degree of conciliation between personal and professional life, the degree of conflict in conciliation, levels of career satisfaction, and orientation needs (in the entrepreneurial project and life-career). The first profile is formed by extrinsically motivated entrepreneurs who are professionally satisfied and without conflict between vital roles. The second profile acts with intrinsic motivation, also associated with family models, and although it shows satisfaction with the professional career, it experiences high conflict between family and professional life. The third is composed of entrepreneurs with high extrinsic motivation and professional dissatisfaction who, at the same time, feel conflict in their professional life due to the effect of personal roles. Ultimately, the analysis has allowed us to link the kinds of entrepreneurs to different levels of motivation, satisfaction, needs, and articulation of professional and personal life, showing characterizations associated with the use of time for leisure and the care of the family. Associations related to gender, age, activity sector, environment (rural, urban, virtual), and the use of time for domestic tasks were not identified. The model obtained and its implications for the design of training and guidance actions for entrepreneurs are also discussed. Keywords: motivation, entrepreneurial career, guidance needs, life-work balance, job satisfaction, assessment
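For readers who want to reproduce this type of analysis, a minimal Python sketch of a mixed hierarchical/k-means clustering followed by ANOVA profiling of the quantitative scales is given below. It is an illustration under assumed file and column names, not the authors' code.

```python
# Illustrative sketch (not the authors' code): hierarchical (Ward) clustering suggests k,
# k-means refines the solution, and ANOVA profiles the quantitative scales per cluster.
# The CSV file and the scale column names are hypothetical.
import pandas as pd
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.stats import f_oneway
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

df = pd.read_csv("mcec_scores.csv")          # hypothetical table with the four MCEC scales
scales = ["E1_conflict", "E2_satisfaction", "E3_motivation", "E4_guidance_needs"]
X = StandardScaler().fit_transform(df[scales])

# Step 1: hierarchical clustering to choose a plausible k (here read off the dendrogram as 3)
Z = linkage(X, method="ward")
df["hc_cluster"] = fcluster(Z, t=3, criterion="maxclust")

# Step 2: k-means with k fixed by the hierarchical solution
df["cluster"] = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)

# Step 3: ANOVA to test whether each quantitative scale differs across the three profiles
for scale in scales:
    groups = [g[scale].values for _, g in df.groupby("cluster")]
    f_stat, p_value = f_oneway(*groups)
    print(f"{scale}: F = {f_stat:.2f}, p = {p_value:.4f}")
```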
Procedia PDF Downloads 303
431 Anti-Graft Instruments and Their Role in Curbing Corruption: Integrity Pact and Its Impact on Indian Procurement
Authors: Jot Prakash Kaur
Abstract:
The paper aims to showcase that with the introduction of anti-graft instruments and willingness of the governments towards their implementation, a significant change can be witnessed in the anti-corruption landscape of any country. Since the past decade anti-graft instruments have been introduced by several international non-governmental organizations with the vision of curbing corruption. Transparency International’s ‘Integrity Pact’ has been one such initiative. Integrity Pact has been described as a tool for preventing corruption in public contracting. Integrity Pact has found its relevance in a developing country like India where public procurement constitutes 25-30 percent of Gross Domestic Product. Corruption in public procurement has been a cause of concern even though India has in place a whole architecture of rules and regulations governing public procurement. Integrity Pact was first adopted by a leading Oil and Gas government company in 2006. Till May 2015, over ninety organizations had adopted Integrity Pact, of which majority of them are central government units. The methodology undertaken to understand impact of Integrity Pact on Public procurement is through analyzing information received from important stakeholders of the instrument. Government, information was sought through Right to Information Act 2005 about the details of adoption of this instrument by various government organizations and departments. Contractor, Company websites and annual reports were used to find out the steps taken towards implementation of Integrity Pact. Civil Society, Transparency International India’s resource materials which include publications and reports on Integrity Pact were also used to understand the impact of Integrity Pact. Some of the findings of the study include organizations adopting Integrity pacts in all kinds of contracts such that 90% of their procurements fall under Integrity Pact. Indian State governments have found merit in Integrity Pact and have adopted it in their procurement contracts. Integrity Pact has been instrumental in creating a brand image of companies. External Monitors, an essential feature of Integrity Pact have emerged as arbitrators for the bidders and are the first line of procurement auditors for the organizations. India has cancelled two defense contracts finding it conflicting with the provisions of Integrity Pact. Some of the clauses of Integrity Pact have been included in the proposed Public Procurement legislation. Integrity Pact has slowly but steadily grown to become an integral part of big ticket procurement in India. Government’s commitment to implement Integrity Pact has changed the way in which public procurement is conducted in India. Public Procurement was a segment infested with corruption but with the adoption of Integrity Pact a number of clean up acts have been performed to make procurement transparent. The paper is divided in five sections. First section elaborates on Integrity Pact. Second section talks about stakeholders of the instrument and the role it plays in its implementation. Third section talks about the efforts taken by the government to implement Integrity Pact in India. Fourth section talks about the role of External Monitor as Arbitrator. The final section puts forth suggestions to strengthen the existing form of Integrity Pact and increase its reach.Keywords: corruption, integrity pact, procurement, vigilance
Procedia PDF Downloads 342
430 Effects of Soil Neutron Irradiation in Soil Carbon Neutron Gamma Analysis
Authors: Aleksandr Kavetskiy, Galina Yakubova, Nikolay Sargsyan, Stephen A. Prior, H. Allen Torbert
Abstract:
The carbon sequestration question of modern times requires the development of an in-situ method of measuring soil carbon over large landmasses. Traditional chemical analytical methods used to evaluate large land areas require extensive soil sampling prior to processing for laboratory analysis; collectively, this is labor-intensive and time-consuming. An alternative method is to apply nuclear physics analysis, primarily in the form of pulsed fast-thermal neutron-gamma soil carbon analysis. This method is based on measuring the gamma-ray response that appears upon neutron irradiation of soil. The specific gamma line with an energy of 4.438 MeV appearing upon neutron irradiation can be attributed to soil carbon nuclei. Based on measuring gamma line intensity, assessments of soil carbon concentration can be made. This method can be applied directly in the field using a specially developed pulsed fast-thermal neutron-gamma system (PFTNA system). This system conducts in-situ analysis in a scanning mode coupled with GPS, which provides soil carbon concentration and distribution over large fields. The system has radiation shielding to minimize the dose rate (within radiation safety guidelines) for safe operator usage. Questions concerning the effect of neutron irradiation on soil health will be addressed. Information regarding the absorbed neutron and gamma dose received by soil and its distribution with depth will be discussed in this study. This information was generated based on Monte-Carlo simulations (MCNP6.2 code) of neutron and gamma propagation in soil. The resulting data were used for the analysis of possible induced irradiation effects. The physical, chemical and biological effects of neutron soil irradiation were considered. From a physical aspect, we considered the induction of new isotopes by the neutrons produced by the PFTNA system and estimated the possibility of an increased post-irradiation gamma background by comparison to the natural background. An insignificant increase in gamma background appeared immediately after irradiation but returned to original values after several minutes due to the decay of short-lived new isotopes. From a chemical aspect, possible radiolysis of water (present in soil) was considered. Based on simulations of water radiolysis, we concluded that the dose rate used cannot produce radiolysis products at notable rates. Possible effects of neutron irradiation (by the PFTNA system) on soil biota were also assessed experimentally. No notable changes were noted at the taxonomic level, nor was functional soil diversity affected. Our assessment suggested that the use of a PFTNA system with a neutron flux of 1e7 n/s for soil carbon analysis does not notably affect soil properties or soil health. Keywords: carbon sequestration, neutron gamma analysis, radiation effect on soil, Monte-Carlo simulation
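The core quantification step, relating the net intensity of the 4.438 MeV carbon gamma line to soil carbon concentration, can be illustrated with a simple linear calibration. The sketch below is a simplified, assumption-laden example, not the PFTNA system's actual processing, and all numbers are invented.

```python
# Simplified illustration (not the PFTNA system's processing): converting the net intensity
# of the 4.438 MeV carbon gamma line into a carbon concentration via a linear calibration
# built from reference samples. All values below are made up for the example.
import numpy as np
from scipy.stats import linregress

# Hypothetical calibration data: net peak count rate vs. known carbon content (% by weight)
peak_counts = np.array([120.0, 210.0, 305.0, 398.0, 490.0])   # counts per second
carbon_pct  = np.array([0.5, 1.0, 1.5, 2.0, 2.5])             # % carbon from lab analysis

cal = linregress(peak_counts, carbon_pct)
print(f"calibration: C[%] = {cal.slope:.4f} * counts + {cal.intercept:.4f} "
      f"(r^2 = {cal.rvalue**2:.3f})")

# Apply the calibration to a new field measurement
new_counts = 350.0
print(f"estimated soil carbon: {cal.slope * new_counts + cal.intercept:.2f} %")
```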
Procedia PDF Downloads 145
429 Control of Belts for Classification of Geometric Figures by Artificial Vision
Authors: Juan Sebastian Huertas Piedrahita, Jaime Arturo Lopez Duque, Eduardo Luis Perez Londoño, Julián S. Rodríguez
Abstract:
The process of generating computer vision is called artificial vision. Artificial vision is a branch of artificial intelligence that allows the acquisition, processing, and analysis of any type of information, especially information obtained through digital images. Artificial vision is currently used in manufacturing for quality control and production, as these processes can be realized through algorithms for counting, positioning, and recognition of objects measured by one or more cameras. On the other hand, companies use assembly lines formed by conveyor systems with actuators on them for moving pieces from one location to another during production. These devices must be programmed in advance for good performance and must have a programmed logic routine. Nowadays, the main targets of every industry are production, quality, and the fast completion of the different stages and processes in the production chain of any product or service being offered. The principal goal of this project is to program a computer that recognizes geometric figures (circle, square, and triangle) through a camera, each one with a different color, and to link it with a group of conveyor systems that organize the mentioned figures into cubicles, which also differ from one another by color. Since this project is based on artificial vision, the methodology needed to develop it must be strict; it is detailed below. 1. Methodology: 1.1 The software used in this project is Qt Creator, which is linked with the OpenCV libraries. Together, these tools are used to build the program that identifies colors and shapes directly from the camera on the computer. 1.2 Image acquisition: to start using the OpenCV libraries, it is necessary to acquire images, which can be captured by a computer's web camera or by a dedicated camera. 1.3 The recognition of RGB colors is realized in code by traversing the matrices of the captured images and comparing pixels, identifying the primary colors red, green, and blue. 1.4 To detect shapes, it is necessary to segment the images: the first step is converting the image from RGB to grayscale in order to work with the dark tones of the image; then the image is binarized, which means representing the figure in white on a black background; finally, the contours of the figure are found and the number of edges is counted to identify which figure it is. 1.5 After the color and figure have been identified, the program communicates with the conveyor systems, which, through the actuators, classify the figures into their respective cubicles. Conclusions: The OpenCV library is a useful tool for projects in which an interface between a computer and the environment is required, since the camera captures external characteristics for further processing. With the program developed in this project, any type of assembly line can be optimized, because images from the environment can be obtained and the process becomes more accurate. Keywords: artificial intelligence, artificial vision, binarized, grayscale, images, RGB
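A minimal Python/OpenCV sketch of the shape-detection steps described in 1.4 (grayscale conversion, binarization, contour extraction and edge counting) is shown below. The thresholds, minimum area and camera index are assumptions, not values from the project.

```python
# Minimal sketch of the described pipeline (assuming OpenCV is installed): capture a frame,
# convert to grayscale, binarize, find contours and count approximated edges to label the shape.
import cv2

cap = cv2.VideoCapture(0)                      # computer's web camera (assumed index 0)
ok, frame = cap.read()
cap.release()
if not ok:
    raise RuntimeError("could not read a frame from the camera")

gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)                 # color -> grayscale
_, binary = cv2.threshold(gray, 0, 255,
                          cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)  # binarize (Otsu)
contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)

for contour in contours:
    if cv2.contourArea(contour) < 500:          # ignore small noise blobs (assumed cutoff)
        continue
    approx = cv2.approxPolyDP(contour, 0.04 * cv2.arcLength(contour, True), True)
    edges = len(approx)
    if edges == 3:
        label = "triangle"
    elif edges == 4:
        label = "square"
    else:
        label = "circle"                        # many edges -> treated as a circle
    print(label, edges)
```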
Procedia PDF Downloads 380
428 Citation Analysis of New Zealand Court Decisions
Authors: Tobias Milz, L. Macpherson, Varvara Vetrova
Abstract:
The law is a fundamental pillar of human societies as it shapes, controls and governs how humans conduct business, behave and interact with each other. Recent advances in computer-assisted technologies such as NLP, data science and AI are creating opportunities to support the practice, research and study of this pervasive domain. It is therefore not surprising that there has been an increase in investments into supporting technologies for the legal industry (also known as “legal tech” or “law tech”) over the last decade. A sub-discipline of particular appeal is concerned with assisted legal research. Supporting law researchers and practitioners to retrieve information from the vast amount of ever-growing legal documentation is of natural interest to the legal research community. One tool that has been in use for this purpose since the early nineteenth century is legal citation indexing. Among other use cases, they provided an effective means to discover new precedent cases. Nowadays, computer-assisted network analysis tools can allow for new and more efficient ways to reveal the “hidden” information that is conveyed through citation behavior. Unfortunately, access to openly available legal data is still lacking in New Zealand and access to such networks is only commercially available via providers such as LexisNexis. Consequently, there is a need to create, analyze and provide a legal citation network with sufficient data to support legal research tasks. This paper describes the development and analysis of a legal citation Network for New Zealand containing over 300.000 decisions from 125 different courts of all areas of law and jurisdiction. Using python, the authors assembled web crawlers, scrapers and an OCR pipeline to collect and convert court decisions from openly available sources such as NZLII into uniform and machine-readable text. This facilitated the use of regular expressions to identify references to other court decisions from within the decision text. The data was then imported into a graph-based database (Neo4j) with the courts and their respective cases represented as nodes and the extracted citations as links. Furthermore, additional links between courts of connected cases were added to indicate an indirect citation between the courts. Neo4j, as a graph-based database, allows efficient querying and use of network algorithms such as PageRank to reveal the most influential/most cited courts and court decisions over time. This paper shows that the in-degree distribution of the New Zealand legal citation network resembles a power-law distribution, which indicates a possible scale-free behavior of the network. This is in line with findings of the respective citation networks of the U.S. Supreme Court, Austria and Germany. The authors of this paper provide the database as an openly available data source to support further legal research. The decision texts can be exported from the database to be used for NLP-related legal research, while the network can be used for in-depth analysis. For example, users of the database can specify the network algorithms and metrics to only include specific courts to filter the results to the area of law of interest.Keywords: case citation network, citation analysis, network analysis, Neo4j
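A compact illustration of the citation-extraction and ranking idea is sketched below in Python. It uses a hypothetical neutral-citation pattern and networkx PageRank rather than the authors' Neo4j setup, and the directory layout is an assumption.

```python
# Illustrative sketch (not the authors' pipeline): extract case citations from decision text
# with a regular expression and rank decisions with PageRank.
import re
from pathlib import Path
import networkx as nx

# Hypothetical neutral-citation pattern such as "[2015] NZHC 123"
CITATION_RE = re.compile(r"\[\d{4}\]\s+NZ[A-Z]{2,4}\s+\d+")

graph = nx.DiGraph()
for path in Path("decisions").glob("*.txt"):     # one plain-text decision per file (assumed)
    citing = path.stem
    text = path.read_text(encoding="utf-8", errors="ignore")
    for cited in set(CITATION_RE.findall(text)):
        graph.add_edge(citing, cited)            # edge: citing decision -> cited decision

ranks = nx.pagerank(graph, alpha=0.85)           # most "influential" decisions get high scores
for case, score in sorted(ranks.items(), key=lambda kv: kv[1], reverse=True)[:10]:
    print(f"{score:.5f}  {case}")
```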
Procedia PDF Downloads 110
427 Identification of Damage Mechanisms in Interlock Reinforced Composites Using a Pattern Recognition Approach of Acoustic Emission Data
Authors: M. Kharrat, G. Moreau, Z. Aboura
Abstract:
The latest advances in the weaving industry, combined with increasingly sophisticated means of materials processing, have made it possible to produce complex 3D composite structures. Mainly used in aeronautics, composite materials with 3D architecture offer better mechanical properties than 2D reinforced composites. Nevertheless, these materials require a good understanding of their behavior. Because of the complexity of such materials, the damage mechanisms are multiple, and the scenario of their appearance and evolution depends on the nature of the exerted solicitations. The AE technique is a well-established tool for discriminating between the damage mechanisms. Suitable sensors are used during the mechanical test to monitor the structural health of the material. Relevant AE-features are then extracted from the recorded signals, followed by a data analysis using pattern recognition techniques. In order to better understand the damage scenarios of interlock composite materials, a multi-instrumentation was set-up in this work for tracking damage initiation and development, especially in the vicinity of the first significant damage, called macro-damage. The deployed instrumentation includes video-microscopy, Digital Image Correlation, Acoustic Emission (AE) and micro-tomography. In this study, a multi-variable AE data analysis approach was developed for the discrimination between the different signal classes representing the different emission sources during testing. An unsupervised classification technique was adopted to perform AE data clustering without a priori knowledge. The multi-instrumentation and the clustered data served to label the different signal families and to build a learning database. This latter is useful to construct a supervised classifier that can be used for automatic recognition of the AE signals. Several materials with different ingredients were tested under various solicitations in order to feed and enrich the learning database. The methodology presented in this work was useful to refine the damage threshold for the new generation materials. The damage mechanisms around this threshold were highlighted. The obtained signal classes were assigned to the different mechanisms. The isolation of a 'noise' class makes it possible to discriminate between the signals emitted by damages without resorting to spatial filtering or increasing the AE detection threshold. The approach was validated on different material configurations. For the same material and the same type of solicitation, the identified classes are reproducible and little disturbed. The supervised classifier constructed based on the learning database was able to predict the labels of the classified signals.Keywords: acoustic emission, classifier, damage mechanisms, first damage threshold, interlock composite materials, pattern recognition
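The two-stage analysis described above (unsupervised clustering of AE features, then a supervised classifier trained on the clustered learning database) can be sketched as follows. The feature names, cluster count and classifier choice are assumptions, not the study's settings.

```python
# Hedged sketch of the two-stage approach: k-means clustering of AE descriptors to form
# signal classes, then a supervised classifier trained on the labelled (clustered) data.
import pandas as pd
from sklearn.cluster import KMeans
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import classification_report
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler

features = ["amplitude", "duration", "rise_time", "counts", "energy", "peak_frequency"]
df = pd.read_csv("ae_features.csv")              # hypothetical table of AE hits
X = StandardScaler().fit_transform(df[features])

# Stage 1: unsupervised clustering without a priori labels
df["cluster"] = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(X)

# Stage 2: supervised classifier built from the clustered (labelled) learning database
X_train, X_test, y_train, y_test = train_test_split(
    X, df["cluster"], test_size=0.3, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_train, y_train)
print(classification_report(y_test, clf.predict(X_test)))
```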
Procedia PDF Downloads 156
426 Platelet Volume Indices: Emerging Markers of Diabetic Thrombocytopathy
Authors: Mitakshara Sharma, S. K. Nema
Abstract:
Diabetes mellitus (DM) is metabolic disorder prevalent in pandemic proportions, incurring significant morbidity and mortality due to associated vascular angiopathies. Platelet related thrombogenesis plays key role in pathogenesis of these complications. Most patients with type II DM suffer from preventable vascular complications and early diagnosis can help manage these successfully. These complications are attributed to platelet activation which can be recognised by the increase in Platelet Volume Indices(PVI) viz. Mean Platelet Volume(MPV) and Platelet Distribution Width(PDW). This study was undertaken with the aim of finding a relationship between PVI and vascular complications of Diabetes mellitus, their importance as a causal factor in these complications and use as markers for early detection of impending vascular complications in patients with poor glycaemic status. This is a cross-sectional study conducted for 2 years with total 930 subjects. The subjects were segregated in 03 groups on basis of glycosylated haemoglobin (HbA1C) as: - (a) Diabetic, (b) Non-Diabetic and (c) Subjects with Impaired fasting glucose(IFG) with 300 individuals in IFG and non-diabetic group & 330 individuals in diabetic group. The diabetic group was further divided into two groups: - (a) Diabetic subjects with diabetes related vascular complications (b) Diabetic subjects without diabetes related vascular complications. Samples for HbA1C and platelet indices were collected using Ethylene diamine tetracetic acid(EDTA) as anticoagulant and processed on SYSMEX-XS-800i autoanalyser. The study revealed stepwise increase in PVI from non-diabetics to IFG to diabetics. MPV and PDW of diabetics, IFG and non diabetics were 17.60 ± 2.04, 11.76 ± 0.73, 9.93 ± 0.64 and 19.17 ± 1.48, 15.49 ± 0.67, 10.59 ± 0.67 respectively with a significant p value 0.00 and a significant positive correlation (MPV-HbA1c r = 0.951; PDW-HbA1c r = 0.875). However, significant negative correlation was found between glycaemic levels and total platelet count (PC- HbA1c r =-0.164). MPV & PDW of subjects with and without diabetes related complications were (15.14 ± 1.04) fl & (17.51±0.39) fl and (18.96 ± 0.83) fl & (20.09 ± 0.98) fl respectively with a significant p value 0.00.The current study demonstrates raised platelet indices & reduced platelet counts in association with rising glycaemic levels and diabetes related vascular complications across various study groups & showed that platelet morphology is altered with increasing glycaemic levels. These changes can be known by measurements of PVI which are important, simple, cost effective, effortless tool & indicators of impending vascular complications in patients with deranged glycaemic control. PVI should be researched and explored further as surrogate markers to develop a clinical tool for early recognition of vascular changes related to diabetes and thereby help prevent them. They can prove to be more useful in developing countries with limited resources. This study is multi-parameter, comprehensive with adequately powered study design and represents pioneering effort in India on account of the fact that both Platelet indices (MPV & PDW) along with platelet count have been evaluated together for the first time in Diabetics, non diabetics, patients with IFG and also in the diabetic patients with and without diabetes related vascular complications.Keywords: diabetes, HbA1C, IFG, MPV, PDW, PVI
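The correlation analysis reported above (HbA1c against MPV, PDW and platelet count) can be reproduced with a few lines of Python; the data layout and column names are assumptions.

```python
# Small sketch of the reported correlation analysis: Pearson correlations of HbA1c with
# MPV, PDW and platelet count. The CSV file and its columns are hypothetical.
import pandas as pd
from scipy.stats import pearsonr

df = pd.read_csv("platelet_indices.csv")   # hypothetical columns: hba1c, mpv, pdw, platelet_count
for index in ["mpv", "pdw", "platelet_count"]:
    r, p = pearsonr(df["hba1c"], df[index])
    print(f"HbA1c vs {index}: r = {r:.3f}, p = {p:.4f}")
```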
Procedia PDF Downloads 240
425 Television Sports Exposure and Rape Myth Acceptance: The Mediating Role of Sexual Objectification of Women
Authors: Sofia Mariani, Irene Leo
Abstract:
The objective of the present study is to define the mediating role of attitudes that objectify and devalue women (hostile sexism, benevolent sexism, and sexual objectification of women) in the indirect correlation between exposure to televised sports and acceptance of rape myths. A second goal is to contribute to research on the topic by defining the role of mediators in exposure to different types of sports, following the traditional gender classification of sports. Data collection was carried out by means of an online questionnaire, measuring television sport exposure, sport type, hostile sexism, benevolent sexism, and sexual objectification of women. Data analysis was carried out using IBM SPSS software. The model used was created using Ordinary Least Squares (OLS) regression path analysis. The predictor variable in the model was television sports exposure, the outcome was rape myths acceptance, and the mediators were (1) hostile sexism, (2) benevolent sexism, and (3) sexual objectification of women. Correlation analyses were carried out dividing by sport type and controlling for the participants’ gender. As seen in existing literature, television sports exposure was found to be indirectly and positively related to rape myth acceptance through the mediating role of: (1) hostile sexism, (2) benevolent sexism, and (3) sexual objectification of women. The type of sport watched influenced the role of the mediators: hostile sexism was found to be the common mediator to all sports type, exposure to traditionally considered feminine or neutral sports showed the additional mediation effect of sexual objectification of women. In line with existing literature, controlling for gender showed that the only significant mediators were hostile sexism for male participants and benevolent sexism for female participants. Given the prevalence of men among the viewers of traditionally considered masculine sports, the correlation between television sports exposure and rape myth acceptance through the mediation of hostile sexism is likely due to the gender of the participants. However, this does not apply to the viewers of traditionally considered feminine and neutral sports, as this group is balanced in terms of gender and shows a unique mediation: the correlation between television sports exposure and rape myth acceptance is mediated by both hostile sexism and sexual objectification. Given that hostile sexism is defined as hostility towards women who oppose or fail to conform to traditional gender roles, these findings confirm that sport is perceived as a non-traditional activity for women. Additionally, these results imply that the portrayal of women in traditionally considered feminine and neutral sports - which are defined as such because of their aesthetic characteristics - may have a strong component of sexual objectification of women. The present research contributes to defining the association between sports exposure and rape myth acceptance through the mediation effects of sexist attitudes and sexual objectification of women. The results of this study have practical implications, such as supporting the feminine sports teams who ask for more practical and less revealing uniforms, more similar to their male colleagues and therefore less objectifying.Keywords: television exposure, sport, rape myths, objectification, sexism
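A single-mediator version of the OLS regression path analysis described above can be sketched as follows. Variable names are assumptions, and the study itself estimated three parallel mediators rather than the one shown here.

```python
# Illustrative single-mediator OLS path analysis (not the authors' exact model):
# hostile sexism as mediator between sports exposure and rape myth acceptance.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("survey.csv")   # hypothetical columns: exposure, hostile_sexism, rma

# Path a: predictor -> mediator
a = smf.ols("hostile_sexism ~ exposure", data=df).fit().params["exposure"]

# Paths b and c': mediator and predictor -> outcome
model_b = smf.ols("rma ~ exposure + hostile_sexism", data=df).fit()
b = model_b.params["hostile_sexism"]
direct = model_b.params["exposure"]

indirect = a * b                                  # indirect (mediated) effect
print(f"indirect effect = {indirect:.3f}, direct effect = {direct:.3f}")
```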
Procedia PDF Downloads 102
424 Towards Bridging the Gap between the ESP Classroom and the Workplace: Content and Language Needs Analysis in English for an Administrative Studies Course
Authors: Vesna Vulić
Abstract:
Croatia has made large steps forward in the development of higher education over the past 10 years. Purposes and objectives of the tertiary education system are focused on the personal development of young people so that they obtain competences for employment on a flexible labour market. The most frequent tensions between the tertiary institutions and employers are complaints that the current tertiary education system still supplies students with an abundance of theoretical knowledge and not enough practical skills. Polytechnics and schools of professional higher education should deliver professional education and training that will satisfy the needs of their local communities. The 21st century sets demand on undergraduates as well as their lecturers to strive for the highest standards. The skills students acquire during their studies should serve the needs of their future professional careers. In this context, teaching English for Specific Purposes (ESP) presents an enormous challenge for teachers. They have to cope with teaching the language in classes with a large number of students, limitations of time, inadequate equipment and teaching material; most frequently, this leads to focusing on specialist vocabulary neglecting the development of skills and competences required for future employment. Globalization has transformed the labour market and set new standards a perspective employee should meet. When knowledge of languages is considered, new generic skills and competences are required. Not only skillful written and oral communication is needed, but also information, media, and technology literacy, learning skills which include critical and creative thinking, collaborating and communicating, as well as social skills. The aim of this paper is to evaluate the needs of two groups of ESP first year Undergraduate Professional Administrative Study students taking ESP as a mandatory course: 47 first-year Undergraduate Professional Administrative Study students, 21 first-year employed part-time Undergraduate Professional Administrative Study students and 30 graduates with a degree in Undergraduate Professional Administrative Study with various amounts of work experience. The survey adopted a quantitative approach with the aim to determine the differences between the groups in their perception of the four language skills and different areas of law, as well as getting the insight into students' satisfaction with the current course and their motivation for studying ESP. Their perceptions will be compared to the results of the questionnaire conducted among sector professionals in order to examine how they perceive the same elements of the ESP course content and to what extent it fits into their working environment. The results of the survey indicated that there is a strong correlation between acquiring work experience and the level of importance given to particular areas of law studied in an ESP course which is in line with our initial hypothesis. In conclusion, the results of the survey should help lecturers in re-evaluating and updating their ESP course syllabi.Keywords: English for Specific Purposes (ESP), language skills, motivation, needs analysis
Procedia PDF Downloads 301
423 Impact of Chess Intervention on Cognitive Functioning of Children
Authors: Ebenezer Joseph
Abstract:
Chess is a useful tool to enhance general and specific cognitive functioning in children. The present study aims to assess the impact of chess on cognitive in children and to measure the differential impact of socio-demographic factors like age and gender of the child on the effectiveness of the chess intervention.This research study used an experimental design to study the impact of the Training in Chess on the intelligence of children. The Pre-test Post-test Control Group Design was utilized. The research design involved two groups of children: an experimental group and a control group. The experimental group consisted of children who participated in the one-year Chess Training Intervention, while the control group participated in extra-curricular activities in school. The main independent variable was training in chess. Other independent variables were gender and age of the child. The dependent variable was the cognitive functioning of the child (as measured by IQ, working memory index, processing speed index, perceptual reasoning index, verbal comprehension index, numerical reasoning, verbal reasoning, non-verbal reasoning, social intelligence, language, conceptual thinking, memory, visual motor and creativity). The sample consisted of 200 children studying in Government and Private schools. Random sampling was utilized. The sample included both boys and girls falling in the age range 6 to 16 years. The experimental group consisted of 100 children (50 from Government schools and 50 from Private schools) with an equal representation of boys and girls. The control group similarly consisted of 100 children. The dependent variables were assessed using Binet-Kamat Test of Intelligence, Wechsler Intelligence Scale for Children - IV (India) and Wallach Kogan Creativity Test. The training methodology comprised Winning Moves Chess Learning Program - Episodes 1–22, lectures with the demonstration board, on-the-board playing and training, chess exercise through workbooks (Chess school 1A, Chess school 2, and tactics) and working with chess software. Further students games were mapped using chess software and the brain patterns of the child were understood. They were taught the ideas behind chess openings and exposure to classical games were also given. The children participated in mock as well as regular tournaments. Preliminary analysis carried out using independent t tests with 50 children indicates that chess training has led to significant increases in the intelligent quotient. Children in the experimental group have shown significant increases in composite scores like working memory and perceptual reasoning. Chess training has significantly enhanced the total creativity scores, line drawing and pattern meaning subscale scores. Systematically learning chess as part of school activities appears to have a broad spectrum of positive outcomes.Keywords: chess, intelligence, creativity, children
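The preliminary independent-samples t-test mentioned above can be sketched in Python as follows; the file and column names are assumptions.

```python
# Sketch of the preliminary analysis (independent-samples t-test between experimental and
# control groups); the data file and columns are hypothetical.
import pandas as pd
from scipy.stats import ttest_ind

df = pd.read_csv("chess_study.csv")               # hypothetical columns: group, iq_gain
experimental = df.loc[df["group"] == "chess", "iq_gain"]
control = df.loc[df["group"] == "control", "iq_gain"]

t_stat, p_value = ttest_ind(experimental, control)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
```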
Procedia PDF Downloads 258
422 Good Governance Complementary to Corruption Abatement: A Cross-Country Analysis
Authors: Kamal Ray, Tapati Bhattacharya
Abstract:
Private use of public office for private gain could be a tentative definition of corruption and most distasteful event of corruption is that it is not there, nor that it is pervasive, but it is socially acknowledged in the global economy, especially in the developing nations. We attempted to assess the interrelationship between the Corruption perception index (CPI) and the principal components of governance indicators as per World Bank like Control of Corruption (CC), rule of law (RL), regulatory quality (RQ) and government effectiveness (GE). Our empirical investigation concentrates upon the degree of reflection of governance indicators upon the CPI in order to single out the most powerful corruption-generating indicator in the selected countries. We have collected time series data on above governance indicators such as CC, RL, RQ and GE of the selected eleven countries from the year of 1996 to 2012 from World Bank data set. The countries are USA, UK, France, Germany, Greece, China, India, Japan, Thailand, Brazil, and South Africa. Corruption Perception Index (CPI) of the countries mentioned above for the period of 1996 to 2012is also collected. Graphical method of simple line diagram against the time series data on CPI is applied for quick view for the relative positions of different trend lines of different nations. The correlation coefficient is enough to assess primarily the degree and direction of association between the variables as we get the numerical data on governance indicators of the selected countries. The tool of Granger Causality Test (1969) is taken into account for investigating causal relationships between the variables, cause and effect to speak of. We do not need to verify stationary test as length of time series is short. Linear regression is taken as a tool for quantification of a change in explained variables due to change in explanatory variable in respect of governance vis a vis corruption. A bilateral positive causal link between CPI and CC is noticed in UK, index-value of CC increases by 1.59 units as CPI increases by one unit and CPI rises by 0.39 units as CC rises by one unit, and hence it has a multiplier effect so far as reduction in corruption is concerned in UK. GE causes strongly to the reduction of corruption in UK. In France, RQ is observed to be a most powerful indicator in reducing corruption whereas it is second most powerful indicator after GE in reducing of corruption in Japan. Governance-indicator like GE plays an important role to push down the corruption in Japan. In China and India, GE is proactive as well as influencing indicator to curb corruption. The inverse relationship between RL and CPI in Thailand indicates that ongoing machineries related to RL is not complementary to the reduction of corruption. The state machineries of CC in S. Africa are highly relevant to reduce the volume of corruption. In Greece, the variations of CPI positively influence the variations of CC and the indicator like GE is effective in controlling corruption as reflected by CPI. All the governance-indicators selected so far have failed to arrest their state level corruptions in USA, Germany and Brazil.Keywords: corruption perception index, governance indicators, granger causality test, regression
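The Granger causality step can be illustrated with statsmodels as below; the data file, the country (UK) and the lag order are assumptions, not the study's exact settings.

```python
# Hedged sketch of pairwise Granger causality testing between a governance indicator (CC)
# and the CPI for one country. The yearly data file and its columns are hypothetical.
import pandas as pd
from statsmodels.tsa.stattools import grangercausalitytests

df = pd.read_csv("uk_indicators.csv")             # hypothetical columns: year, cpi, cc
df = df.sort_values("year")

# Does Control of Corruption "Granger-cause" the CPI? (second column tested as the cause)
grangercausalitytests(df[["cpi", "cc"]], maxlag=2)

# Reverse direction, to check for the bilateral link reported for the UK
grangercausalitytests(df[["cc", "cpi"]], maxlag=2)
```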
Procedia PDF Downloads 306
421 Drivers of Satisfaction and Dissatisfaction in Camping Tourism: A Case Study from Croatia
Authors: Darko Prebežac, Josip Mikulić, Maja Šerić, Damir Krešić
Abstract:
Camping tourism is recognized as a growing segment of the broader tourism industry, currently evolving from an inexpensive, temporary sojourn in a rural environment into a highly fragmented niche tourism sector. The trends among public-managed campgrounds seem to be moving away from rustic campgrounds that provide only a tent pad and a fire ring to more developed facilities that offer a range of different amenities, where campers still search for unique experiences that go above the opportunity to experience nature and social interaction. In addition, while camping styles and options changed significantly over the last years, coastal camping in particular became valorized as is it regarded with a heightened sense of nostalgia. Alongside this growing interest in the camping tourism, a demand for quality servicing infrastructure emerged in order to satisfy the wide variety of needs, wants, and expectations of an increasingly demanding traveling public. However, camping activity in general and quality of camping experience and campers’ satisfaction in particular remain an under-researched area of the tourism and consumption behavior literature. In this line, very few studies addressed the issue of quality product/service provision in satisfying nature based tourists and in driving their future behavior with respect to potential re-visitation and recommendation intention. The present study thus aims to investigate the drivers of positive and negative campsite experience using the case of Croatia. Due to the well-preserved nature and indented coastline, camping tourism has a long tradition in Croatia and represents one of the most important and most developed tourism products. During the last decade the number of tourist overnights in Croatian camps has increased by 26% amounting to 16.5 million in 2014. Moreover, according to Eurostat the market share of campsites in the EU is around 14%, indicating that the market share of Croatian campsites is almost double large compared to the EU average. Currently, there are a total of 250 camps in Croatia with approximately 75.8 thousands accommodation units. It is further noteworthy that Croatian camps have higher average occupancy rates and a higher average length of stay as compared to the national average of all types of accommodation. In order to explore the main drivers of positive and negative campsite experiences, this study uses principal components analysis (PCA) and an impact-asymmetry analysis (IAA). Using the PCA, first the main dimensions of the campsite experience are extracted in an exploratory manner. Using the IAA, the extracted factors are investigated for their potentials to create customer delight and/or frustration. The results provide valuable insight to both researchers and practitioners regarding the understanding of campsite satisfaction.Keywords: Camping tourism, campsite, impact-asymmetry analysis, satisfaction
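The two analysis tools named above (PCA and impact-asymmetry analysis) can be sketched as follows. The attribute names, scale anchors and the simplified penalty-reward contrast are assumptions rather than the authors' exact procedure.

```python
# Illustrative sketch: PCA extracts campsite-experience dimensions, and dummy regressions
# contrast the impact of very low vs. very high attribute performance on overall satisfaction
# (a simplified version of the impact-asymmetry idea).
import pandas as pd
import statsmodels.api as sm
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

df = pd.read_csv("camper_survey.csv")             # hypothetical attribute ratings + satisfaction
attributes = [c for c in df.columns if c.startswith("attr_")]

# Step 1: exploratory PCA on the attribute ratings (assumes at least five attributes)
scores = PCA(n_components=5).fit_transform(StandardScaler().fit_transform(df[attributes]))

# Step 2: penalty-reward contrast for one attribute on a 7-point scale (assumed anchors)
low = (df["attr_cleanliness"] <= 2).astype(int)   # dummy: dissatisfying performance
high = (df["attr_cleanliness"] >= 6).astype(int)  # dummy: delighting performance
X = sm.add_constant(pd.DataFrame({"low": low, "high": high}))
fit = sm.OLS(df["overall_satisfaction"], X).fit()
print("penalty:", fit.params["low"], "reward:", fit.params["high"])
```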
Procedia PDF Downloads 187
420 Brief Cognitive Behavior Therapy (BCBT) in a Japanese School Setting: Preliminary Outcomes on a Single Arm Study
Authors: Yuki Matsumoto, Yuma Ishimoto
Abstract:
Cognitive Behavior Therapy (CBT) with children has shown effective application to various problems such as anxiety and depression. Although there are barriers to access to mental health services including lack of professional services in communities and parental concerns about stigma, school has a significant role to address children’s health problems. Schools are regarded as a suitable arena for prevention and early intervention of mental health problems. In this line, CBT can be adaptable to school education and useful to enhance students’ social and emotional skills. However, Japanese school curriculum is rigorous so as to limit available time for implementation of CBT in schools. This paper describes Brief Cognitive Behavior Therapy (BCBT) with children in a Japanese school setting. The program has been developed in order to facilitate acceptability of CBT in schools and aimed to enhance students’ skills to manage anxiety and difficult behaviors. The present research used a single arm design in which 30 students aged 9-10 years old participated. The authors provided teachers a CBT training workshop (two hours) at two primary schools in Tokyo metropolitan area and recruited participants in the research. A homeroom teacher voluntarily delivered a 6-session BCBT program (15 minutes each) in classroom periods which is called as Kaerinokai, a meeting before leaving school. Students completed a questionnaire sheet at pre- and post-periods under the supervision of the teacher. The sheet included the Spence Child Anxiety Scale (SCAS), the Depression Self-Rating Scale for Children (DSRS), and the Strengths and Difficulties Questionnaire (SDQ). The teacher was asked for feedback after the completion. Significant positive changes were found in the total and five of six sub-scales of the SCAS and the total difficulty scale of the SDQ. However, no significant changes were seen in Physical Injury Fear sub-scale of the SCAS, in the DSRS or the Prosocial sub-scale of the SDQ. The effect sizes are mostly between small and medium. The teacher commented that the program was easy to use and found positive changes in classroom activities and personal relationships. This preliminary research showed the feasibility of the BCBT in a school setting. The results suggest that the BCBT offers effective treatment for reduction in anxiety and in difficult behaviors. There is a good prospect of the BCBT suggesting that BCBT may be easier to be delivered than CBT by Japanese teachers to promote child mental health. The study has limitations including no control group, small sample size, or a short teacher training. Future research should address these limitations.Keywords: brief cognitive behavior therapy, cognitive behavior therapy, mental health services in schools, teacher training workshop
Procedia PDF Downloads 336
419 Scalable Performance Testing: Facilitating The Assessment Of Application Performance Under Substantial Loads And Mitigating The Risk Of System Failures
Authors: Solanki Ravirajsinh
Abstract:
In the software testing life cycle, failing to conduct thorough performance testing can result in significant losses for an organization due to application crashes and improper behavior under high user loads in production. Simulating large volumes of requests, such as 5 million within 5-10 minutes, is challenging without a scalable performance testing framework. Leveraging cloud services to implement a performance testing framework makes it feasible to handle 5-10 million requests in just 5-10 minutes, helping organizations ensure their applications perform reliably under peak conditions. Implementing a scalable performance testing framework using cloud services and tools like JMeter, EC2 instances (Virtual machine), cloud logs (Monitor errors and logs), EFS (File storage system), and security groups offers several key benefits for organizations. Creating performance test framework using this approach helps optimize resource utilization, effective benchmarking, increased reliability, cost savings by resolving performance issues before the application is released. In performance testing, a master-slave framework facilitates distributed testing across multiple EC2 instances to emulate many concurrent users and efficiently handle high loads. The master node orchestrates the test execution by coordinating with multiple slave nodes to distribute the workload. Slave nodes execute the test scripts provided by the master node, with each node handling a portion of the overall user load and generating requests to the target application or service. By leveraging JMeter's master-slave framework in conjunction with cloud services like EC2 instances, EFS, CloudWatch logs, security groups, and command-line tools, organizations can achieve superior scalability and flexibility in their performance testing efforts. In this master-slave framework, JMeter must be installed on both the master and each slave EC2 instance. The master EC2 instance functions as the "brain," while the slave instances operate as the "body parts." The master directs each slave to execute a specified number of requests. Upon completion of the execution, the slave instances transmit their results back to the master. The master then consolidates these results into a comprehensive report detailing metrics such as the number of requests sent, encountered errors, network latency, response times, server capacity, throughput, and bandwidth. Leveraging cloud services, the framework benefits from automatic scaling based on the volume of requests. Notably, integrating cloud services allows organizations to handle more than 5-10 million requests within 5 minutes, depending on the server capacity of the hosted website or application.Keywords: identify crashes of application under heavy load, JMeter with cloud Services, Scalable performance testing, JMeter master and slave using cloud Services
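A minimal sketch of launching such a master-slave run from the master node is given below as a thin Python wrapper around JMeter's non-GUI distributed mode. The host addresses and file names are assumptions, and the slave EC2 instances are assumed to already be running jmeter-server and to be reachable from the master.

```python
# Hedged sketch: the master node launches a distributed, non-GUI JMeter run against a list
# of slave instances and collects a consolidated results file. IPs and file names are assumed.
import subprocess

slave_hosts = ["10.0.1.11", "10.0.1.12", "10.0.1.13"]   # private IPs of the EC2 slave instances

cmd = [
    "jmeter",
    "-n",                               # non-GUI mode
    "-t", "load_test_plan.jmx",         # test plan executed by every slave
    "-R", ",".join(slave_hosts),        # run the plan remotely on the listed slaves
    "-l", "aggregated_results.jtl",     # consolidated results collected on the master
]
subprocess.run(cmd, check=True)
```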
Procedia PDF Downloads 30
418 Antenatal Monitoring of Pre-Eclampsia in a Low Resource Setting
Authors: Alina Rahim, Joanne Moffatt, Jessica Taylor, Joseph Hartland, Tamer Abdelrazik
Abstract:
Background: In 2011, 15% of maternal deaths in Uganda were due to hypertensive disorders (pre-eclampsia and eclampsia). The majority of these deaths are avoidable with optimum antenatal care. The aim of the study was to evaluate how antenatal monitoring of pre-eclampsia was carried out in a low resource setting and to identify barriers to best practice as recommended by the World Health Organisation (WHO) as part of a 4th year medical student External Student Selected component field trip. Method: Women admitted to hospital with pre-eclampsia in rural Uganda (Villa Maria and Kitovu Hospitals) over a year-long period were identified using the maternity register and antenatal record book. It was not possible to obtain notes for all cases identified on the maternity register. Therefore a total of thirty sets of notes were reviewed. The management was recorded and compared to Ugandan National Guidelines and WHO recommendations. Additional qualitative information on routine practice was established by interviewing staff members from the obstetric and midwifery teams. Results: From the records available, all patients in this sample were managed according to WHO recommendations during labour. The rate of Caesarean section as a mode of delivery was noted to be high in this group of patients; 56% at Villa Maria and 46% at Kitovu. Antenatally two WHO recommendations were not routinely met: aspirin prophylaxis and calcium supplementation. This was due to lack of resources, and lack of attendance at antenatal clinic leading to poor detection of high-risk patients. Medical management of pre-eclampsia varied between individual patients, overall 93.3% complied with Ugandan national guidelines. Two patients were treated with diuretics, which is against WHO guidance. Discussion: Antenatal monitoring of pre-eclampsia is important in reducing severe morbidity, long-term disability and mortality amongst mothers and their babies 2 . Poor attendance at antenatal clinic is a barrier to healthcare in low-income countries. Increasing awareness of the importance of these visits for women should be encouraged. The majority of cases reviewed in this sample of women were treated according to Ugandan National Guidelines. It is recommended to commence the use of aspirin prophylaxis for women at high-risk of developing pre-eclampsia and the creation of detailed guidelines for Uganda which would allow for standardisation of care county-wide.Keywords: antenatal monitoring, low resource setting, pre-eclampsia, Uganda
Procedia PDF Downloads 229
417 Evaluation of Requests and Outcomes of Magnetic Resonance Imaging Assessing for Cauda Equina Syndrome at a UK Trauma Centre
Authors: Chris Cadman, Marcel Strauss
Abstract:
Background: In 2020, the University Hospital Wishaw in the United Kingdom became the centre for trauma and orthopaedics within its health board. This resulted in the majority of patients with suspected cauda equina syndrome (CES) being assessed and imaged at this site, putting an increased demand on MR imaging and displacing other previous activity. Following this transition, imaging requests for CES did not always follow national guidelines and would often be missing important clinical and safety information. There also appeared to be a very low positive scan rate compared with previously reported studies. In an attempt to improve patient selection and reduce the burden of CES imaging at this site clinical audit was performed. Methods: A total of 250 consecutive patients imaged to assess for CES were evaluated. Patients had to have presented to either the emergency or orthopaedic department acutely with a presenting complaint of suspected CES. Patients were excluded if they were not admitted acutely or were assessed by other clinical specialities. In total, 233 patients were included. Requests were assessed for appropriate clinical history, accurate and complete clinical assessment and MRI safety information. Clinical assessment was allocated a score of 1-6 based on information relating to history of pain, level of pain, dermatomes/myotomes affected, peri-anal paraesthesia/anaesthesia, anal tone and post-void bladder volume with each element scoring one point. Images were assessed for positive findings of CES, acquired spinal stenosis or nerve root compression. Results: Overall, 73% of requests had a clear clinical history of CES. The urgency of the request for imaging was given in 23% of cases. The mean clinical assessment score was 3.7 out of a total of 6. Overall, 2% of scans were positive for CES, 29% had acquired spinal stenosis and 30% had nerve root compression. For patients with CES, 75% had acute neurological signs compared with 68% of the study population. CES patients had a mean clinical history score of 5.3 compared with 3.7 for the study population. Overall, 95% of requests had appropriate MRI safety information. Discussion: it study included 233 patients who underwent specialist assessment and referral for MR imaging for suspected CES. Despite the serious nature of this condition, a large proportion of imaging requests did not have a clear clinical query of CES and the level of urgency was not given, which could potentially lead to a delay in imaging and treatment. Clinical examination was often also incomplete, which can make triaging of patients presenting with similar symptoms challenging. The positive rate for CES was only 2%, much below other studies which had positive rates of 6–40% with a large meta-analysis finding a mean positive rate of 19%. These findings demonstrate an opportunity to improve the quality of imaging requests for suspected CES. This may help to improve patient selection for imaging and result in a positive rate for CES imaging that is more in line with other centres.Keywords: cauda equina syndrome, acute back pain, MRI, spine
Procedia PDF Downloads 15
416 Safety Tolerance Zone for Driver-Vehicle-Environment Interactions under Challenging Conditions
Authors: Matjaž Šraml, Marko Renčelj, Tomaž Tollazzi, Chiara Gruden
Abstract:
Road safety is a worldwide issue with numerous and heterogeneous factors influencing it. On the side, driver state – comprising distraction/inattention, fatigue, drowsiness, extreme emotions, and socio-cultural factors highly affect road safety. On the other side, the vehicle state has an important role in mitigating (or not) the road risk. Finally, the road environment is still one of the main determinants of road safety, defining driving task complexity. At the same time, thanks to technological development, a lot of detailed data is easily available, creating opportunities for the detection of driver state, vehicle characteristics and road conditions and, consequently, for the design of ad hoc interventions aimed at improving driver performance, increase awareness and mitigate road risks. This is the challenge faced by the i-DREAMS project. i-DREAMS, which stands for a smart Driver and Road Environment Assessment and Monitoring System, is a 3-year project funded by the European Union’s Horizon 2020 research and innovation program. It aims to set up a platform to define, develop, test and validate a ‘Safety Tolerance Zone’ to prevent drivers from getting too close to the boundaries of unsafe operation by mitigating risks in real-time and after the trip. After the definition and development of the Safety Tolerance Zone concept and the concretization of the same in an Advanced driver-assistance system (ADAS) platform, the system was tested firstly for 2 months in a driving simulator environment in 5 different countries. After that, naturalistic driving studies started for a 10-month period (comprising a 1-month pilot study, 3-month baseline study and 6 months study implementing interventions). Currently, the project team has approved a common evaluation approach, and it is developing the assessment of the usage and outcomes of the i-DREAMS system, which is turning positive insights. The i-DREAMS consortium consists of 13 partners, 7 engineering universities and research groups, 4 industry partners and 2 partners (European Transport Safety Council - ETSC - and POLIS cities and regions for transport innovation) closely linked to transport safety stakeholders, covering 8 different countries altogether.Keywords: advanced driver assistant systems, driving simulator, safety tolerance zone, traffic safety
Procedia PDF Downloads 68
415 Remote Sensing-Based Prediction of Asymptomatic Rice Blast Disease Using Hyperspectral Spectroradiometry and Spectral Sensitivity Analysis
Authors: Selvaprakash Ramalingam, Rabi N. Sahoo, Dharmendra Saraswat, A. Kumar, Rajeev Ranjan, Joydeep Mukerjee, Viswanathan Chinnasamy, K. K. Chaturvedi, Sanjeev Kumar
Abstract:
Rice is one of the most important staple food crops in the world. Among the various diseases that affect rice crops, rice blast is particularly significant, causing yield and economic losses. While the plant has defense mechanisms in place, such as chemical indicators (proteins, salicylic acid, jasmonic acid, ethylene, and azelaic acid) and resistance genes in certain varieties that can protect against diseases, susceptible varieties remain vulnerable to these fungal diseases. Early prediction of rice blast (RB) disease is crucial, but conventional techniques for early prediction are time-consuming and labor-intensive. Hyperspectral remote sensing techniques hold the potential to predict RB disease at its asymptomatic stage. In this study, we aimed to demonstrate the prediction of RB disease at the asymptomatic stage using a non-imaging hyperspectral ASD spectroradiometer under controlled laboratory conditions. We applied statistical spectral discrimination theory to identify unknown spectra of M. oryzae, the fungus responsible for rice blast disease. The infrared (IR) region was found to be significantly affected by RB disease. These changes may result in alterations in the absorption, reflection, or emission of infrared radiation by the affected plant tissues. Our research revealed that the protein spectrum in the IR region is impacted by RB disease. In our study, we identified strong correlations in the amide group I region, around 1064 nm (X) and 1300 nm (Y), using lambda-by-lambda derived-spectra methods for protein detection. During the stages when the disease is developing, typically from day 3 to day 5, the plant's defense mechanisms are not as effective. This is especially true for the PB-1 variety of rice, which is highly susceptible to rice blast disease. Consequently, the proteins in the plant are adversely affected during this critical time. The spectral contour plot reveals the highly correlated spectral regions around 1064 nm and 1300 nm associated with RB disease infection. Based on these spectral sensitivities, we developed new spectral disease indices for predicting different stages of disease emergence. The goal of this research is to lay the foundation for future UAV and satellite-based studies aimed at long-term monitoring of RB disease. Keywords: rice blast, asymptomatic stage, spectral sensitivity, IR
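One simple way to turn two sensitive bands into an index is a normalized difference, sketched below. The band positions are taken from the abstract, but the index form and the example spectrum are illustrations, not the authors' published index.

```python
# Illustrative sketch: a normalized-difference spectral index built from reflectance at
# 1064 nm and 1300 nm, applied to an ASD spectroradiometer reflectance spectrum.
import numpy as np

def normalized_difference_index(wavelengths, reflectance, band_a=1064.0, band_b=1300.0):
    """Compute (R_a - R_b) / (R_a + R_b) from a 1-D reflectance spectrum."""
    r_a = reflectance[np.argmin(np.abs(wavelengths - band_a))]
    r_b = reflectance[np.argmin(np.abs(wavelengths - band_b))]
    return (r_a - r_b) / (r_a + r_b)

# Example with a synthetic spectrum covering 350-2500 nm at 1 nm resolution
wl = np.arange(350, 2501, 1.0)
spectrum = 0.3 + 0.05 * np.sin(wl / 200.0)        # placeholder reflectance values
print(f"index = {normalized_difference_index(wl, spectrum):.4f}")
```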
Procedia PDF Downloads 87414 Decomposition of the Discount Function Into Impatience and Uncertainty Aversion. How Neurofinance Can Help to Understand Behavioral Anomalies
Authors: Roberta Martino, Viviana Ventre
Abstract:
Intertemporal choices are choices under conditions of uncertainty in which the consequences are distributed over time. The Discounted Utility Model is the essential reference for describing the individual in the context of intertemporal choice. The model is based on the idea that the individual selects the alternative with the highest utility, which is calculated by multiplying the cardinal utility of the outcome, as if its receipt were instantaneous, by the discount function, which determines a decrease in the utility value according to how far the actual receipt of the outcome is from the moment the choice is made. Initially, the discount function was assumed to have an exponential trend, whose decrease over time is constant, in line with the profile of a rational investor described by classical economics. Instead, empirical evidence called for the formulation of alternative, hyperbolic models that better represented the actual actions of the investor. Attitudes that do not comply with the principles of classical rationality are termed anomalous, i.e., difficult to rationalize and describe through normative models. The development of behavioral finance, which describes investor behavior through cognitive psychology, has shown that deviations from rationality are due to the bounded rationality of human beings. This means that when a choice is made in a very difficult and information-rich environment, the brain strikes a compromise between the cognitive effort required and the selection of an alternative. Moreover, the evaluation and selection of an alternative and the collection and processing of information are conditioned by systematic distortions of the decision-making process, namely the behavioral biases involving the individual's emotional and cognitive systems. In this paper we present an original decomposition of the discount function to investigate the psychological principles of hyperbolic discounting. It is possible to decompose the curve into two components: the first component is responsible for the smaller decrease in the outcome as time increases and is related to the individual's impatience; the second component relates to the change in the direction of the tangent vector to the curve and indicates how strongly the individual perceives the indeterminacy of the future, indicating his or her aversion to uncertainty. This decomposition allows interesting conclusions to be drawn with respect to the concept of impatience and the emotional drives involved in decision-making. The contribution that neuroscience can make to decision theory and intertemporal choice theory is vast, as it would allow the decision-making process to be described as the relationship between the individual's emotional and cognitive factors. Neurofinance is a discipline that uses a multidisciplinary approach to investigate how the brain influences decision-making. Indeed, considering that the decision-making process is linked to the activity of the prefrontal cortex and amygdala, neurofinance can help determine the extent to which anomalous attitudes respect the principles of rationality. Keywords: impatience, intertemporal choice, neurofinance, rationality, uncertainty
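For context, the two discount-function families contrasted in the abstract are commonly written as follows; the specific two-component decomposition proposed by the authors is only described qualitatively above and is not reproduced here.

```latex
% Standard discount functions in intertemporal choice; the authors'
% impatience / uncertainty-aversion decomposition is not reproduced here.
\begin{align}
  D_{\mathrm{exp}}(t) &= e^{-kt},          && k > 0 \quad \text{(constant discount rate)} \\
  D_{\mathrm{hyp}}(t) &= \frac{1}{1 + kt}, && k > 0 \quad \text{(decreasing discount rate)} \\
  U(x, t) &= u(x)\, D(t)                   && \text{(Discounted Utility Model)}
\end{align}
```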
Procedia PDF Downloads 130413 A Study on the Quantitative Evaluation Method of Asphalt Pavement Condition through the Visual Investigation
Authors: Sungho Kim, Jaechoul Shin, Yujin Baek
Abstract:
In recent years, due to environmental impacts, aging and other factors, various types of pavement deterioration, such as cracking, potholes, rutting and roughness degradation, are increasing rapidly. In Korea, the Ministry of Land, Infrastructure and Transport regularly maintains the pavement condition of expressways and national highways using pavement condition survey equipment and structural survey equipment. Local governments that maintain local roads, farm roads, etc. find it difficult to monitor the pavement condition with such survey equipment because of economic conditions, skills shortages and local constraints such as narrow roads. This study presents a quantitative evaluation method of the pavement condition through visual inspection to overcome these problems for roads managed by local governments. It is difficult to evaluate rutting and roughness with the naked eye; however, the condition of cracks can be evaluated visually. Linear cracks (m), area cracks (m²) and potholes (number, m²) were investigated with the naked eye every 100 meters to survey the cracks. In this paper, the crack ratio was calculated from the surveyed crack data, and the pavement condition was evaluated by the calculated crack ratio. The pavement condition survey equipment also investigated the pavement condition in the same sections in order to evaluate the reliability of the pavement condition evaluation based on the calculated crack ratio. The pavement condition was evaluated through the SPI (Seoul Pavement Index) and the calculated crack ratio using the results of the field survey. A comparison between 'the SPI considering only the crack ratio' and 'the SPI considering rutting and roughness as well', using the equipment survey data, showed a margin of error below 5% when the SPI is less than 5. SPI 5 is considered the base point for deciding whether to maintain the pavement. This shows that the pavement condition can be evaluated using only the crack ratio. According to the analysis of the crack ratio between the visual inspection and the equipment survey, there is an average error of 1.86% (minimum 0.03%, maximum 9.58%). Economically, the visual inspection costs only 10% of the equipment survey and will also help the economy by creating new jobs. This paper suggests that local governments maintain the pavement condition through visual investigations; however, more research is needed to improve reliability. Acknowledgment: The author would like to thank the MOLIT (Ministry of Land, Infrastructure, and Transport). This work was carried out through a project funded by the MOLIT. The project name is 'development of 20mm grade for road surface detecting roadway condition and rapid detection automation system for removal of pothole'. Keywords: asphalt pavement maintenance, crack ratio, evaluation of asphalt pavement condition, SPI (Seoul Pavement Index), visual investigation
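A minimal sketch of how a crack ratio could be computed for one visually surveyed 100 m section is given below. The influence width assumed for linear cracks and the lane width are illustrative assumptions; the paper does not state its exact formula or the SPI weighting.

```python
def crack_ratio(linear_crack_m, area_crack_m2, pothole_m2,
                section_length_m=100.0, lane_width_m=3.5, crack_width_m=0.3):
    """Illustrative crack ratio (%) for one visually surveyed 100 m section.

    The 0.3 m influence width for linear cracks and the 3.5 m lane width are
    assumptions for this sketch; the paper does not specify its exact formula.
    """
    cracked_area = linear_crack_m * crack_width_m + area_crack_m2 + pothole_m2
    section_area = section_length_m * lane_width_m
    return 100.0 * cracked_area / section_area

# Example: 12 m of linear cracks, 4.5 m2 of area cracks, one 0.8 m2 pothole.
print(round(crack_ratio(12.0, 4.5, 0.8), 2))
```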
Procedia PDF Downloads 167412 Bioanalytical Method Development and Validation of Aminophylline in Rat Plasma Using Reverse Phase High Performance Liquid Chromatography: An Application to Preclinical Pharmacokinetics
Authors: S. G. Vasantharaju, Viswanath Guptha, Raghavendra Shetty
Abstract:
Introduction: Aminophylline is a methylxanthine derivative belonging to the bronchodilator class. A literature survey reveals that reported methods rely on solid-phase extraction and liquid-liquid extraction, which are highly variable, time-consuming, costly and laborious. The present work aims to develop a simple, highly sensitive, precise and accurate high-performance liquid chromatography method for the quantification of aminophylline in rat plasma samples that can be utilized for preclinical studies. Method: Reverse-phase high-performance liquid chromatography. Results: Selectivity: Aminophylline and the internal standard were well separated from the co-eluted components, and there was no interference from endogenous material at the retention times of the analyte and the internal standard. The LLOQ measurable with acceptable accuracy and precision for the analyte was 0.5 µg/mL. Linearity: The developed and validated method is linear over the range of 0.5-40.0 µg/mL. The coefficient of determination was found to be greater than 0.9967, indicating the linearity of this method. Accuracy and precision: The accuracy and precision values for intra- and inter-day studies at low, medium and high quality control concentrations of aminophylline in plasma were within the acceptable limits. Extraction recovery: The method produced consistent extraction recovery at all 3 QC levels. The mean extraction recovery of aminophylline was 93.57 ± 1.28%, while that of the internal standard was 90.70 ± 1.30%. Stability: The results show that aminophylline is stable in rat plasma under the studied stability conditions and that it is also stable for about 30 days when stored at -80˚C. Pharmacokinetic studies: The method was successfully applied to the quantitative estimation of aminophylline in rat plasma following its oral administration to rats. Discussion: Preclinical studies require a rapid and sensitive method for estimating the drug concentration in rat plasma. The method described in our article includes a simple protein precipitation extraction technique with ultraviolet detection for quantification. The present method is simple and robust for fast high-throughput sample analysis with low analysis cost for analyzing aminophylline in biological samples. In this proposed method, no interfering peaks were observed at the elution times of aminophylline and the internal standard. The method also had sufficient selectivity, specificity, precision and accuracy over the concentration range of 0.5-40.0 µg/mL. An isocratic separation technique was used, underlining the simplicity of the presented method. Keywords: aminophylline, preclinical pharmacokinetics, rat plasma, RPHPLC
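The validation metrics reported above (calibration linearity and extraction recovery) can be computed as in the following sketch; all numeric values are hypothetical and only illustrate the calculations, not the study's data.

```python
import numpy as np

# Hypothetical calibration data over the validated 0.5-40.0 ug/mL range.
conc = np.array([0.5, 1.0, 5.0, 10.0, 20.0, 40.0])           # ug/mL
peak_area_ratio = np.array([0.021, 0.043, 0.210, 0.418, 0.835, 1.672])

slope, intercept = np.polyfit(conc, peak_area_ratio, 1)
predicted = slope * conc + intercept
ss_res = np.sum((peak_area_ratio - predicted) ** 2)
ss_tot = np.sum((peak_area_ratio - peak_area_ratio.mean()) ** 2)
r_squared = 1.0 - ss_res / ss_tot                             # coefficient of determination

# Extraction recovery (%): response of extracted QC samples vs. neat standards.
extracted_response = np.array([0.196, 0.395, 1.571])
neat_response = np.array([0.210, 0.418, 1.672])
recovery_pct = 100.0 * extracted_response / neat_response

print(f"R^2 = {r_squared:.4f}, mean recovery = {recovery_pct.mean():.2f}%")
```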
Procedia PDF Downloads 223411 Extraction of Urban Building Damage Using Spectral, Height and Corner Information
Authors: X. Wang
Abstract:
Timely and accurate information on urban building damage caused by earthquakes is an important basis for disaster assessment and emergency relief. Very high resolution (VHR) remotely sensed imagery containing abundant fine-scale information offers a large quantity of data for detecting and assessing urban building damage in the aftermath of earthquake disasters. However, the accuracy obtained using spectral features alone is comparatively low, since building damage, intact buildings and pavements are spectrally similar. Therefore, it is of great significance to detect urban building damage effectively using multi-source data. Considering that, in general, the height or geometric structure of buildings changes dramatically in devastated areas, a novel multi-stage urban building damage detection method using bi-temporal spectral, height and corner information was proposed in this study. The pre-event height information was generated using stereo VHR images acquired from two different satellites, while the post-event height information was produced from airborne LiDAR data. The corner information was extracted from pre- and post-event panchromatic images. The proposed method can be summarized as follows. To reduce the classification errors caused by spectral similarity and errors in extracting height information, ground surface, shadows, and vegetation were first extracted using the post-event VHR image and height data and were masked out. Two different types of building damage were then extracted from the remaining areas: the height difference between pre- and post-event data was used for detecting building damage showing significant height change, while the difference in the density of corners between pre- and post-event images was used for extracting building damage showing drastic change in geometric structure. The initial building damage result was generated by combining the above two building damage results. Finally, a post-processing procedure was adopted to refine the obtained initial result. The proposed method was quantitatively evaluated and compared to two existing methods in Port-au-Prince, Haiti, which was heavily hit by an earthquake in January 2010, using a pre-event GeoEye-1 image, a pre-event WorldView-2 image, a post-event QuickBird image and post-event LiDAR data. The results showed that the method proposed in this study significantly outperformed the two comparative methods in terms of urban building damage extraction accuracy. The proposed method provides a fast and reliable way to detect urban building collapse, which is also applicable to relevant applications. Keywords: building damage, corner, earthquake, height, very high resolution (VHR)
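The multi-stage fusion described above can be sketched as follows, assuming co-registered raster inputs; the height-change and corner-density-change thresholds are placeholders, since the paper does not publish its values.

```python
import numpy as np

def detect_building_damage(height_pre, height_post, corners_pre, corners_post,
                           exclude_mask, dh_threshold=2.0, dc_threshold=0.5):
    """Illustrative fusion of height change and corner-density change.

    Inputs are co-registered rasters: pre/post heights (m), pre/post corner
    densities (corners per cell), and a mask of ground/shadow/vegetation to
    exclude. The 2.0 m and 0.5 thresholds are assumptions for this sketch.
    """
    height_drop = (height_pre - height_post) > dh_threshold      # collapse by height loss
    corner_change = (np.abs(corners_pre - corners_post)
                     / np.maximum(corners_pre, 1e-6)) > dc_threshold
    damage = (height_drop | corner_change) & ~exclude_mask       # combine the two cues
    return damage

# Tiny synthetic example (3 x 3 cells).
pre_h = np.full((3, 3), 10.0)
post_h = np.array([[10.0, 3.0, 10.0], [10.0, 10.0, 2.0], [10.0, 10.0, 10.0]])
pre_c = np.full((3, 3), 8.0)
post_c = np.array([[8.0, 2.0, 8.0], [8.0, 8.0, 1.0], [7.0, 8.0, 8.0]])
mask = np.zeros((3, 3), dtype=bool)
print(detect_building_damage(pre_h, post_h, pre_c, post_c, mask))
```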
Procedia PDF Downloads 213410 3D Interactions in Under Water Acoustic Simulations
Authors: Prabu Duplex
Abstract:
Due to stringent emission regulation targets, the large-scale transition to renewable energy sources is a global challenge, and wind power plays a significant role in the solution. This scenario has led to the construction of offshore wind farms, and several wind farms are planned in shallow waters where marine habitats exist. This raises concerns over the impact of underwater noise on marine species, for example from bridge construction in ocean straits. Environmental organisations warn that such bridges could be devastating to aquatic life, since ocean straits are important transit routes for marine mammals and host some of the highest concentrations of biodiversity in the world. The investigation of ship noise and piling noise that may occur during bridge construction and operation is therefore vital. Once the source levels are known, the receiver levels can be modelled. With this objective, this work investigates the key requirements of software that can model transmission loss at the high frequencies that may occur during the construction or operation phases. Most propagation models are 2D solutions, calculating the propagation loss along a transect, which does not include horizontal refraction, reflection or diffraction. In many cases, such models provide sufficient accuracy and can provide three-dimensional maps by combining, through interpolation, several two-dimensional (distance and depth) transects. However, in some instances the use of 2D models may not be sufficient to accurately model the sound propagation. A possible example is a scenario where an island or land mass is situated between the source and receiver. The 2D model will result in a shadow behind the land mass where the modelled transects intersect it, whereas in reality diffraction will bend the sound around the land mass. In such cases, it may be necessary to use a 3D model, which accounts for horizontal diffraction, to accurately represent the sound field. Other scenarios where 2D models may not provide sufficient accuracy are environments characterised by a strongly up-sloping or down-sloping seabed, such as propagation around continental shelves. In line with these objectives, by means of a case study, this work addresses the importance of 3D interactions in underwater acoustics. The methodology used in this study can also be applied to other 3D underwater sound propagation studies. This work assumes special significance given the increasing interest in using underwater acoustic modeling for environmental impact assessments. Future work also includes inter-model comparison in shallow water environments considering more physical processes known to influence sound propagation, such as scattering from the sea surface. Passive acoustic monitoring of the underwater soundscape with distributed hydrophone arrays is also suggested to investigate the 3D propagation effects discussed in this article. Keywords: underwater acoustics, naval, maritime, cetaceans
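As a baseline for comparison with the 3D models discussed above, a simple geometric-spreading transmission-loss estimate can be written as in the sketch below; the absorption coefficient and the choice of spreading law are illustrative assumptions, and the sketch deliberately ignores the horizontal refraction, reflection and diffraction effects that motivate 3D modelling.

```python
import math

def transmission_loss_db(range_m, alpha_db_per_km=0.05, spreading="spherical"):
    """Baseline geometric-spreading transmission loss with linear absorption.

    This is the simple transect-style estimate that 2D/3D propagation models
    refine; it ignores horizontal refraction, reflection and diffraction.
    The absorption coefficient is a placeholder value.
    """
    if spreading == "spherical":
        geometric = 20.0 * math.log10(range_m)     # 20 log10(r) spreading
    else:
        geometric = 10.0 * math.log10(range_m)     # 10 log10(r), cylindrical spreading
    absorption = alpha_db_per_km * (range_m / 1000.0)
    return geometric + absorption

print(round(transmission_loss_db(5000.0), 1))      # TL at 5 km, spherical spreading
```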
Procedia PDF Downloads 20409 Promoting 'One Health' Surveillance and Response Approach Implementation Capabilities against Emerging Threats and Epidemics Crisis Impact in African Countries
Authors: Ernest Tambo, Ghislaine Madjou, Jeanne Y. Ngogang, Shenglan Tang, Zhou XiaoNong
Abstract:
Implementing a national to community-based 'One Health' surveillance approach to mitigate human, animal and environmental consequences offers great opportunities and added value for sustainable development and wellbeing. Global partnerships, policy commitment and financial investment in the 'One Health' surveillance approach are much needed to address the evolving threats and epidemic crises in African countries. The paper provides insights into how China-Africa health development cooperation can promote the 'One Health' surveillance approach in response advocacy and mitigation. China-Africa health development initiatives provide new prospects for guiding and moving forward appropriate and evidence-based advocacy and mitigation management approaches and strategies for attaining Universal Health Coverage (UHC) and the Sustainable Development Goals (SDGs). Early, continuous, quality and timely surveillance data collection and coordinated information-sharing practices in malaria and other diseases are demonstrated in Comoros, Zanzibar, Ghana and Cameroon. Improvements in access to a variety of contextual sources and networks of data-sharing platforms are needed to guide evidence-based and tailored detection of and response to unusual hazardous events. Moreover, understanding threat and disease trends and frontline or point-of-care response delivery is crucial to promoting the integrated and sustainable implementation of targeted local and national 'One Health' surveillance and response approaches. Importantly, operational guidelines are vital for increasing coherent financing and national workforce capacity development mechanisms, and for strengthening participatory partnerships, collaboration and monitoring strategies to achieve global health agenda effectiveness in Africa. At the same time, they enhance the usefulness of surveillance data reporting and dissemination in informing policy decisions, health systems programming, financial mobilization and prioritized allocation before, during and after threat and epidemic crises, and in identifying programme strengths and weaknesses. Thus, capitalizing on 'One Health' surveillance and response approach advocacy and mitigation implementation is timely for consolidating the African Union Agenda 2063 and Africa's renaissance capabilities and expectations. Keywords: Africa, one health approach, surveillance, response
Procedia PDF Downloads 422408 Using Machine Learning to Extract Patient Data from Non-standardized Sports Medicine Physician Notes
Authors: Thomas Q. Pan, Anika Basu, Chamith S. Rajapakse
Abstract:
Machine learning requires data that is categorized into features that models train on. This topic is important to the field of sports medicine due to the many tools it provides to physicians, such as diagnosis support and risk assessment. Physician notes that healthcare professionals take are usually unclean and not suitable for model training. The objective of this study was to develop and evaluate an advanced approach for extracting key features from sports medicine data without the need for extensive model training or data labeling. An LLM (Large Language Model) was given a narrative (physician's notes) and prompted to extract four features (details about the patient). The narrative was found in a datasheet that contained six columns: Case Number, Validation Age, Validation Gender, Validation Diagnosis, Validation Body Part, and Narrative. The validation columns represent the accurate responses that the LLM attempts to output. Given the narrative, the LLM would output its response and extract the age, gender, diagnosis, and injured body part, with each category taking up one line. The output would then be cleaned, matched, and added to new columns containing the extracted responses. Five ways of checking the accuracy were used: unclear count, substring comparison, LLM comparison, LLM re-check, and hand evaluation. The unclear count essentially represented the extractions the LLM missed. This can also be understood as the recall score ([total - false negatives] over total). The rest of these correspond to the precision score ([total - false positives] over total). Substring comparison evaluated the likeness of the validation (X) and extracted (Y) columns by checking if X's results were a substring of Y's findings and vice versa. LLM comparison directly asked an LLM if X's and Y's results were similar. LLM re-check prompted the LLM to see if the extracted results could be found in the narrative. Lastly, a selection of 1,000 random narratives was hand-evaluated to give an estimate of how well the LLM-based feature extraction model performed. With a selection of 10,000 narratives, the LLM-based approach had a recall score of roughly 98%. However, the precision scores of the substring comparison and LLM comparison models were around 72% and 76%, respectively. The reason for these low figures is the minute differences between answers. For example, the 'chest' is part of the 'upper trunk'; however, these models cannot detect that. On the other hand, the LLM re-check and the subset of hand-tested narratives showed precision scores of 96% and 95%. If this subset is used to extrapolate the possible outcome of the whole 10,000 narratives, the LLM-based approach would be strong in both precision and recall. These results indicate that an LLM-based feature extraction model could be a useful way for medical data in sports to be collected and analyzed by machine learning models. Wide use of this method could potentially increase the availability of data, thus improving machine learning algorithms and supporting doctors with more enhanced tools.
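A minimal sketch of the substring-comparison check described above is given below; the column names follow the datasheet layout in the abstract, while the extraction function, its hard-coded output and the example row are hypothetical stand-ins for the actual LLM call.

```python
def extract_features(narrative: str) -> dict:
    """Placeholder for the LLM call; a real system would prompt a model to
    return age, gender, diagnosis and body part, one per line."""
    # Hypothetical parsed output for illustration only.
    return {"age": "16", "gender": "female", "diagnosis": "ACL tear", "body_part": "knee"}

def substring_match(validation: str, extracted: str) -> bool:
    """Substring comparison: either value being contained in the other counts."""
    v, e = validation.lower().strip(), extracted.lower().strip()
    return v in e or e in v

row = {"Validation Age": "16", "Validation Gender": "female",
       "Validation Diagnosis": "anterior cruciate ligament tear",
       "Validation Body Part": "knee",
       "Narrative": "16 y/o F presents after soccer injury ..."}

pred = extract_features(row["Narrative"])
pairs = [("Validation Age", "age"), ("Validation Gender", "gender"),
         ("Validation Diagnosis", "diagnosis"), ("Validation Body Part", "body_part")]
# Note: the diagnosis pair fails the substring test even though the answers
# agree clinically, mirroring the near-miss cases discussed in the abstract.
matches = sum(substring_match(row[v], pred[p]) for v, p in pairs)
print(f"matches on this row: {matches}/{len(pairs)}")
```
Procedia PDF Downloads 12407 The Role of Structural Poverty in the Know-How and Moral Economy of Doctors in Africa: An Anthropological Perspective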
Authors: Isabelle Gobatto
Abstract:
Based on an anthropological approach, this paper explores the medical profession and the construction of medical practices by considering the multiform articulations between structural poverty and the production of care in a low-resource francophone West African country, Burkina Faso. This country is considered exemplary of the culturally differentiated countries of the African continent that share the same situation of structural poverty. The objective is to expose the effects of structural poverty on the ways of constructing professional knowledge and of thinking about the meaning of the medical profession. While doctors are trained to have the same capacities in Southern and Western countries, namely to treat and save lives whatever the cultural context in which medicine is practiced, the ways they invest their role and deal with this context of action fracture the homogenization of the medical profession. In line with the anthropology of biomedicine, this paper outlines the complex effects of structural poverty on health care, care relations, and the moral economy of doctors. The materials analyzed are based on an ethnography covering two periods thirty years apart (1990-1994 and 2020-2021), drawing on long-term observations of care practices conducted in healthcare institutions and interviews coupled with the life histories of physicians. The findings reveal that the disabilities doctors face in delivering care are interpreted as policy gaps, but they are also considered by physicians as constitutive of the social and cultural characteristics of patients, shaping their capacities and incapacities to accompany caregivers in the production of care. These perceptions have effects on know-how, which is structured around the need to act even when diagnoses are not made, so as not to see patients desert health structures if the costs of care are too high for them. But these highly individualizing interpretations of the difficulties place part of the blame on patients for the difficulties in using learned knowledge and delivering effective care. These situations challenge the ethics of caregivers but also of ethnologists. Firstly, because the interpretations of disabilities prevent caregivers from considering the vulnerabilities of care as constituting a common condition shared with their patients in these health systems, affecting them identically although in different places in the production of care. Correlatively, these results underline that these professional conceptions prevent the emergence of a figure of the victim, which could be shared between patients and caregivers who, together, undergo working and care conditions at the limit of the acceptable. This dimension directly involves politics. Secondly, structural poverty and its effects on care challenge the ethics of the anthropologist who observes caregivers producing, without intent to harm, experiences of care marked by ordinary violence, by not giving patients the care they need. It is worth asking how anthropologists could get doctors to think in this light in West African societies. Keywords: Africa, care, ethics, poverty
Procedia PDF Downloads 69406 Threats to the Business Value: The Case of Mechanical Engineering Companies in the Czech Republic
Authors: Maria Reznakova, Michala Strnadova, Lukas Reznak
Abstract:
Successful achievement of strategic goals requires an effective performance management system, i.e., determining the appropriate indicators measuring the rate of goal achievement. Assuming that the goal of the owners is to grow the assets they have invested in, it is vital to identify the key performance indicators that contribute to value creation. These indicators are known as value drivers. Based on the undertaken literature search, a value driver is defined as any factor that affects the value of an enterprise. The important factors are then monitored by both financial and non-financial indicators. Financial performance indicators are most useful in strategic management, since they indicate whether a company's strategy implementation and execution are contributing to bottom-line improvement. Non-financial indicators are mainly used for short-term decisions. The identification of value drivers, however, is problematic for companies which are not publicly traded. Therefore, financial ratios continue to be used to measure the performance of companies, despite considerable criticism. The main drawback of such indicators is the fact that they are calculated based on accounting data, while accounting rules may differ considerably across different environments. For successful enterprise performance management, it is vital to avoid factors that may reduce (or even destroy) its value. Among the known factors reducing enterprise value are the lack of capital, the lack of a strategic management system and poor production quality. In order to gain further insight into the topic, the paper presents the results of research identifying factors that adversely affect the performance of mechanical engineering enterprises in the Czech Republic. The research methodology covers both the qualitative and the quantitative aspects of the topic. The qualitative data were obtained from a questionnaire survey of the enterprises' senior management, while the quantitative financial data were obtained from the Analysis Major Database for European Sources (AMADEUS). The questionnaire prompted managers to list factors which negatively affect the business performance of their enterprises. The range of potential factors was based on secondary research: an analysis of previously undertaken questionnaire surveys and of studies published in the scientific literature. The results of the survey were evaluated both in general, by average scores, and by detailed sub-analyses of additional criteria. These include company-specific characteristics, such as size and ownership structure. The evaluation also included a comparison of the managers' opinions and the performance of their enterprises, measured by the return on equity and return on assets ratios. The comparisons were tested by a series of non-parametric tests of statistical significance. The results of the analyses show that the factors most detrimental to enterprise performance include the incompetence of responsible employees and disregard for customers' requirements. Keywords: business value, financial ratios, performance measurement, value drivers
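The quantitative side of the comparison described above can be illustrated with a short sketch; the abstract names neither the specific non-parametric tests nor any figures, so the Mann-Whitney U test, the scipy dependency and all numbers below are assumptions for illustration.

```python
import numpy as np
from scipy import stats

# Hypothetical data: ROE (%) of enterprises whose managers did vs. did not
# report "incompetence of responsible employees" as a detrimental factor.
roe_reported = np.array([4.2, 6.1, 3.8, 7.5, 5.0, 2.9])
roe_not_reported = np.array([9.4, 11.2, 8.7, 12.5, 10.1, 7.9])

# Non-parametric test of the difference between the two groups; the paper only
# mentions "a series of non-parametric tests", so Mann-Whitney U is an example.
u_stat, p_value = stats.mannwhitneyu(roe_reported, roe_not_reported, alternative="two-sided")
print(f"U = {u_stat:.1f}, p = {p_value:.4f}")

# Basic performance ratios used in the study (illustrative figures).
net_income, equity, assets = 1.2e6, 14.0e6, 32.0e6
print(f"ROE = {net_income / equity:.2%}, ROA = {net_income / assets:.2%}")
```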
Procedia PDF Downloads 224405 Preparation of Metallic Nanoparticles with the Use of Reagents of Natural Origin
Authors: Anna Drabczyk, Sonia Kudlacik-Kramarczyk, Dagmara Malina, Bozena Tyliszczak, Agnieszka Sobczak-Kupiec
Abstract:
Nowadays, nano-sized materials are a very popular group of materials among scientists, and they find applications in a wide range of areas. A constantly increasing demand for nanomaterials, including metallic nanoparticles such as silver or gold ones, is therefore observed, and new routes for their preparation are sought. Considering the potential applications of nanoparticles, it is important to select an adequate preparation methodology because it determines their size and shape. Among the most commonly applied methods of nanoparticle preparation, chemical and electrochemical techniques are leading. However, growing attention is currently directed to the biological or biochemical aspects of the syntheses of metallic nanoparticles. This is associated with a trend of developing new routes for the preparation of given compounds according to the principles of green chemistry. These principles involve, e.g., the reduction of the use of toxic compounds in the synthesis as well as the reduction of the energy demand or the minimization of the generated waste. As a result, a growing popularity of such components as natural plant extracts, infusions or essential oils is observed. Such natural substances may be used both as a reducing agent for metal ions and as a stabilizing agent for the formed nanoparticles; therefore, they can replace the synthetic compounds previously used for the reduction of metal ions or for the stabilization of the obtained nanoparticle suspensions. Methods that proceed in the presence of the previously mentioned natural compounds are environmentally friendly and proceed without the application of any toxic reagents. Methodology: The presented research involves the preparation of silver nanoparticles using selected plant extracts, e.g., artichoke extract. Extracts of natural origin were used as reducing and stabilizing agents at the same time. Furthermore, the syntheses were carried out in the presence of an additional polymeric stabilizing agent. Next, features of the obtained nanoparticle suspensions, such as total antioxidant activity and the content of phenolic compounds, were characterized. The first of the mentioned studies involved the reaction with the DPPH (2,2-diphenyl-1-picrylhydrazyl) radical. The content of phenolic compounds was determined using the Folin-Ciocalteu technique. Furthermore, an essential issue was also determining the stability of the formed nanoparticle suspensions. Conclusions: The research demonstrated that metallic nanoparticles may be obtained using plant extracts or infusions as stabilizing or reducing agents. The methodology applied, i.e., the type of plant extract used during the synthesis, had an impact on the content of phenolic compounds as well as on the size and polydispersity of the obtained nanoparticles. What is more, it is possible to prepare nano-sized particles characterized by properties desirable from the viewpoint of their potential application, and such an effect may be achieved with the use of non-toxic reagents of natural origin. Furthermore, the proposed methodology stays in line with the principles of green chemistry. Keywords: green chemistry principles, metallic nanoparticles, plant extracts, stabilization of nanoparticles
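The two characterization measurements mentioned above are commonly evaluated with the following standard calculations; the absorbance readings and the calibration-line constants in the sketch are purely illustrative, not the study's data.

```python
def dpph_inhibition_pct(a_control: float, a_sample: float) -> float:
    """Radical scavenging activity (%) from DPPH absorbance readings,
    typically measured at 517 nm: (A_control - A_sample) / A_control * 100.
    This is the standard DPPH calculation; the values used below are
    purely illustrative."""
    return (a_control - a_sample) / a_control * 100.0

def total_phenolics_gae(abs_sample: float, slope: float, intercept: float,
                        dilution: float = 1.0) -> float:
    """Folin-Ciocalteu result read off a hypothetical gallic-acid calibration
    line (absorbance = slope * concentration + intercept), in mg GAE/L."""
    return (abs_sample - intercept) / slope * dilution

print(round(dpph_inhibition_pct(0.842, 0.315), 1))        # % inhibition
print(round(total_phenolics_gae(0.56, 0.0011, 0.02), 1))  # mg GAE/L
```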
Procedia PDF Downloads 125404 Hand Motion Tracking as a Human Computer Interation for People with Cerebral Palsy
Authors: Ana Teixeira, Joao Orvalho
Abstract:
This paper describes experiments using Scratch games, carried out by students of the Master in Human Computer Interaction (HCI) of IPC Coimbra, to check the feasibility of employing the gestures of users with cerebral palsy as an alternative way of interacting with a computer. The main focus of this work is to study the usability of a web camera as a motion tracking device to achieve virtual human-computer interaction for individuals with CP. An approach to human-computer interaction (HCI) is presented, where individuals with cerebral palsy react and interact with a Scratch game through the use of a webcam as an external interaction device. Motion tracking interaction is an emerging technology that is becoming more useful, effective and affordable. However, it raises new questions from the HCI viewpoint, for example, which environments are most suitable for interaction by users with disabilities. In our case, we put emphasis on the accessibility and usability aspects of such interaction devices to meet the special needs of people with disabilities, and specifically people with CP. Despite the fact that our work has just started, preliminary results show that, in general, computer vision interaction systems are very useful; in some cases, these systems are the only way by which some people can interact with a computer. The purpose of the experiments was to verify two hypotheses: 1) people with cerebral palsy can interact with a computer using their natural gestures, 2) Scratch games can be a research tool in experiments with disabled young people. A game in Scratch with three levels was created to be played through the use of a webcam. This device permits the detection of certain key points of the user's body, which allows the head, arms and especially the hands to be assumed as the most important aspects of recognition. Tests with 5 individuals of different ages and genders were carried out over 3 days in 30-minute sessions with each participant. For a more extensive and reliable statistical analysis, the number of both participants and repetitions should be increased in further investigations. However, already at this stage of research, it is possible to draw some conclusions. The first, and most important, is that simple Scratch games on the computer can be a research tool for investigating the interaction with a computer performed by young persons with CP using intentional gestures. Measurements performed with the assistance of games are attractive for young disabled users. The second important conclusion is that they are able to play Scratch games using their gestures. Therefore, the proposed interaction method is promising for them as a human-computer interface. In the future, we plan to develop multimodal interfaces that combine various computer vision devices with other input devices, improve the existing systems to better accommodate the special needs of individuals and, in addition, perform experiments on a larger number of participants. Keywords: motion tracking, cerebral palsy, rehabilitation, HCI
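A generic sketch of webcam-based motion detection by frame differencing is shown below to illustrate the kind of computer-vision input such a system relies on; it is not the Scratch-integrated tracker described in the paper, and the threshold and minimum contour area are arbitrary placeholder values.

```python
import cv2

# Minimal webcam motion-detection loop using frame differencing (OpenCV 4.x).
cap = cv2.VideoCapture(0)
prev_gray = None

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    gray = cv2.GaussianBlur(gray, (21, 21), 0)
    if prev_gray is not None:
        diff = cv2.absdiff(prev_gray, gray)                    # pixel-wise change
        _, thresh = cv2.threshold(diff, 25, 255, cv2.THRESH_BINARY)
        contours, _ = cv2.findContours(thresh, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
        for c in contours:
            if cv2.contourArea(c) > 1500:                      # ignore small noise
                x, y, w, h = cv2.boundingRect(c)
                cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
    prev_gray = gray
    cv2.imshow("motion", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()
```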
Procedia PDF Downloads 235403 A Grid Synchronization Method Based On Adaptive Notch Filter for SPV System with Modified MPPT
Authors: Priyanka Chaudhary, M. Rizwan
Abstract:
This paper presents a grid synchronization technique based on an adaptive notch filter for an SPV (Solar Photovoltaic) system along with an MPPT (Maximum Power Point Tracking) technique. An efficient grid synchronization technique offers proficient detection of the various components of the grid signal, such as phase and frequency. It also acts as a barrier against harmonics and other disturbances in the grid signal. A reference phase signal synchronized with the grid voltage is provided by the grid synchronization technique to keep the system compliant with grid codes and power quality standards. Hence, the grid synchronization unit plays an important role in grid-connected SPV systems. As the output of the PV array fluctuates with meteorological parameters such as irradiance, temperature and wind, MPPT control is required to track the maximum power point of the PV array in order to maintain a constant DC voltage at the VSC (Voltage Source Converter) input. In this work, a variable step size P&O (Perturb and Observe) MPPT technique with a DC/DC boost converter has been used at the first stage of the system. This algorithm divides the dPpv/dVpv curve of the PV panel into three separate zones, i.e., zone 0, zone 1 and zone 2. A fine tracking step size is used in zone 0, while zone 1 and zone 2 require a large step size in order to obtain a high tracking speed. Further, an adaptive notch filter based control technique is proposed for the VSC in the PV generation system. The adaptive notch filter (ANF) approach is used to synchronize the interfaced PV system with the grid to maintain the amplitude, phase and frequency parameters as well as to improve power quality. This technique offers the compensation of harmonic currents and reactive power with both linear and nonlinear loads. To maintain a constant DC link voltage, a PI controller is also implemented and presented in this paper. The complete system has been designed, developed and simulated using the SimPowerSystems and Simulink toolboxes of MATLAB. The performance analysis of the three-phase grid-connected solar photovoltaic system has been carried out on the basis of various parameters such as PV output power, PV voltage, PV current, DC link voltage, PCC (Point of Common Coupling) voltage, grid voltage, grid current, voltage source converter current, power supplied by the voltage source converter, etc. The results obtained from the proposed system are found to be satisfactory. Keywords: solar photovoltaic systems, MPPT, voltage source converter, grid synchronization technique
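One iteration of a variable step-size P&O routine in the spirit of the three-zone scheme described above might look like the following sketch; the zone boundaries on |dPpv/dVpv| and the step sizes are assumptions for illustration, not the paper's tuning.

```python
def variable_step_po(v, i, v_prev, p_prev, v_ref,
                     steps=(0.2, 1.0, 2.0), zone_limits=(5.0, 20.0)):
    """One iteration of a variable step-size Perturb & Observe MPPT.

    |dP/dV| selects the zone: zone 0 (|dP/dV| < zone_limits[0], near the MPP)
    uses the fine step, zones 1 and 2 use progressively larger steps. All
    numeric values are assumptions for this sketch, not the paper's tuning.
    """
    p = v * i
    dp, dv = p - p_prev, v - v_prev
    slope = abs(dp / dv) if abs(dv) > 1e-6 else 0.0
    if slope < zone_limits[0]:
        step = steps[0]          # zone 0: fine tracking around the MPP
    elif slope < zone_limits[1]:
        step = steps[1]          # zone 1: medium step
    else:
        step = steps[2]          # zone 2: large step for fast tracking
    # Classic P&O decision: keep perturbing in the direction that raised power.
    v_ref = v_ref + step if dp * dv > 0 else v_ref - step
    return v_ref, p

# Example of one control step with illustrative panel measurements.
v_ref, p = variable_step_po(v=30.5, i=7.9, v_prev=30.0, p_prev=236.0, v_ref=30.5)
print(round(v_ref, 2), round(p, 1))
```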
Procedia PDF Downloads 594