Search results for: feature selection
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 3496

916 A Systematic Review on Measuring the Physical Activity Level and Pattern in Persons with Chronic Fatigue Syndrome

Authors: Kuni Vergauwen, Ivan P. J. Huijnen, Astrid Depuydt, Jasmine Van Regenmortel, Mira Meeus

Abstract:

A lower activity level and imbalanced activity pattern are frequently observed in persons with chronic fatigue syndrome (CFS) / myalgic encephalomyelitis (ME) due to debilitating fatigue and post-exertional malaise (PEM). Identification of measurement instruments to evaluate the activity level and pattern is therefore important. The objective is to identify measurement instruments suited to evaluate the activity level and/or pattern in patients with CFS/ME and review their psychometric properties. A systematic literature search was performed in the electronic databases PubMed and Web of Science until 12 October 2016. Articles including relevant measurement instruments were identified and included for further analysis. The psychometric properties of relevant measurement instruments were extracted from the included articles and rated based on the COnsensus-based Standards for the selection of health Measurement INstruments (COSMIN) checklist. The review was performed and reported according to the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) statement. A total of 49 articles and 15 unique measurement instruments were found, but only three instruments were evaluated in patients with CFS/ME: the Chronic Fatigue Syndrome-Activity Questionnaire (CFS-AQ), Activity Pattern Interview (API) and International Physical Activity Questionnaire-Short Form (IPAQ-SF), three self-report instruments measuring the physical activity level. The IPAQ-SF, CFS-AQ and API are all equally capable of evaluating the physical activity level, but none of the three measurement instruments are optimal to use. No studies about the psychometric properties of activity monitors in patients with CFS/ME were found, although they are often used as the gold standard to measure the physical activity pattern. More research is needed to evaluate the psychometric properties of existing instruments, including the use of activity monitors.

Keywords: chronic fatigue syndrome, data collection, physical activity, psychometrics

Procedia PDF Downloads 211
915 Pharmacogenetics of P2Y12 Receptor Inhibitors

Authors: Ragy Raafat Gaber Attaalla

Abstract:

For cardiovascular illness, oral P2Y12 inhibitors, including clopidogrel, prasugrel, and ticagrelor, are frequently recommended. Each of these medications has advantages and disadvantages. In the absence of genotyping, the stronger platelet aggregation inhibitors prasugrel and ticagrelor have been demonstrated to be superior to clopidogrel at preventing major adverse cardiovascular events following an acute coronary syndrome and percutaneous coronary intervention (PCI). Both, nevertheless, come with a higher risk of non-coronary artery bypass-related bleeding. As a prodrug, clopidogrel needs to be bioactivated, principally by the CYP2C19 enzyme. A CYP2C19 no-function allele, with diminished or absent CYP2C19 enzyme activity, is present in about 30% of people. The reduced exposure to the active metabolite of clopidogrel, and the consequently reduced inhibition of platelet aggregation, among clopidogrel-treated carriers of a CYP2C19 no-function allele likely contributed to the reduced efficacy of clopidogrel in clinical trials. The pharmacogenetic evidence for clopidogrel is strongest for its use in conjunction with PCI, but evidence for other indications is growing. CYP2C19 genotype-guided antiplatelet medication following PCI is one of the most common examples of clinical pharmacogenetic application. Guidance is available from expert consensus groups and regulatory bodies to assist with incorporating genetic information into P2Y12 inhibitor prescribing decisions. Here, we examine the data supporting the effects of genotype-guided P2Y12 inhibitor selection on clopidogrel response and outcomes, and discuss tips for pharmacogenetic implementation. We also discuss procedures for using genotype data to choose P2Y12 inhibitor therapies, as well as unmet research needs. Finally, choosing a P2Y12 inhibitor that optimally balances the atherothrombotic and bleeding risks may be influenced by both clinical and genetic factors.

Keywords: inhibitors, cardiovascular events, coronary intervention, pharmacogenetic implementation

Procedia PDF Downloads 92
914 Comparison of Different Hydrograph Routing Techniques in XPSTORM Modelling Software: A Case Study

Authors: Fatema Akram, Mohammad Golam Rasul, Mohammad Masud Kamal Khan, Md. Sharif Imam Ibne Amir

Abstract:

A variety of routing techniques are available to develop surface runoff hydrographs from rainfall. The selection of a runoff routing method is vital, as it is directly related to the type of watershed and the required degree of accuracy. Several modelling software packages are available to explore the rainfall-runoff process in urban areas. XPSTORM, a link-node based, integrated storm-water modelling package, has been used in this study to develop the surface runoff hydrograph for a golf course area located in Rockhampton in Central Queensland, Australia. Four commonly used methods, namely SWMM runoff, Kinematic wave, Laurenson, and Time-Area, are employed to generate runoff hydrographs for the design storm of this study area. In the runoff mode of XPSTORM, the rainfall, infiltration, evaporation and depression storage for sub-catchments were simulated, and the runoff from each sub-catchment to its collection node was calculated. The simulation results are presented, discussed and compared. The total surface runoff generated by the SWMM runoff, Kinematic wave and Time-Area methods is found to be reasonably close, which indicates that any of these methods can be used for developing the runoff hydrograph of the study area. The Laurenson method produces a comparatively smaller amount of surface runoff; however, it creates the highest peak of surface runoff of all the methods, which may make it suitable for hilly regions. Although the Laurenson hydrograph technique is a widely accepted surface runoff routing technique in Queensland (Australia), extensive investigation with detailed topographic and hydrologic data is recommended in order to assess its suitability for use in the case study area.
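The Time-Area method compared above can be sketched as a discrete convolution of rainfall excess with a time-area histogram. The following minimal illustration uses made-up numbers and is not the XPSTORM implementation:

```python
def time_area_hydrograph(rainfall_excess_mm, area_fractions, total_area_m2):
    """Convolve rainfall excess with a time-area histogram (Time-Area method).

    rainfall_excess_mm: rainfall excess per time step (mm)
    area_fractions: fraction of the catchment contributing at each lag step
    total_area_m2: catchment area (m^2)
    Returns runoff volume per time step (m^3).
    """
    n_steps = len(rainfall_excess_mm) + len(area_fractions) - 1
    runoff = [0.0] * n_steps
    for i, p in enumerate(rainfall_excess_mm):
        for j, a in enumerate(area_fractions):
            # depth (mm -> m) falling on the sub-area that drains in j steps
            runoff[i + j] += (p / 1000.0) * a * total_area_m2
    return runoff

# Hypothetical storm: 10 mm then 5 mm of excess over a 1 km^2 catchment
hydrograph = time_area_hydrograph([10.0, 5.0], [0.5, 0.3, 0.2], 1_000_000.0)
```

By construction the total runoff volume equals the total rainfall excess depth times the catchment area, a useful mass-balance check for any routing method.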

Keywords: ARI, design storm, IFD, rainfall temporal pattern, routing techniques, surface runoff, XPSTORM

Procedia PDF Downloads 438
913 An Assessment on the Impact of Community Policing in Crime Prevention and Control in Fagge Local Government Area, Kano State, Nigeria

Authors: Aliyu Shitu Said

Abstract:

One of the major setbacks of every society is the proliferation of crime, which induces fear, destroys property and costs lives. The rising incidence of crime and general insecurity in society, and the inability of the policing agencies to curtail the menace, necessitated the introduction of community policing in order to address the problem of crime through a collaborative effort with community members. Thus, this study assessed the impact of community policing on crime prevention and control in Fagge Local Government Area, Kano State, Nigeria. The study also examined the elements, roles, and challenges of community policing in crime prevention and control in the study area. The study adopted Broken Windows and Routine Activity theories as its frame of analysis. Mixed methods of data collection (quantitative and qualitative) were utilized. Multi-stage and purposive sampling techniques were adopted in selecting the study population. A total of 308 respondents were sampled: 300 members of the public, sampled through multi-stage sampling for questionnaire administration, and 8 other respondents, purposively sampled for in-depth interviews. Findings of the study revealed that community policing has a significant impact on crime prevention and control in the study area. Findings further revealed that the elements and roles of community policing are effective and fully utilized, and that there is a cordial relationship between the police and the community members in the study area. This study therefore recommends that government should provide adequate support to community policing programmes and raise public awareness, so as to boost the morale of the community in collaborating with the police in crime prevention and control.

Keywords: community, policing, crime, prevention, control

Procedia PDF Downloads 46
912 Theoretical Evaluation of Minimum Superheat, Energy and Exergy in a High-Temperature Heat Pump System Operating with Low GWP Refrigerants

Authors: Adam Y. Sulaiman, Donal F. Cotter, Ming J. Huang, Neil J. Hewitt

Abstract:

Suitable low global warming potential (GWP) refrigerants that conform to F-gas regulations are required to extend the operational envelope of high-temperature heat pumps (HTHPs) used for industrial waste heat recovery processes. The thermophysical properties and characteristics of these working fluids need to be assessed to provide a comprehensive understanding of their operational effectiveness in HTHP applications. This paper presents the results of a theoretical simulation investigating a range of low-GWP refrigerants and their suitability to supersede refrigerants HFC-245fa and HFC-365mfc. A steady-state thermodynamic model of a single-stage HTHP with an internal heat exchanger (IHX) was developed to assess system cycle characteristics for heat source temperatures between 50 and 80 °C and heat sink temperatures between 90 and 150 °C. A practical approach to maximizing operational efficiency was examined to determine the effects of regulating minimum superheat within the process and its subsequent influence on energetic and exergetic efficiencies. A comprehensive map of minimum superheat across the HTHP operating variables was used to assess specific tipping points in performance at 30 and 70 K temperature lifts. Based on initial results, the refrigerants HCFO-1233zd(E) and HFO-1336mzz(Z) were found to be close matches for refrigerants HFC-245fa and HFC-365mfc. The overall results show that effective performance occurs for HCFO-1233zd(E) between 5-7 K minimum superheat and for HFO-1336mzz(Z) between 18-21 K, dependent on temperature lift. This work provides a method to optimize refrigerant selection based on operational indicators to maximize overall HTHP system performance.
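The significance of the 30 and 70 K temperature lifts studied above can be seen from the ideal (Carnot) heating COP, which bounds the efficiency of any real cycle the model evaluates. A minimal sketch with illustrative temperatures (not values from the study):

```python
def carnot_cop_heating(t_source_c, t_sink_c):
    """Ideal (Carnot) heating COP of a heat pump between two temperatures (degC)."""
    t_source_k = t_source_c + 273.15
    t_sink_k = t_sink_c + 273.15
    return t_sink_k / (t_sink_k - t_source_k)

# Hypothetical 80 degC waste-heat source delivering to two sink temperatures
cop_low_lift = carnot_cop_heating(80.0, 110.0)   # 30 K lift
cop_high_lift = carnot_cop_heating(80.0, 150.0)  # 70 K lift
```

The sharp drop in the ideal COP with increasing lift is why the paper assesses performance tipping points at both lifts separately.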

Keywords: high-temperature heat pump, minimum superheat, energy & exergy efficiency, low GWP refrigerants

Procedia PDF Downloads 139
911 Examining the Relationship between Concussion and Neurodegenerative Disorders: A Review on Amyotrophic Lateral Sclerosis and Alzheimer’s Disease

Authors: Edward Poluyi, Eghosa Morgan, Charles Poluyi, Chibuikem Ikwuegbuenyi, Grace Imaguezegie

Abstract:

Background: Current epidemiological studies have examined the associations between moderate and severe traumatic brain injury (TBI) and the risk of developing neurodegenerative diseases. Concussion, also known as mild TBI (mTBI), is however quite distinct from moderate or severe TBI. Only a few studies in this burgeoning area have examined concussion, especially repetitive episodes, and neurodegenerative diseases. Thus, no definite relationship has been established between them. Objectives: This review discusses the available literature linking concussion to amyotrophic lateral sclerosis (ALS) and Alzheimer's disease (AD). Materials and Methods: Given the complexity of this subject, a realist review methodology was selected, which includes clarifying the scope and developing a theoretical framework, developing a search strategy, selection and appraisal, data extraction, and synthesis. A detailed literature matrix was set out in order to gather relevant and recent findings on this topic. Results: Presently, there is no objective clinical test for the diagnosis of concussion, because its features are less obvious on physical examination. The absence of an objective diagnostic test sometimes leads to skepticism when confirming the presence or absence of concussion. Intriguingly, several possible explanations have been proposed for the pathological mechanisms linking concussion to the development of some neurodegenerative disorders (such as ALS and AD), but the two major events are deposition of tau proteins (abnormal microtubule proteins) and neuroinflammation, which ranges from glutamate excitotoxicity pathways and inflammatory pathways (which raise the metabolic demands of microglial cells and neurons) to effects on mitochondrial function via oxidative pathways.

Keywords: amyotrophic lateral sclerosis, Alzheimer's disease, mild traumatic brain injury, neurodegeneration

Procedia PDF Downloads 73
910 The Regional Novel in India: Its Emergence and Trajectory

Authors: Aruna Bommareddi

Abstract:

The journey of the novel is well examined in Indian academia as an offshoot of the novel in English. There have been many attempts to understand aspects of the early novel in India which shared a commonality with the English novel. The regional novel has had an entirely different trajectory, which is mapped in this paper. The main focus of the paper is the historical emergence of the genre of the regional novel in Indian literatures, with specific reference to Kannada, Hindi, and Bengali. The selection of these languages is guided not only by familiarity with them but also by the significance these languages enjoy in the subcontinent and by the emergence of the regional novel as a specific category in them. The regional novels under study are Phaneeswarnath Renu's Maila Anchal, Tarashankar Bandopadhyaya's Ganadevata, and Kuvempu's House of Kanuru, chosen for exploring the themes of the genre's emergence and the aspects of the regional novel they hold in common and apart. The paper explores the various movements that have shaped the genre of the regional novel in these literatures. Though Phaneeswarnath Renu's Maila Anchal was published in 1956, the novel is set in pre-Independence India and therefore shares a commonality of themes with the other two novels, House of Kanuru and Ganadevata. All three novels explore themes of superstition, ignorance, poverty, and the interventions of educated youth to salvage the crises in these backward regional worlds. In fact, it was Renu who assertively declared that he was going to write a regional novel, and hence the title of the first regional novel in Hindi is Maila Anchal, meaning the soiled border. In Hindi, anchal also means region; the title is therefore suggestive of a dirty region as well. The novel exposes the squalor, ignorance, and conflict-ridden life of the village, or region, as opposed to the rosy image of the village in literature. With this, all such novels which depicted conflicts of the region came to be recognized as regional novels, even though they may have been written prior to Renu's declaration. All three novels under study succeed in bringing out the complexity of rural life at a given point in its history.

Keywords: bengali, hindi, kannada, regional novel, telugu

Procedia PDF Downloads 63
909 Constructing Digital Memory for Chinese Ancient Village: A Case on Village of Gaoqian

Authors: Linqing Ma, Huiling Feng, Jihong Liang, Yi Qian

Abstract:

In China, some villages have survived the long history of changes and remain until today with their unique styles and featured culture developed in the past. Those ancient villages, usually aged for hundreds or thousands of years, are a mirror for traditional Chinese culture, especially the farming-studying culture represented by Confucianism. Gaoqian, an ancient village with a population of 3,000 in Zhejiang province, is such a case. With a history dating back to the Yuan Dynasty, Gaoqian Village has 13 well-preserved traditional Chinese courtyard houses, built in the Ming and Qing Dynasties. It is a fine specimen for studying traditional rural China. This project will first collect multimedia resources documenting the Village; a repository for the memory of the Village will then be completed through the arrangement and description of those multimedia resources, such as texts, photos, videos and so on. The production of creative products with digital technologies is also possible, based on a thorough understanding of the cultural features of Gaoqian Village, using research tools for literature and history studies and a method of comparative study. Finally, the project will construct an exhibition platform for the Village and its culture, telling its stories with complete structures and threads.

Keywords: ancient villages, digital exhibition, multimedia, traditional culture

Procedia PDF Downloads 562
908 Life Cycle Analysis of the Antibacterial Gel Product Using ISO 14040 and ReCiPe 2016 Method

Authors: Pablo Andres Flores Siguenza, Noe Rodrigo Guaman Guachichullca

Abstract:

Sustainable practices have received increasing attention from academics and companies in recent decades due to, among many factors, the market advantages they generate, global commitments, and policies aimed at reducing greenhouse gas emissions, addressing resource scarcity, and rethinking waste management. The search for ways to promote sustainability leads industries to abandon classical methods and resort to innovative strategies, which in turn are based on quantitative analysis methods and tools such as life cycle analysis (LCA). LCA is the basis for sustainable production and consumption, since it is a method that analyzes objectively, methodically, systematically, and scientifically the environmental impact caused by a process or product during its entire life cycle. The objective of this study is to develop an LCA of an antibacterial gel product throughout its entire supply chain (SC) under the ISO 14044 methodology, with the help of GaBi software and the ReCiPe 2016 method. The case study product was selected based on its relevance in the current context of the COVID-19 pandemic and the exponential increase in its production. For the development of the LCA, data from a Mexican company are used, and three scenarios are defined to obtain the midpoint and endpoint environmental impacts, both by phase and globally. Among the results, the most significant environmental impact categories are climate change, fossil fuel depletion, and terrestrial ecotoxicity, and the stage that generates the most pollution across the entire SC is the extraction of raw materials. The study serves as a basis for the development of different sustainability strategies, demonstrates the usefulness of an LCA, and agrees with different authors on the role and importance of this methodology in sustainable development.
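At its core, a midpoint impact score of the kind reported above is an inventory-weighted sum of characterization factors. A minimal sketch with a made-up inventory and illustrative GWP-style factors (the numbers are for demonstration only, not values from ReCiPe 2016 or the study):

```python
# Hypothetical life cycle inventory: kg of elementary flow per functional unit
inventory = {"CO2": 1.20, "CH4": 0.005, "N2O": 0.0002}

# Illustrative midpoint characterization factors (kg CO2-eq per kg of flow)
gwp_factors = {"CO2": 1.0, "CH4": 28.0, "N2O": 265.0}

def midpoint_score(inventory, factors):
    """Midpoint impact = sum over flows of (amount x characterization factor)."""
    return sum(amount * factors[flow] for flow, amount in inventory.items())

climate_change_score = midpoint_score(inventory, gwp_factors)  # kg CO2-eq
```

Repeating this weighted sum per life cycle phase is what lets a study like this one attribute the largest share of an impact category to, say, raw material extraction.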

Keywords: sustainability, sustainable development, life cycle analysis, environmental impact, antibacterial gel

Procedia PDF Downloads 23
907 Analysis of Citation Rate and Data Reuse for Openly Accessible Biodiversity Datasets on Global Biodiversity Information Facility

Authors: Nushrat Khan, Mike Thelwall, Kayvan Kousha

Abstract:

Making research data openly accessible has been mandated by most funders over the last 5 years, as it promotes reproducibility in science and reduces duplication of effort to collect the same data. There is evidence that articles that publicly share research data have higher citation rates in the biological and social sciences. However, how and whether shared data is being reused is not always apparent, as such information is not easily accessible from the majority of research data repositories. This study aims to understand the practice of data citation and how data is being reused over the years, focusing on biodiversity, since research data is frequently reused in this field. Metadata of 38,878 datasets, including citation counts, were collected through the Global Biodiversity Information Facility (GBIF) API for this purpose. GBIF was used as a data source since it provides citation counts for datasets, a feature not commonly available in most repositories. Analysis of dataset types, citation counts, and creation and update times of datasets suggests that citation rates vary for different types of datasets: occurrence datasets, which contain more granular information, have higher citation rates than checklist and metadata-only datasets. Another finding is that biodiversity datasets on GBIF are frequently updated, which is unique to this field. The majority of datasets from the earliest year, 2007, were updated after 11 years, with no dataset left unchanged since creation. For each year between 2007 and 2017, we compared the correlations between update time and citation rate for four different types of datasets. While recent datasets do not show any correlation, 3- to 4-year-old datasets show a weak correlation, in which datasets that were updated more recently received higher citations. The results suggest that it takes several years to accumulate citations for research datasets. However, this investigation found that when the same datasets are searched on Google Scholar or Scopus, the number of citations is often not the same as on GBIF. Hence, a future aim is to further explore the citation count system adopted by GBIF, to evaluate its reliability and whether it can be applied to other fields of study as well.
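The update-time/citation correlation described above can be computed with a plain Pearson coefficient once the dataset metadata has been retrieved. The sketch below uses hypothetical numbers, not GBIF data, and a stdlib-only implementation:

```python
from math import sqrt

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical sample: months since a dataset's last update vs. citation count
months_since_update = [2, 6, 12, 24, 36]
citations = [15, 12, 9, 5, 3]
r = pearson(months_since_update, citations)  # negative: fresher -> more cited
```

A strongly negative r on such a sample would match the paper's observation that more recently updated datasets tend to receive more citations.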

Keywords: data citation, data reuse, research data sharing, webometrics

Procedia PDF Downloads 158
906 Intermediate-Term Impact of Taiwan High-Speed Rail (HSR) and Land Use on Spatial Patterns of HSR Travel

Authors: Tsai Yu-hsin, Chung Yi-Hsin

Abstract:

The employment of an HSR system, by elevating inter-city and inter-region accessibility, is likely to promote spatial interaction between places in the HSR and extended territory. Inter-city and inter-region travel via HSR could be affected by, among other things, the land use, transportation, and location of the HSR station at both the trip origin and destination ends. However, relatively few insights have been shed on these impacts and on the spatial patterns of HSR travel. The research purposes of this study, as phase one of a series of HSR-related research, are threefold: to analyze the general spatial patterns of HSR trips, such as the spatial distribution of trip origins and destinations; to analyze whether specific land use, transportation characteristics, and trip characteristics affect HSR trips in terms of the use of HSR and the distribution of trip origins and destinations; and to analyze the socio-economic characteristics of HSR travelers. With the Taiwan HSR starting operation in 2007, this study emphasizes the intermediate-term impact of HSR, which is made possible with the population and housing census, the industry and commercial census data, and a station area intercept survey conducted in the summer of 2014. The analysis will be conducted at the city, inter-city, and inter-region spatial levels, as necessary and required. The analysis tools include descriptive statistics and multivariate analysis, with the assistance of SPSS, HLM and ArcGIS. The findings, on the one hand, can provide policy implications for associated land use and transportation plans and for the site selection of HSR stations. On the other hand, the findings on travel are expected to provide insights that can help explain how land use and real estate values could be affected by HSR in the following phases of this series of research.

Keywords: high speed rail, land use, travel, spatial pattern

Procedia PDF Downloads 439
905 Exploring the Growth Path under Coupling Relationship between Space and Economy of Mountain Village and Townlets: Case Study of Southwest China

Authors: Runlin Liu, Shilong Li

Abstract:

China is a mountainous country, with two-thirds of its territory covered by plateaus, hills, and mountains, and nearly half of its cities and towns distributed in mountainous areas. Compared with the environmental constraints on the development path of cities and towns in the plains, mountain villages and townlets are heterogeneous in aspects such as spatial characteristics, growth mode, and ecological protection, and their development path reflects a bidirectional relationship between mountain space and economic growth. Based on classical growth theory, this article explores the two-dimensional coupling relation between space and economy in mountain villages and townlets against the background of rural rejuvenation. GIS technology is adopted in the study to analyze the spatial trends and patterns and the economic spatial differentiation characteristics of villages and townlets. This powerful tool can also help differentiate and analyze limiting factors and assessment systems in the economic growth of villages and townlets in the spatial dimension of mountainous space. To make the research more specific, this article selects mountain villages and townlets in Southwest China as the object of study; this provides good cases for analyzing the parallel coupling mechanism of the duality structure system between economic growth and spatial expansion and for discussing the path selection of spatial-economic growth of mountain villages and towns under multiple constraints. The research results can provide quantitative references for the spatial and economic development paths of mountain villages and towns, which is helpful in realizing an efficient and high-quality development mode with equal emphasis on spatial and economic benefits for these types of towns.

Keywords: coupling mechanism, geographic information technology, mountainous town, synergetic development, spatial economy

Procedia PDF Downloads 130
904 An Advanced Automated Brain Tumor Diagnostics Approach

Authors: Berkan Ural, Arif Eser, Sinan Apaydin

Abstract:

Medical image processing has become a challenging task nowadays. Indeed, the processing of brain MRI images is one of the most difficult parts of this area. This study proposes a well-defined hybrid approach consisting of tumor detection, extraction and analysis steps. The approach is built mainly around a computer-aided diagnostics system for identifying and detecting tumor formation in any region of the brain, and such a system is commonly used for early prediction of brain tumors using advanced image processing and probabilistic neural network methods, respectively. For this approach, some advanced noise removal functions and image processing methods, such as automatic segmentation and morphological operations, are used to detect the brain tumor boundaries and to obtain the important feature parameters of the tumor region. All stages of the approach are implemented in MATLAB. First, the tumor is detected, and the tumor area is contoured with a colored circle by the computer-aided diagnostics program. Then, the tumor is segmented, and some morphological processes are applied to increase the visibility of the tumor area. While this process continues, the tumor area and important shape-based features are also calculated. Finally, using the probabilistic neural network method and some advanced classification steps, the tumor area and the type of the tumor are clearly obtained. A future aim of this study is to detect the severity of lesions through classes of brain tumor, achieved through advanced multi-classification and neural network stages, and to create a user-friendly environment using a GUI in MATLAB. In the experimental part of the study, 100 images are used to train the diagnostics system, and 100 out-of-sample images are used to test and check the whole results. The preliminary results demonstrate high classification accuracy for the neural network structure. Finally, these results also motivate us to extend this framework to detect and localize tumors in other organs.
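The morphological post-processing step described above can be illustrated with a tiny pure-Python binary dilation (the study itself uses MATLAB's image processing routines; this is only a sketch of the underlying idea):

```python
def dilate(mask, iterations=1):
    """Binary dilation with a 3x3 cross structuring element.

    mask: list of lists of 0/1 values; returns a new, dilated mask.
    Dilation thickens a thresholded region, as in the morphological
    step used to increase the visibility of a segmented tumor area.
    """
    rows, cols = len(mask), len(mask[0])
    for _ in range(iterations):
        out = [[mask[r][c] for c in range(cols)] for r in range(rows)]
        for r in range(rows):
            for c in range(cols):
                if mask[r][c]:
                    for dr, dc in ((-1, 0), (1, 0), (0, -1), (0, 1)):
                        rr, cc = r + dr, c + dc
                        if 0 <= rr < rows and 0 <= cc < cols:
                            out[rr][cc] = 1
        mask = out
    return mask

seed = [[0, 0, 0], [0, 1, 0], [0, 0, 0]]
grown = dilate(seed)  # the centre pixel plus its 4-neighbours
```

In a real pipeline, a dilation like this would typically be paired with an erosion (opening/closing) to remove segmentation noise without distorting the tumor boundary.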

Keywords: image processing algorithms, magnetic resonance imaging, neural network, pattern recognition

Procedia PDF Downloads 392
903 Automatic Processing of Trauma-Related Visual Stimuli in Female Patients Suffering From Post-Traumatic Stress Disorder after Interpersonal Traumatization

Authors: Theresa Slump, Paula Neumeister, Katharina Feldker, Carina Y. Heitmann, Thomas Straube

Abstract:

A characteristic feature of post-traumatic stress disorder (PTSD) is the automatic processing of disorder-specific stimuli, which expresses itself in intrusive symptoms such as intense physical and psychological reactions to trauma-associated stimuli. That automatic processing plays an essential role in the development and maintenance of symptoms. The aim of our study was, therefore, to investigate the behavioral and neural correlates of the automatic processing of trauma-related stimuli in PTSD. Although interpersonal traumatization is a form of traumatization that occurs frequently, it has not yet been sufficiently studied; our study therefore focused on patients suffering from interpersonal traumatization. While previous imaging studies on PTSD mainly used faces, words, or generally negative visual stimuli, our study presented complex trauma-related and neutral visual scenes. We examined 19 female patients suffering from PTSD and 19 healthy women as a control group. All subjects performed a geometric comparison task while lying in a functional magnetic resonance imaging (fMRI) scanner. Trauma-related and neutral visual scenes that were not relevant to the task were presented while the subjects were doing the task. At the behavioral level, there were no significant differences between the task performance of the two groups. At the neural level, the PTSD patients showed significant hyperactivation of the hippocampus for task-irrelevant trauma-related versus neutral stimuli when compared with healthy control subjects. Connectivity analyses also revealed altered connectivity between the hippocampus and other anxiety-related areas in PTSD patients. Overall, these findings suggest that fear-related areas are involved in PTSD patients' processing of trauma-related stimuli even when the stimuli are task-irrelevant.

Keywords: post-traumatic stress disorder, automatic processing, hippocampus, functional magnetic resonance imaging

Procedia PDF Downloads 181
902 Enhancing Higher Education Teaching and Learning Processes: Examining How Lecturer Evaluation Make a Difference

Authors: Daniel Asiamah Ameyaw

Abstract:

This research attempts to investigate how lecturer evaluation makes a difference in enhancing higher education teaching and learning processes. The research questions guiding this work are, first, "What are the perspectives on the difference made by evaluating academic teachers in order to enhance higher education teaching and learning processes?" and, second, "What are the implications of the findings for policy and practice?" Data for this research were collected mainly through interviews and partly through document review. Data analysis was conducted within the framework of grounded theory. The findings showed that at the individual lecturer level, lecturer evaluation provides continuous improvement of teaching strategies and serves as a source of data for research on teaching. At the individual student level, it enhances students' learning process, serves as a source of information for course selection by students, and makes students feel recognised in the educational process. At the institutional level, lecturer evaluation is useful in personnel and management decision-making; it assures stakeholders of quality teaching and learning by setting up standards for lecturers; and it enables institutions to identify skill requirements and needs as a basis for organising workshops. Lecturer evaluation is useful at the national level in terms of guaranteeing the competencies of graduates, who then provide the needed manpower of the nation. Besides, resource allocation to higher education institutions is based largely on the quality of the programmes being run by the institution. The researcher concluded that the findings have implications for policy and practice; higher education managers are therefore expected to ensure that policy is implemented as planned by policy-makers so that the objectives can be successfully achieved.

Keywords: academic quality, higher education, lecturer evaluation, teaching and learning processes

Procedia PDF Downloads 127
901 Passive Vibration Isolation Analysis and Optimization for Mechanical Systems

Authors: Ozan Yavuz Baytemir, Ender Cigeroglu, Gokhan Osman Ozgen

Abstract:

Vibration is an important issue in the design of various components of aerospace, marine and vehicular applications. So that components do not lose their function and operational performance, vibration isolation design, involving the selection of optimum isolator properties and isolator positioning, is a critical study. Given the growing need for vibration isolation system design, this paper presents two software tools capable of performing modal analysis, response analysis for both random and harmonic excitations, static deflection analysis and Monte Carlo simulations, in addition to parameter and location optimization for different types of isolation problem scenarios. A review of the literature shows no existing study that develops a software-based tool capable of performing all of these analysis, simulation and optimization studies on one platform simultaneously. In this paper, the theoretical system model is generated for a 6-DOF rigid body. The vibration isolation system of any mechanical structure can be optimized using a hybrid method involving both global search and gradient-based methods. After defining the optimization design variables, different types of optimization scenarios are listed in detail. Recognising the need for a user-friendly vibration isolation problem solver, two graphical user interfaces (GUIs) are prepared and verified using a commercial finite element analysis program, ANSYS Workbench 14.0. Using the analysis and optimization capabilities of these GUIs, a real application used in an air platform is also presented as a case study at the end of the paper.
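As a minimal illustration of the response analysis the abstract describes, the sketch below evaluates the classical transmissibility of a single-DOF passive isolator; this is the standard textbook formula, not the paper's 6-DOF model, and the parameter values in the example are arbitrary.

```python
from math import sqrt

def transmissibility(r, zeta):
    """Transmissibility of a single-DOF passive isolator.
    r    = excitation frequency / isolator natural frequency
    zeta = damping ratio
    Isolation (T < 1) only occurs for r > sqrt(2)."""
    num = 1.0 + (2.0 * zeta * r) ** 2
    den = (1.0 - r * r) ** 2 + (2.0 * zeta * r) ** 2
    return sqrt(num / den)

# near resonance the isolator amplifies; well above sqrt(2) it isolates
t_resonance = transmissibility(1.0, 0.05)   # > 1: amplification
t_isolating = transmissibility(3.0, 0.05)   # < 1: isolation
```

Selecting isolator stiffness so that the operating frequency ratio r lands well above sqrt(2) is the simplest form of the "optimum isolator properties selection" problem the paper automates.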

Keywords: hybrid optimization, Monte Carlo simulation, multi-degree-of-freedom system, parameter optimization, location optimization, passive vibration isolation analysis

Procedia PDF Downloads 546
900 Effect of a Chemical Mutagen on Seed Germination of Lima Bean

Authors: G. Ultanbekova, Zh. Suleimenova, Zh. Rakhmetova, G. Mombekova, S. Mantieva

Abstract:

Plant growth promoting rhizobacteria (PGPR) are a group of free-living bacteria that colonize the rhizosphere, enhance the growth of many cereals and other important agricultural crops, and protect plants from disease and abiotic stresses through a wide variety of mechanisms. The use of PGPR has proven to be an environmentally sound way of increasing crop yields by facilitating plant growth. In the present study, strain improvement of PGPR isolates was carried out by chemical mutagenesis to improve the growth and yield of lima bean. Induced mutagenesis is widely used for the selection of microorganisms producing biologically active substances and for further improving their activities; strain improvement is usually done by classical mutagenesis, which involves exposing the microbes to chemical or physical mutagens. The strains Pseudomonas putida 4/1, Azotobacter chroococcum Р-29 and Bacillus subtilis were treated with a chemical agent (sodium nitrite) to induce mutation, and the consequent effect on seed germination and plant growth of lima bean (Phaseolus lunatus) was observed. The mutant strains Pseudomonas putida M-1, Azotobacter chroococcum M-1 and Bacillus subtilis M-1, obtained by treatment with sodium nitrite at a concentration of 5 mg/ml for 120 min, were found to enhance the germination of lima bean seeds compared to the parent strains. Moreover, treatment of the lima bean seeds with the mutant strain Bacillus subtilis M-1 had a significant stimulating effect on plant growth: the lengths of the stems and roots of lima bean treated with Bacillus subtilis M-1 increased significantly, by factors of 1.6 and 1.3, respectively, in comparison with the parent strain.

Keywords: chemical mutagenesis, germination, lima bean, plant growth promoting rhizobacteria (PGPR)

Procedia PDF Downloads 178
899 Control of a Quadcopter Using Genetic Algorithm Methods

Authors: Mostafa Mjahed

Abstract:

This paper concerns the control of a nonlinear system using two different methods: a reference model and genetic algorithms. The quadcopter is a nonlinear, unstable system belonging to the class of aerial robots. It consists of four rotors placed at the ends of a cross, with the control circuit at the centre of the cross. Its motion is governed by six degrees of freedom: three rotations about three axes (roll, pitch and yaw) and three spatial translations. The control of such a system is complex because of the nonlinearity of its dynamic representation and the number of parameters involved. Numerous studies have been devoted to modelling and stabilizing such systems. The classical PID and LQ correction methods are widely used; while they have the advantage of simplicity, being linear, they have the drawback of requiring a linear model for synthesis. This also complicates the resulting control laws, because they must be extended across the whole flight envelope of the quadcopter. Note that while classical design methods are widely used to control aeronautical systems, artificial intelligence methods such as genetic algorithms receive little attention. In this paper, we compare two PID design methods. First, the parameters of the PID are calculated according to a reference model; in a second phase, these parameters are established using genetic algorithms. By reference model, we mean that the corrected system behaves according to a reference system imposed by specifications such as settling time and zero overshoot. Inspired by natural evolution and Darwin's theory of survival of the fittest, John Holland developed this evolutionary algorithm. A genetic algorithm (GA) possesses three basic operators: selection, crossover and mutation. We start iterations with an initial population, each member of which is evaluated through a fitness function.
Our purpose is to correct the behavior of the quadcopter around three axes (roll, pitch and yaw) with 3 PD controllers. For the altitude, we adopt a PID controller.
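The GA loop of selection, crossover and mutation driven by a simulated fitness function can be sketched on a toy model; the double-integrator plant below stands in for a single quadcopter axis, and the gain ranges, population size and GA settings are illustrative assumptions, not the author's model or values.

```python
import numpy as np

def simulate_cost(kp, kd, dt=0.01, steps=500):
    # Toy 1-DOF axis model: theta'' = u (double integrator), setpoint 1 rad.
    theta, omega, cost = 0.0, 0.0, 0.0
    for _ in range(steps):
        err = 1.0 - theta
        u = kp * err - kd * omega      # PD control law
        omega += u * dt
        theta += omega * dt
        cost += err * err * dt         # fitness: integral of squared error (ISE)
    return cost

def ga_tune(pop_size=30, gens=40, seed=0):
    rng = np.random.default_rng(seed)
    pop = rng.uniform(0.1, 20.0, size=(pop_size, 2))       # (kp, kd) candidates
    for _ in range(gens):
        fit = np.array([simulate_cost(kp, kd) for kp, kd in pop])
        elite = pop[np.argsort(fit)[: pop_size // 2]]      # selection: keep best half
        # crossover: average two random elite parents; mutation: gaussian noise
        parents = elite[rng.integers(0, len(elite), size=(pop_size, 2))]
        pop = parents.mean(axis=1) + rng.normal(0.0, 0.5, size=(pop_size, 2))
        pop = np.clip(pop, 0.01, 50.0)
        pop[0] = elite[0]                                  # elitism: preserve the best
    fit = np.array([simulate_cost(kp, kd) for kp, kd in pop])
    return pop[np.argmin(fit)], fit.min()

best_gains, best_cost = ga_tune()
```

The same loop extends directly to PID gains by adding an integral state to the simulation and a third gene per individual.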

Keywords: quadcopter, genetic algorithm, PID, fitness, model, control, nonlinear system

Procedia PDF Downloads 409
898 Factors Affecting M-Government Deployment and Adoption

Authors: Saif Obaid Alkaabi, Nabil Ayad

Abstract:

Governments constantly seek to offer faster, more secure, efficient and effective services for their citizens. Recent changes and developments in communication services and technologies, mainly due to the Internet, have led to immense improvements in the way governments of advanced countries carry out their internal operations. Advances in e-government services have therefore been broadly adopted and used in various developed countries, as well as being adapted to developing countries. The implementation of these advances depends on the utilization of the most innovative data techniques, mainly in web-dependent applications, to enhance the main functions of governments. These functions, in turn, have spread to mobile and wireless techniques, generating a new direction called m-government. This paper discusses a selection of available m-government applications and several business modules and frameworks in various fields. In practice, m-government models, techniques and methods have become the improved version of e-government. M-government offers the potential for applications which work better, providing citizens with services that use mobile communication and data models incorporating several government entities. Developing countries can benefit greatly from this innovation because a large percentage of their population is young and can adapt to new technology, and because mobile computing devices are more affordable. The use of mobile transaction models encourages effective participation through mobile portals by businesses, various organizations and individual citizens. Although the application of m-government has great potential, it does have major limitations.
The limitations include: the implementation of wireless networks and relative communications, the encouragement of mobile diffusion, the administration of complicated tasks concerning the protection of security (including the ability to offer privacy for information), and the management of the legal issues concerning mobile applications and the utilization of services.

Keywords: e-government, m-government, system dependability, system security, trust

Procedia PDF Downloads 365
897 MIMO Radar-Based System for Structural Health Monitoring and Geophysical Applications

Authors: Davide D’Aria, Paolo Falcone, Luigi Maggi, Aldo Cero, Giovanni Amoroso

Abstract:

The paper presents a methodology for real-time structural health monitoring and geophysical applications. The key elements of the system are a high-performance MIMO radar sensor, an optical camera and a dedicated set of software algorithms encompassing interferometry, tomography and photogrammetry. The MIMO radar sensor proposed in this work provides extremely high sensitivity to displacements, making the system able to react to tiny deformations (down to tens of microns) on time scales spanning from milliseconds to hours. The MIMO feature enables the system to provide a set of two-dimensional images of the observed scene, each mapped onto the azimuth-range directions with notable resolution in both dimensions and an outstanding repetition rate. The back-scattered energy, which is distributed in 3D space, is projected onto a 2D plane, where each pixel has as coordinates the line-of-sight distance and the cross-range azimuthal angle. At the same time, the high-performance processing unit senses the observed scene with remarkable refresh periods (down to milliseconds), thus opening the way for combined static and dynamic structural health monitoring. Thanks to the smart TX/RX antenna array layout, the MIMO data can be processed through a tomographic approach to reconstruct a three-dimensional map of the observed scene. This 3D point cloud is then accurately mapped onto a 2D digital optical image through photogrammetric techniques, allowing easy and straightforward interpretation of the measurements. Once the three-dimensional image is reconstructed, a 'repeat-pass' interferometric approach is exploited to provide the user with high-frequency three-dimensional motion/vibration estimates for each point of the reconstructed image. At this stage, the methodology leverages consolidated atmospheric correction algorithms to provide reliable displacement and vibration measurements.
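The micron-scale sensitivity of repeat-pass interferometry comes from converting the phase difference between two acquisitions into line-of-sight motion; the sketch below shows the standard two-way-path conversion, with an assumed Ku-band carrier frequency that is a placeholder, not the sensor's actual value.

```python
import numpy as np

C = 3e8               # speed of light, m/s
FREQ = 17.2e9         # assumed Ku-band carrier (placeholder, not from the paper)
WAVELEN = C / FREQ    # ~1.7 cm

def los_displacement(phase_t0, phase_t1):
    """Repeat-pass interferometry: convert the phase difference between two
    acquisitions of the same pixel into line-of-sight displacement (metres).
    Two-way propagation means one fringe (2*pi) corresponds to lambda/2."""
    dphi = np.angle(np.exp(1j * (phase_t1 - phase_t0)))  # wrap to (-pi, pi]
    return dphi * WAVELEN / (4.0 * np.pi)

# a 10-micron target motion produces this (tiny but measurable) phase shift:
d_true = 10e-6
phase = 4.0 * np.pi * d_true / WAVELEN
```

Because the phase is wrapped, motions larger than a quarter wavelength between acquisitions are ambiguous, which is why the high repetition rate of the sensor matters.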

Keywords: interferometry, MIMO RADAR, SAR, tomography

Procedia PDF Downloads 171
896 A Trend Based Forecasting Framework of the ATA Method and Its Performance on the M3-Competition Data

Authors: H. Taylan Selamlar, I. Yavuz, G. Yapar

Abstract:

It is difficult to make predictions, especially about the future, and making accurate predictions is not always easy. However, better predictions remain the foundation of all science, so the development of accurate, robust and reliable forecasting methods is very important. Numerous forecasting methods have been proposed and studied in the literature, yet two major methods still dominate: Box-Jenkins ARIMA and exponential smoothing (ES), and new methods continue to be derived from or inspired by them. After more than 50 years of widespread use, exponential smoothing remains one of the most practically relevant forecasting methods available, thanks to its simplicity, robustness and accuracy as an automatic forecasting procedure, demonstrated especially in the famous M-competitions. Despite this success and widespread use, ES models have shortcomings that negatively affect forecast accuracy. This study therefore proposes a new forecasting method, the ATA method, to cope with these shortcomings. The new method is obtained from traditional ES models by modifying the smoothing parameters; both methods thus have similar structural forms, and ATA can easily be adapted to each of the individual ES models while gaining many advantages from its innovative weighting scheme. In this paper, the focus is on modelling the trend component and handling seasonality patterns through classical decomposition. The ATA method is therefore expanded to higher-order ES methods with additive, multiplicative, additive damped and multiplicative damped trend components. The proposed models, called ATA trended models, are compared in predictive performance to their counterpart ES models on the M3-competition data set, still the most recent and comprehensive time-series data collection available.
It is shown that the models outperform their counterparts in almost all settings, and when model selection is carried out amongst these trended models, ATA outperforms all of the competitors in the M3-competition for both short-term and long-term forecasting horizons, based on popular error metrics.
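The core idea of obtaining ATA from ES by modifying the smoothing parameters can be hinted at with a sketch: classical simple exponential smoothing uses a constant weight, while an ATA-style level uses a weight that decays with the observation index. The `ata_level` form here is our illustrative reading of that idea; the authors' papers give the exact recursions.

```python
def ses(series, alpha):
    """Classical simple exponential smoothing: constant weight alpha."""
    level = series[0]
    for x in series[1:]:
        level = alpha * x + (1.0 - alpha) * level
    return level

def ata_level(series, p=1):
    """Illustrative ATA-style level: the smoothing weight p/t falls with the
    observation index t, so early observations are absorbed quickly and the
    weights stabilise later.  A sketch of the idea, not the exact method."""
    level = series[0]
    for t, x in enumerate(series[1:], start=2):
        w = min(1.0, p / t)
        level = w * x + (1.0 - w) * level
    return level
```

In both cases the one-step-ahead forecast of a level-only model is simply the last level; the trended variants described in the abstract add a smoothed trend term on top of this recursion.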

Keywords: accuracy, exponential smoothing, forecasting, initial value

Procedia PDF Downloads 161
895 Self-Assembly of TaC@Ta Core-Shell-Like Nanocomposite Film via Solid-State Dewetting: Toward Superior Wear and Corrosion Resistance

Authors: Ping Ren, Mao Wen, Kan Zhang, Weitao Zheng

Abstract:

The improvement of the comprehensive properties of transition metal carbide/nitride (TMCN) films, including hardness, toughness, wear and corrosion resistance, and especially the avoidance of the trade-off between hardness and toughness, is strongly required for various applications. Although incorporating a ductile metal (DM) phase into the TMCN via thermally induced phase separation has emerged as an effective approach to toughening TMCN-based films, the DM is limited to soft ductile metals (i.e., Cu, Ag, Au) immiscible with the TMCN. Moreover, hardness is highly sensitive to the soft DM content and can be significantly worsened. Hence, a novel preparation method is needed to broaden the DM selection and assemble a more ordered nanocomposite structure with improved comprehensive properties. Here, we provide a new strategy: by activating solid-state dewetting during layered deposition, we accomplish the self-assembly of an ordered TaC@Ta core-shell-like nanocomposite film consisting of TaC nanocrystallites encapsulated in thin pseudocrystalline Ta tissue. This results in a superhard film (~45.1 GPa), dominated by the Orowan strengthening mechanism, with high toughness attributed to an indenter-induced phase transformation from pseudocrystalline to body-centred cubic Ta, together with drastically enhanced wear and corrosion resistance. Furthermore, the very thin pseudocrystalline Ta encapsulating layer (~1.5 nm) in the TaC@Ta core-shell-like structure helps promote the formation of a lubricious TaOₓ Magnéli phase during sliding, thereby further lowering the coefficient of friction. Solid-state dewetting may thus provide a new route to constructing ordered TMC(N)@TM core-shell-like nanocomposites that combine superhardness, high toughness, low friction, and superior wear and corrosion resistance.

Keywords: corrosion, nanocomposite film, solid-state dewetting, tribology

Procedia PDF Downloads 121
894 Using Serious Games to Integrate the Potential of Mass Customization into the Fuzzy Front-End of New Product Development

Authors: Michael N. O'Sullivan, Con Sheahan

Abstract:

Mass customization is the idea of offering custom products or services to satisfy the needs of each individual customer while maintaining the efficiency of mass production. Technologies like 3D printing and artificial intelligence have many start-ups hoping to capitalize on this dream of creating personalized products at an affordable price, and well established companies scrambling to innovate and maintain their market share. However, the majority of them are failing as they struggle to understand one key question – where does customization make sense? Customization and personalization only make sense where the value of the perceived benefit outweighs the cost to implement it. In other words, will people pay for it? Looking at the Kano Model makes it clear that it depends on the product. In products where customization is an inherent need, like prosthetics, mass customization technologies can be highly beneficial. However, for products that already sell as a standard, like headphones, offering customization is likely only an added bonus, and so the product development team must figure out if the customers’ perception of the added value of this feature will outweigh its premium price tag. This can be done through the use of a ‘serious game,’ whereby potential customers are given a limited budget to collaboratively buy and bid on potential features of the product before it is developed. If the group choose to buy customization over other features, then the product development team should implement it into their design. If not, the team should prioritize the features on which the customers have spent their budget. The level of customization purchased can also be translated to an appropriate production method, for example, the most expensive type of customization would likely be free-form design and could be achieved through digital fabrication, while a lower level could be achieved through short batch production. 
Twenty-five teams of final year students from design, engineering, construction and technology tested this methodology when bringing a product from concept through to production specification, and found that it allowed them to confidently decide what level of customization, if any, would be worth offering for their product, and what would be the best method of producing it. They also found that the discussion and negotiations between players during the game led to invaluable insights, and often decided to play a second game where they offered customers the option to buy the various customization ideas that had been discussed during the first game.

Keywords: Kano model, mass customization, new product development, serious game

Procedia PDF Downloads 118
893 Eli-Twist Spun Yarn: An Alternative to Conventional Sewing Thread

Authors: Sujit Kumar Sinha, Madan Lal Regar

Abstract:

Sewing thread plays an important role in the transformation of a two-dimensional fabric into a three-dimensional garment. The interaction of the sewing thread with the fabric at the seam influences not only the appearance of a garment but also its performance, so careful selection of the sewing thread and associated parameters can help improve both. Over the years, ring spinning has dominated the yarn market; alternative technologies have been developed in pursuit of improvement, but none has posed a real challenge to its dominance. The Eli-Twist spinning system, patented by Suessen, can be a new method of yarn manufacture providing a product with improved mechanical and physical properties with respect to conventional ring spun yarn, and it has gained considerable attention in recent times. The process produces a two-ply compact yarn with improved fibre utilization, a novel structure combining the advantages of condensing and doubling. In the present study, sewing threads of three different counts each were produced from cotton, polyester and a polyester/cotton (50/50) blend on ring and Eli-Twist systems, all with a twist multiplier of 4.2. Compared with conventional ring yarn, the Eli-Twist yarn showed better frictional characteristics, higher tensile strength and less hairiness. The performance of the Eli-Twist sewing thread, estimated through the seam strength, seam elongation and seam efficiency of the sewn fabric, was likewise better than that of the conventional 2-ply sewing thread, resulting in better seam characteristics overall.

Keywords: ring spun yarn, Eli-Twist yarn, sewing thread, seam strength, seam elongation, seam efficiency

Procedia PDF Downloads 171
892 Transient Response of Elastic Structures Subjected to a Fluid Medium

Authors: Helnaz Soltani, J. N. Reddy

Abstract:

The presence of a fluid medium interacting with a structure can lead to failure of the structure. Since developing efficient computational models for fluid-structure interaction (FSI) problems has broad impact on realistic problems encountered in the aerospace, ship, and oil and gas industries, among others, there is an increasing need for methods to investigate the effect of the fluid domain on the structural response. A coupled finite element formulation of FSI problems is an accurate way to predict the response of structures in contact with a fluid medium. This study proposes a finite element approach for studying the transient response of structures interacting with a fluid medium. Since beams and plates are the fundamental elements of almost any structure, the developed method is applied to beam and plate benchmark problems to demonstrate its efficiency. The formulation combines various structural theories with the solid-fluid interface boundary condition, which represents the interaction between the solid and fluid regimes. Here, three different beam theories and three different plate theories are considered to model the solid medium, and the Navier-Stokes equations govern the fluid domain. For each theory, a coupled set of equations is derived, in which the element matrices of both regimes are calculated by Gaussian quadrature integration. The main feature of the proposed methodology is to model the fluid domain as an added mass: an external distributed force due to the presence of the fluid. We validate the accuracy of the formulation by means of numerical examples. Since the formulation presented in this study covers several theories from the literature, the applicability of our approach is independent of the structure's geometry.
The effects of varying parameters such as the structure thickness ratio, fluid density and immersion depth are studied using numerical simulations. The results indicate that the maximum vertical deflection of the structure is affected considerably by the presence of a fluid medium.

Keywords: beam and plate, finite element analysis, fluid-structure interaction, transient response

Procedia PDF Downloads 546
891 Chronology and Developments in Inventory Control Best Practices for FMCG Sector

Authors: Roopa Singh, Anurag Singh, Ajay

Abstract:

Agriculture contributes a major share to the national economy of India: about 70% of the economy depends upon agriculture, which forms the main source of income, and about 43% of India’s geographical area is used for agricultural activity involving 65-75% of the total population. The given work deals with fast-moving consumer goods (FMCG) industries and their inventories, which use agricultural produce as the raw material or input for their final products. Since the beginning of inventory practice, the developments that have taken place can be categorised into three phases, based on a review of various works. The first phase is related to the development and utilization of the Economic Order Quantity (EOQ) model and methods for optimizing costs and profits. The second phase deals with inventory optimization methods aimed at balancing capital investment constraints and service level goals. The third and most recent phase merges inventory control with electrical control theory. Holding inventory is generally considered negative, as a large amount of capital is blocked, especially in mechanical and electrical industries. The case is different for food processing and agro-based industries, however, because of the cyclic variation in the cost of their raw materials, which is the reason these industries were selected for the present work. The application of electrical control theory to inventory control makes decision-making highly instantaneous for FMCG industries without the loss of proposed profits that occurred during the first and second phases, mainly due to late implementation of decisions. The work also replaces various inventory and work-in-progress (WIP) related errors with their monetary values, so that decision-making is fully target-oriented.
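The first-phase EOQ model mentioned above can be stated in a few lines: the order quantity that minimises the sum of ordering and holding costs is the classical square-root formula. The numbers in the example are arbitrary illustrations.

```python
from math import sqrt

def eoq(annual_demand, order_cost, holding_cost):
    """Classical Economic Order Quantity: the order size that balances
    annual ordering cost against annual holding cost.
    Q* = sqrt(2 * D * S / H)."""
    return sqrt(2.0 * annual_demand * order_cost / holding_cost)

def total_annual_cost(q, annual_demand, order_cost, holding_cost):
    # number of orders per year * cost per order  +  average inventory * holding cost
    return annual_demand / q * order_cost + q / 2.0 * holding_cost

# e.g. D = 1200 units/yr, S = 50 per order, H = 6 per unit per year
q_star = eoq(1200, 50, 6)   # ~141.4 units per order
```

The total-cost curve is convex in q, so any deviation from `q_star` raises the combined cost, which is what the optimization-oriented first phase of the literature exploits.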

Keywords: control theory, inventory control, manufacturing sector, EOQ, feedback, FMCG sector

Procedia PDF Downloads 340
890 An Alternative Approach to Machine Vision System Operation for Solving an Industrial Control Issue

Authors: M. S. Nikitenko, S. A. Kizilov, D. Y. Khudonogov

Abstract:

The paper considers an approach to a machine vision operating system combined with a grid of light markers. This approach is used to solve several scientific and technical problems, such as measuring the capacity of an apron feeder delivering coal from a lining return port to a conveyor in the technology of mining thick coal with release onto a conveyor, and prototyping an obstacle detection system for an autonomous vehicle. Primary verification of a method for calculating bulk material volume using three-dimensional modelling was carried out, with validation in laboratory conditions and calculation of relative errors. A method of calculating apron feeder capacity based on a machine vision system is offered, together with a simplified three-dimensional model of the examined measuring area. The proposed method allows the volume of rock mass moved by an apron feeder to be measured using machine vision, solving the volume control issue for coal produced by a feeder while working thick coal with longwall complexes with release to a conveyor, with accuracy sufficient for practical application. The developed mathematical apparatus for measuring feeder productivity in kg/s uses only basic mathematical functions: addition, subtraction, multiplication and division. This simplifies software development and widens the variety of microcontrollers and microcomputers suitable for calculating feeder capacity. A feature of the obstacle detection problem is that obstacles distort the laser grid, which simplifies their detection. The paper presents algorithms for processing the video camera image and for controlling an autonomous vehicle model based on the obstacle detection machine vision system, and demonstrates a sample fragment of obstacle detection at the moment the laser grid is distorted.

Keywords: machine vision, machine vision operating system, light markers, measuring capability, obstacle detection system, autonomous transport

Procedia PDF Downloads 94
889 Identifying, Reporting and Preventing Medical Errors among Nurses Working in Critical Care Units at Kenyatta National Hospital, Kenya: Closing the Gap between Attitude and Practice

Authors: Jared Abuga, Wesley Too

Abstract:

Medical error is the third leading cause of death in the US, with approximately 98,000 deaths occurring every year as a result of medical errors. The world financial burden of medication errors is roughly USD 42 billion; medication errors may lead to at least one death daily and injure roughly 1.3 million people every year. Medical error reporting is essential in creating a culture of accountability in our healthcare system. Studies of the attitudes and practices of healthcare workers in reporting medical errors have shown that the major factors in under-reporting include work stress and fear of the medico-legal consequences of disclosing an error; further, the majority believed that an increase in reporting medical errors would contribute to a better system. Most hospitals depend on nurses to discover medication errors because nurses are considered to be close to the sources of these errors, as contributors or mere observers; consequently, nurses' perception of medication errors and of what needs to be done is a vital element in reducing their incidence. We sought to explore knowledge of medical errors among nurses and the factors affecting or hindering the reporting of medical errors among nurses working in the emergency unit at KNH. Critical care nurses face many barriers to completing incident reports on medication errors; one barrier contributing to under-reporting is a lack of education and/or knowledge regarding medication errors and the reporting process. This study therefore sought to determine the availability and use of reporting systems for medical errors in critical care units, to establish nurses' perceptions regarding medical errors and reporting, and to document the factors facilitating timely identification and reporting of medical errors in critical care settings.
Methods: The study used a cross-sectional design to collect data from 76 critical care nurses at Kenyatta Teaching & Research National Referral Hospital, Kenya. Data analysis is ongoing; by October 2022 we will have the analysis, results, discussion, and recommendations of the study for the purposes of the conference in 2023.

Keywords: errors, medical, kenya, nurses, safety

Procedia PDF Downloads 221
888 Quantitative Evaluation of Supported Catalysts Key Properties from Electron Tomography Studies: Assessing Accuracy Using Material-Realistic 3D-Models

Authors: Ainouna Bouziane

Abstract:

The ability of electron tomography to recover the 3D structure of catalysts, with spatial resolution on the subnanometre scale, has been widely explored and reviewed in recent decades. A variety of experimental techniques based on transmission electron microscopy (TEM) or scanning transmission electron microscopy (STEM) have been used to reveal different features of nanostructured catalysts in 3D, but high-angle annular dark-field imaging in STEM mode (HAADF-STEM) stands out as the most frequently used, given its chemical sensitivity and its avoidance of the imaging artifacts related to diffraction phenomena that arise with crystalline materials. In this regard, our group has developed a methodology that combines image denoising by undecimated wavelet transforms (UWT) with automated, advanced segmentation procedures and parameter selection methods using CS-TVM (compressed sensing - total variation minimization) algorithms to extract more reliable quantitative information from 3D characterization studies. However, evaluating the accuracy of the magnitudes estimated from the segmented volumes is an important issue that has not yet been properly addressed, because a perfectly known reference is needed; the problem becomes particularly complicated in the case of multicomponent material systems. To tackle this key question, we have developed a methodology that incorporates volume reconstruction and segmentation methods. In particular, we have established an approach to evaluate, in quantitative terms, the accuracy of TVM reconstructions, considering the influence of relevant experimental parameters such as the range of tilt angles, the image noise level and the object orientation. The approach is based on the analysis of material-realistic 3D phantoms, which include the most relevant features of the system under analysis.

Keywords: electron tomography, supported catalysts, nanometrology, error assessment

Procedia PDF Downloads 62
887 An Automated Stock Investment System Using Machine Learning Techniques: An Application in Australia

Authors: Carol Anne Hargreaves

Abstract:

A key issue in stock investment is how to select representative features for stock selection. The first objective of this paper is to determine whether an automated stock investment system, using machine learning techniques, can identify a portfolio of growth stocks that is highly likely to provide returns better than the stock market index. The second objective is to identify the technical features that best characterize whether a stock's price is likely to rise, along with the most important factors and their contribution to predicting the likelihood of a price increase. An unsupervised machine learning technique, cluster analysis, was applied to the stock data to identify a cluster of stocks likely to rise in price (portfolio 1). Next, principal component analysis was used to select stocks rated high on the first two components (portfolio 2). Thirdly, a supervised machine learning technique, logistic regression, was used to select stocks with a high probability of their price going up (portfolio 3). The predictive models were validated with metrics such as sensitivity (recall), specificity, and overall accuracy; all accuracy measures were above 70%. The top three stocks were selected for each of the three portfolios and traded in the market for one month, after which the return of each portfolio was computed and compared with the stock market index return. The returns were 23.87% for the principal component analysis portfolio, 11.65% for the logistic regression portfolio, and 8.88% for the K-means cluster portfolio, while the stock market returned 0.38%; all three portfolios thus outperformed the market by more than eight times. This study confirms that an automated stock investment system using machine learning techniques can identify top-performing stock portfolios that outperform the stock market.
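The three-portfolio pipeline described above (cluster analysis, principal components, logistic regression) can be sketched as follows, assuming scikit-learn and synthetic data; the feature matrix, labels, probability threshold, and portfolio sizes are illustrative stand-ins, not the paper's actual features or parameters:

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.decomposition import PCA
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Synthetic technical features for 200 stocks (columns could represent
# momentum, volatility, volume trend, ...) and a binary "price went up" label.
X = rng.normal(size=(200, 4))
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=200) > 0).astype(int)

# Portfolio 1: cluster the stocks, keep the cluster with the best hit rate.
kmeans = KMeans(n_clusters=3, n_init=10, random_state=0).fit(X)
best_cluster = max(range(3), key=lambda c: y[kmeans.labels_ == c].mean())
portfolio1 = np.where(kmeans.labels_ == best_cluster)[0]

# Portfolio 2: rank stocks by their scores on the first two principal components.
scores = PCA(n_components=2).fit_transform(X)
portfolio2 = np.argsort(-(scores[:, 0] + scores[:, 1]))[:10]

# Portfolio 3: keep stocks whose predicted probability of rising is high.
model = LogisticRegression().fit(X, y)
proba = model.predict_proba(X)[:, 1]
portfolio3 = np.where(proba > 0.7)[0]
```

In practice each selection would be fit on historical data and applied to held-out stocks, with sensitivity, specificity, and accuracy computed on the held-out labels before any portfolio is traded.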

Keywords: machine learning, stock market trading, logistic regression, cluster analysis, factor analysis, decision trees, neural networks, automated stock investment system

Procedia PDF Downloads 136