Search results for: contractual complexity
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 1735

595 Increased Reaction and Movement Times When Text Messaging during Simulated Driving

Authors: Adriana M. Duquette, Derek P. Bornath

Abstract:

Reaction Time (RT) and Movement Time (MT) are important components of everyday life that affect the way in which we move about our environment. These measures become even more crucial when an event can be caused (or avoided) in a fraction of a second, such as the RT and MT required while driving. The purpose of this study was to develop a simpler method of testing RT and MT during simulated driving, with or without text messaging, in a university-aged population (n = 170). In the control condition, a randomly-delayed red light stimulus flashed on a computer interface after the participant began pressing the ‘gas’ pedal on a foot switch mat. Simple RT was defined as the time between the presentation of the light stimulus and the initiation of lifting the foot from the switch mat ‘gas’ pedal, while MT was defined as the time from the initiation of lifting the foot to the initiation of depressing the switch mat ‘brake’ pedal. In the texting condition, upon pressing the ‘gas’ pedal, a ‘text message’ appeared on the computer interface in a dialog box, which the participant typed on their cell phone while waiting for the light stimulus to turn red. In both conditions, the sequence was repeated 10 times, and an average RT (seconds) and average MT (seconds) were recorded. Condition significantly impacted overall RTs (p < .001), as the texting condition (0.47 s) took longer than the no-texting (control) condition (0.34 s). Longer MTs were also recorded during the texting condition (0.28 s) than in the control condition (0.23 s), p = .001. The overall increase in Response Time (RT + MT) of 189 ms during the texting condition would equate to an additional 4.2 meters (to react to the stimulus and begin braking) if the participant had been driving an automobile at 80 km per hour.
In conclusion, increasing task complexity due to the dual-task demand of text messaging during simulated driving caused significant increases in RT (41%), MT (23%) and Response Time (34%), thus further strengthening the mounting evidence against text messaging while driving.
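As a quick sanity check on the abstract's arithmetic, the 189 ms increase in response time can be converted to the extra distance travelled at 80 km/h before braking begins (a sketch; the function name and rounding are ours):

```python
# Convert an added response delay at a given speed into extra distance
# travelled before the driver begins braking.

def extra_distance_m(delta_response_s: float, speed_kmh: float) -> float:
    """Distance covered during an added response delay at a given speed."""
    speed_ms = speed_kmh * 1000 / 3600  # km/h -> m/s
    return speed_ms * delta_response_s

# 189 ms extra delay at 80 km/h, as reported in the abstract.
print(round(extra_distance_m(0.189, 80), 1))  # 4.2
```

The result matches the 4.2 m figure quoted above.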

Keywords: simulated driving, text messaging, reaction time, movement time

Procedia PDF Downloads 522
594 Low-Complex, High-Fidelity Two-Grades Cyclo-Olefin Copolymer (COC) Based Thermal Bonding Technique for Sealing a Thermoplastic Microfluidic Biosensor

Authors: Jorge Prada, Christina Cordes, Carsten Harms, Walter Lang

Abstract:

The development of microfluidic-based biosensors over the last years has shown increasing use of thermoplastic polymers as the constitutive material. Their low-cost production, high replication fidelity, biocompatibility, and optical-mechanical properties are sought after for the implementation of disposable yet functional lab-on-chip solutions. Among the range of thermoplastic materials in use, the Cyclo-Olefin Copolymer (COC) stands out due to its optical transparency, which makes it a frequent choice as a manufacturing material for fluorescence-based biosensors. Moreover, several processing techniques for completing a closed COC microfluidic biosensor have been discussed in the literature. The reported techniques differ, however, in their implementation, and therefore add varying degrees of complexity when used in a mass-production process. This work introduces and reports results on the application of a purely thermal bonding process between COC substrates, produced by hot embossing, and COC foils containing screen-printed circuits. The proposed procedure takes advantage of the difference in transition temperature between foils of two COC grades to accomplish the sealing of the microfluidic channels. Patterned heat injection to the COC foil through the COC substrate is applied, resulting in consistent channel geometry uniformity. Measurements of bond strength and bursting pressure are shown, suggesting that this purely thermal bonding process yields a technique which can be easily adapted into the thermoplastic microfluidic chip production workflow, enabling a low-cost as well as high-quality COC biosensor manufacturing process.

Keywords: biosensor, cyclo-olefin copolymer, hot embossing, thermal bonding, thermoplastics

Procedia PDF Downloads 238
593 Maintaining Experimental Consistency in Geomechanical Studies of Methane Hydrate Bearing Soils

Authors: Lior Rake, Shmulik Pinkert

Abstract:

Methane hydrate has been found in significant quantities in soils offshore within continental margins and in permafrost within arctic regions where low temperature and high pressure are present. The mechanical parameters for geotechnical engineering are commonly evaluated in geomechanical laboratories adapted to simulate the environmental conditions of methane hydrate-bearing sediments (MHBS). Due to the complexity and high cost of natural MHBS sampling, most laboratory investigations are conducted on artificially formed samples. MHBS artificial samples can be formed using different hydrate formation methods in the laboratory, where methane gas and water are supplied into the soil pore space under the methane hydrate phase conditions. The most commonly used formation method is the excess gas method which is considered a relatively simple, time-saving, and repeatable testing method. However, there are several differences in the procedures and techniques used to produce the hydrate using the excess gas method. As a result of the difference between the test facilities and the experimental approaches that were carried out in previous studies, different measurement criteria and analyses were proposed for MHBS geomechanics. The lack of uniformity among the various experimental investigations may adversely impact the reliability of integrating different data sets for unified mechanical model development. In this work, we address some fundamental aspects relevant to reliable MHBS geomechanical investigations, such as hydrate homogeneity in the sample, the hydrate formation duration criterion, the hydrate-saturation evaluation method, and the effect of temperature measurement accuracy. Finally, a set of recommendations for repeatable and reliable MHBS formation will be suggested for future standardization of MHBS geomechanical investigation.

Keywords: experimental study, laboratory investigation, excess gas, hydrate formation, standardization, methane hydrate-bearing sediment

Procedia PDF Downloads 57
592 A Cloud-Based Mobile Auditing Tools for Muslim-Friendly Hospitality Services

Authors: Mohd Iskandar Illyas Tan, Zuhra Junaida Mohamad Husny, Farawahida Mohd Yusof

Abstract:

The potential of Muslim-friendly hospitality services brings huge opportunities to operators (hoteliers, tourist guides, and travel agents), especially in Muslim countries. In order to provide guidelines that facilitate operations among these operators, standards and manuals have been developed by the authorities. Among the challenges are the applicability and complexity of the standard when adopted in the real world. Mobile digital technology can be implemented to overcome those challenges. A prototype has been developed to help operators and authorities assess their readiness in complying with MS2610:2015. This study analyzes the mobile digital technology characteristics that are suitable for conducting a sharia-compliant hospitality audit. A focus group study was conducted in the state of Penang, Malaysia, involving operators (hoteliers, tourist guides, and travel agents) as well as agencies (Islamic Tourism Center, Penang Islamic Affairs Department, Malaysian Standard) directly involved in the implementation of the certification. Both groups were given three weeks to test the mobile application and provide feedback on its usability for auditing their readiness towards the Muslim-friendly hospitality services standard developed by the Malaysian Standard. The feedback was analyzed, and the overall results show that three criteria (ease of use, completeness, and speed of completion) received the highest responses from both groups. This study provides evidence that mobile application development has huge potential for adoption by Muslim-friendly hospitality service operators and agencies.

Keywords: hospitality, innovation, audit, compliance, mobile application

Procedia PDF Downloads 131
591 Service Interactions Coordination Using a Declarative Approach: Focuses on Deontic Rule from Semantics of Business Vocabulary and Rules Models

Authors: Nurulhuda A. Manaf, Nor Najihah Zainal Abidin, Nur Amalina Jamaludin

Abstract:

Coordinating service interactions is a vital part of developing distributed applications that are built up as networks of autonomous participants (e.g., software components, web services, online resources) and involve collaboration among a diverse number of participant services on different providers. The complexity of coordinating service interactions reflects how important suitable techniques and approaches are for designing and coordinating the interaction between participant services, so that the overall goal of the collaboration is achieved. The objective of this research is to develop the capability to steer a complex service interaction towards a desired outcome. To that end, an efficient technique for modelling, generating, and verifying the coordination of service interactions is developed. The developed model describes service interactions using a service choreography approach, focusing on a declarative approach and advocating an Object Management Group (OMG) standard, the Semantics of Business Vocabulary and Rules (SBVR). This model, namely the SBVR model for service choreographies, focuses on declarative deontic rules expressing both obligation and prohibition, which are particularly useful for coordinating service interactions. The generated SBVR model is then formulated and transformed into an Alloy model, which is verified using the Alloy Analyzer. The transformation of SBVR into Alloy automatically generates the corresponding coordination of service interactions (service choreography), producing an immediate instance of execution that satisfies the constraints of the specification and verifying whether a specific request can be realised in the generated choreography.
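The obligation/prohibition flavour of deontic rules can be illustrated with a toy check over a message trace. This is not SBVR or Alloy; the rule contents, participant names, and trace below are invented purely for illustration:

```python
# Toy deontic check: obligations are interactions that must occur in a
# choreography trace; prohibitions are interactions that must not.
# All rule and message names here are invented examples.
OBLIGATIONS = {("customer", "retailer", "placeOrder"),
               ("retailer", "customer", "confirmOrder")}
PROHIBITIONS = {("retailer", "customer", "shipBeforePayment")}

def satisfies(trace: list) -> bool:
    """True iff every obligation occurs and no prohibition occurs."""
    events = set(trace)
    obligations_met = OBLIGATIONS <= events
    nothing_prohibited = not (PROHIBITIONS & events)
    return obligations_met and nothing_prohibited

trace = [("customer", "retailer", "placeOrder"),
         ("retailer", "customer", "confirmOrder")]
print(satisfies(trace))  # True
```

A real SBVR-to-Alloy pipeline would instead encode such rules as Alloy constraints and let the Alloy Analyzer search for satisfying (or violating) instances.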

Keywords: service choreography, service coordination, behavioural modelling, complex interactions, declarative specification, verification, model transformation, semantics of business vocabulary and rules, SBVR

Procedia PDF Downloads 151
590 The Effect of Principled Human Resource Management and Training Based on Existing Standards in Order to Improve the Quality of Construction Projects

Authors: Arsalan Salahi

Abstract:

Today, the number of changes in the construction industry and urban mass house building is increasing, which calls for more attention to targeted planning for human resource management and training. The human resources working in the construction industry have various problems and deficiencies, and solving these problems requires sound management and training of these people in order to lower construction costs and increase the quality of projects, especially mass house building projects. The success of any project in reaching short- and long-term professional goals depends on the efficient combination of work tools, financial resources, raw materials, and, most importantly, human resources. Today, due to the complexity and diversity of each project, specialized management fields have emerged to maximize the potential benefits of each component of that project. Human resources are known as the most important resource for the successful implementation of construction projects, but unfortunately, due to the low cost of labor compared to other resources, such as materials and machinery, little attention is paid to them. With the correct management and training of human resources, which depends on correct planning and development, it is possible to improve the performance of construction projects. In this article, the training and motivation of construction industry workers and their effects on the effectiveness of projects in this industry have been researched. In this regard, some barriers to the training and motivation of construction workers and personnel have been identified, and solutions have been provided for construction companies. The impact of unskilled workers on the efficiency of construction projects is also investigated.
The results of this research show that by increasing the use of correct and principled training for human resources, we will see positive effects on the performance of construction projects.

Keywords: human resources, construction industry, principled training, skilled and unskilled workers

Procedia PDF Downloads 92
589 Ultrathin NaA Zeolite Membrane in Solvent Recovery: Preparation and Application

Authors: Eng Toon Saw, Kun Liang Ang, Wei He, Xuecheng Dong, Seeram Ramakrishna

Abstract:

Solvent recovery is receiving the utmost attention in recent years due to the scarcity of natural resources and awareness of the circular economy in chemical and pharmaceutical manufacturing. Solvent dehydration is one of the important processes used to recover and purify solvent for reuse. Because of the complexity of the solvent waste or wastewater effluent produced in the pharmaceutical industry, the wastewater treatment process becomes complicated; an alternative solution is to recover the valuable solvent in the solvent waste. To treat solvent waste and to upgrade solvent purity, membrane pervaporation is a promising technology owing to its low energy consumption and small footprint. Ceramic membranes are adopted for solvent dehydration owing to their chemical and thermal stability compared to polymeric membranes. The NaA zeolite membrane is generally used for solvent dehydration because of its narrow, well-defined pore size and high hydrophilicity. NaA zeolite membranes have mainly been applied to alcohol dehydration in fermentation processes. At this stage, the membrane exhibits a high separation factor but low flux on tubular ceramic supports. Thus, a defect-free, ultrathin NaA membrane should be developed to increase water flux. Herein, we report a simple preparation protocol for an ultrathin NaA zeolite membrane supported on a tubular ceramic membrane, achieved by controlling the synthesized seed size, the seeding methods and conditions, the surface pore size of the ceramic substrate, and the secondary growth conditions. The microstructure and morphology of the NaA zeolite membrane will be examined and reported. Moreover, the membrane separation performance and stability will also be reported for isopropanol, ketone, and ester dehydration, particularly for applications in the pharmaceutical industry.

Keywords: ceramic membrane, NaA zeolite, pharmaceutical industry, solvent recovery

Procedia PDF Downloads 244
588 Water Re-Use Optimization in a Sugar Platform Biorefinery Using Municipal Solid Waste

Authors: Leo Paul Vaurs, Sonia Heaven, Charles Banks

Abstract:

Municipal solid waste (MSW) is a virtually unlimited source of lignocellulosic material in the form of a waste paper/cardboard mixture which can be converted into fermentable sugars via cellulolytic enzyme hydrolysis in a biorefinery. The extraction of the lignocellulosic fraction and its preparation, however, are energy- and water-demanding processes. The waste water generated is a rich organic liquor with a high Chemical Oxygen Demand that can be partially cleaned, while generating biogas, in an Upflow Anaerobic Sludge Blanket bioreactor and further re-used in the process. In this work, an experiment was designed to determine the critical contaminant concentrations in water affecting either anaerobic digestion or enzymatic hydrolysis by simulating multiple water re-circulations. It was found that re-using the same water more than 16.5 times could decrease the hydrolysis yield by up to 65% and led to complete disintegration of the granules. Due to the complexity of the water stream, the contaminant(s) responsible for the performance decrease could not be identified, but the cause was suspected to be sodium, potassium, and lipid accumulation for the anaerobic digestion (AD) process and heavy-metal build-up for enzymatic hydrolysis. The experimental data were incorporated into a Water Pinch technology-based model that was used to optimize water re-utilization in the modelled system, reducing fresh water requirements and wastewater generation while ensuring all processes performed at an optimal level. Multiple scenarios were modelled in which sub-process requirements were evaluated in terms of importance, operational costs, and impact on the CAPEX. The best compromise between water usage, AD yield, and enzymatic hydrolysis yield was determined for each assumed contaminant degradation by the anaerobic granules. Results from the model will be used to build the first MSW-based biorefinery in the USA.

Keywords: anaerobic digestion, enzymatic hydrolysis, municipal solid waste, water optimization

Procedia PDF Downloads 317
587 An Analysis on Clustering Based Gene Selection and Classification for Gene Expression Data

Authors: K. Sathishkumar, V. Thiagarasu

Abstract:

Due to recent advances in DNA microarray technology, it is now feasible to obtain gene expression profiles of tissue samples at relatively low cost. Many scientists around the world take advantage of this gene profiling to characterize complex biological circumstances and diseases. Microarray techniques used in genome-wide gene expression and genome mutation analysis help scientists and physicians understand pathophysiological mechanisms, make diagnoses and prognoses, and choose treatment plans. DNA microarray technology has now made it possible to simultaneously monitor the expression levels of thousands of genes during important biological processes and across collections of related samples. Elucidating the patterns hidden in gene expression data offers a tremendous opportunity for an enhanced understanding of functional genomics. However, the large number of genes and the complexity of biological networks greatly increase the challenges of comprehending and interpreting the resulting mass of data, which often consists of millions of measurements. A first step toward addressing this challenge is the use of clustering techniques, which are essential in the data mining process for revealing natural structures and identifying interesting patterns in the underlying data. This work presents an analysis of several algorithms proposed to deal with gene expression data effectively. The existing algorithms, including Support Vector Machine (SVM) classification, the K-means algorithm, and evolutionary algorithms, are analyzed thoroughly to identify their advantages and limitations. A performance evaluation of the existing algorithms is carried out to determine the best approach. In order to improve the classification performance of the best approach in terms of accuracy, convergence behavior, and processing time, a hybrid clustering-based optimization approach is proposed.

Keywords: microarray technology, gene expression data, clustering, gene selection

Procedia PDF Downloads 323
586 Schedule Risk Management for Complex Projects: The Royal Research Ship: Sir David Attenborough Case Study

Authors: Chatelier Charlene, Oyegoke Adekunle, Ajayi Saheed, Jeffries Andrew

Abstract:

This study seeks to understand schedule risk assessment as a precursor to better performance whilst exploring the strategies employed to deliver complex projects like the new polar research ship. This high-profile vessel was built for the Natural Environment Research Council and the British Antarctic Survey (BAS) by Cammell Laird Shipbuilders. The research ship was designed to support science in extreme environments and is expected to provide a wide range of specialist scientific facilities, instruments, and laboratories for research across multiple disciplines. Aim: The focus is to understand the allocation and management of schedule risk on such a major project, hypothesising that effective schedule risk management could be the most critical factor in determining whether the intended benefits are delivered within time and cost constraints. Objective 1: Firstly, the study seeks to understand the allocation and management of schedule risk in major projects. Objective 2: Secondly, it explores effective schedule risk management as the most critical factor determining the delivery of the intended benefits. Methodology: This study takes a retrospective view of schedule risk management and how it influences project performance, using a case study of the RRS (Royal Research Ship) Sir David Attenborough. Research Contribution: The outcomes of this study will contribute to a better understanding of project performance whilst building on its under-researched relationship to schedule risk management for complex projects. The outcomes of this paper will also guide further research on project performance and enable an understanding of how risk-based estimates over time impact the overall risk management of the project.

Keywords: complexity, major projects, performance management, schedule risk management, uncertainty

Procedia PDF Downloads 94
585 Ectopic Pregnancy: A Case of Consecutive Occurrences of Different Types

Authors: Wania Mohammad Akram, Swetha Kannan, Urooj Shahid, Aisha Sajjad

Abstract:

Ovarian ectopic pregnancy, a rare manifestation of ectopic gestation, involves the implantation of a fertilized egg on the ovarian surface. This condition poses diagnostic challenges and is associated with significant maternal morbidity if not promptly managed. This report presents the case of a 33-year-old nulliparous woman with a history of polycystic ovary syndrome (PCOS) undergoing ovulation induction therapy. Following her first conception in October 2021, she presented with symptoms of per vaginal spotting and low back pain, prompting a diagnosis of left adnexal ectopic pregnancy confirmed by transvaginal ultrasound and serum beta-human chorionic gonadotropin (B-HCG) levels. Medical management with methotrexate was initiated successfully. In August 2022, the patient conceived again, with subsequent ultrasound revealing a large pelvic collection suggestive of a complex ectopic pregnancy involving both ovaries. Despite initial stability, she developed abdominal pain necessitating emergency laparoscopy, which revealed an ovarian ectopic pregnancy with hemoperitoneum. Laparotomy was performed due to the complexity of the presentation, and histopathology confirmed viable chorionic villi within ovarian tissue. This case underscores the clinical management challenges posed by ovarian ectopic pregnancies, particularly in patients with previous ectopic pregnancies. The discussion reviews current literature on diagnostic modalities, treatment strategies, and outcomes associated with ovarian ectopic pregnancies, emphasizing the role of surgical intervention in cases refractory to conservative management. Tailored approaches considering individual patient factors are crucial to optimize outcomes and preserve fertility in such complex scenarios.

Keywords: OBGYN, ovarian ectopic pregnancy, laparoscopy, PCOS

Procedia PDF Downloads 34
584 Examining the Relationship between Concussion and Neurodegenerative Disorders: A Review on Amyotrophic Lateral Sclerosis and Alzheimer’s Disease

Authors: Edward Poluyi, Eghosa Morgan, Charles Poluyi, Chibuikem Ikwuegbuenyi, Grace Imaguezegie

Abstract:

Background: Current epidemiological studies have examined the associations between moderate and severe traumatic brain injury (TBI) and the risk of developing neurodegenerative diseases. Concussion, also known as mild TBI (mTBI), is however quite distinct from moderate or severe TBI. Only a few studies in this burgeoning area have examined concussion, especially repetitive episodes, in relation to neurodegenerative diseases. Thus, no definite relationship has been established between them. Objectives: This review discusses the available literature linking concussion with amyotrophic lateral sclerosis (ALS) and Alzheimer’s disease (AD). Materials and Methods: Given the complexity of this subject, a realist review methodology was selected, which includes clarifying the scope and developing a theoretical framework, developing a search strategy, selection and appraisal, data extraction, and synthesis. A detailed literature matrix was set out in order to capture relevant and recent findings on this topic. Results: At present, there is no objective clinical test for the diagnosis of concussion because the features are less obvious on physical examination. The absence of an objective test for diagnosing concussion sometimes leads to skepticism when confirming the presence or absence of concussion. Intriguingly, several possible explanations have been proposed for the pathological mechanisms linking concussion and some neurodegenerative disorders (such as ALS and AD), but the two major events are the deposition of tau proteins (abnormal microtubule-associated proteins) and neuroinflammation, which ranges from glutamate excitotoxicity pathways and inflammatory pathways (which lead to a rise in the metabolic demands of microglial cells and neurons) to mitochondrial dysfunction via oxidative pathways.

Keywords: amyotrophic lateral sclerosis, Alzheimer's disease, mild traumatic brain injury, neurodegeneration

Procedia PDF Downloads 88
583 Radiation Risks for Nurses: The Unrecognized Consequences of ERCP Procedures

Authors: Ava Zarif Sanayei, Sedigheh Sina

Abstract:

Despite the advancement of radiation-free interventions in the gastrointestinal and hepatobiliary fields, endoscopy and endoscopic retrograde cholangiopancreatography (ERCP) remain indispensable procedures that necessitate radiation exposure. ERCP, in particular, relies heavily on radiation-guided imaging to ensure precise delivery of therapy. Meanwhile, interventional radiology (IR) procedures also utilize imaging modalities like X-rays and CT scans to guide therapy, often under local anesthesia via small needle insertion. However, the complexity of these procedures raises concerns about radiation exposure to healthcare professionals, including nurses, who play a crucial role in these interventions. This study aims to assess the radiation exposure to the hands and fingers of nurses 1 and 2, who are directly involved in ERCP procedures, using thermoluminescent (TLD-100) dosimeters at the Gastrointestinal Endoscopy department of a clinic in Shiraz, Iran. The dosimeters were initially calibrated using various phantoms; a group of dosimeters was then prepared and used over a two-month period. For personal equivalent dose measurement, two TLD chips were mounted on a finger ring to monitor exposure to the hands and fingers. Upon completion of the monitoring period, the TLDs were analyzed using a TLD reader, showing that Nurse 1 received an equivalent dose of 298.26 µSv and Nurse 2 an equivalent dose of 195.39 µSv. The investigation revealed that the total radiation exposure to the nurses did not exceed the annual limit for occupational exposure. Nevertheless, it is essential to prioritize radiation protection measures to prevent potential harm. The study showed that the positioning of the two staff members in a specific location led to roughly equal doses. To reduce exposure further, we suggest providing education and training on radiation safety principles, particularly for technologists.
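A back-of-envelope extrapolation (ours, not the authors') puts the reported two-month finger doses well below the ICRP occupational equivalent dose limit for the extremities, taken here as 500 mSv per year:

```python
# Extrapolate the reported two-month finger doses to an annual figure
# and compare against the ICRP extremity limit (500 mSv/year).
# Illustrative only; assumes exposure is uniform across the year.
EXTREMITY_LIMIT_USV = 500_000  # 500 mSv expressed in microsieverts

two_month_doses_usv = {"Nurse 1": 298.26, "Nurse 2": 195.39}

for nurse, dose in two_month_doses_usv.items():
    annual = dose * 6  # six two-month periods per year
    print(f"{nurse}: ~{annual:.0f} uSv/yr, "
          f"{annual / EXTREMITY_LIMIT_USV:.3%} of the limit")
```

Even the higher dose extrapolates to under 2 mSv per year, consistent with the abstract's conclusion that the annual limit was not exceeded.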

Keywords: dose measurement, ERCP, interventional radiology, medical imaging

Procedia PDF Downloads 33
582 A Compact Via-less Ultra-Wideband Microstrip Filter by Utilizing Open-Circuit Quarter Wavelength Stubs

Authors: Muhammad Yasir Wadood, Fatemeh Babaeian

Abstract:

With the development of ultra-wideband (UWB) systems, there is a high demand for UWB filters with low insertion loss and wide bandwidth that have a planar structure compatible with the other components of a UWB system. A microstrip interdigital filter is a great option for designing UWB filters. However, the presence of via holes in this structure creates difficulties in the fabrication procedure of the filter. Especially in the higher frequency band, any misalignment of the drilled via hole with the microstrip stubs causes large errors in the measurement results compared to the desired results. Moreover, in high-frequency designs the line width of the stubs is very narrow, so highly precise small via holes are required, which increases the cost of fabrication significantly and carries a risk of fabrication errors. To combat this issue, in this paper, a via-less UWB microstrip filter is proposed which is designed as a modification of a conventional interdigital bandpass filter. The novel approaches in this filter design are 1) the replacement of each via hole with a quarter-wavelength open-circuit stub to avoid manufacturing complexity, 2) the use of a bend structure to reduce unwanted coupling effects, and 3) the minimisation of the size. Using the proposed structure, a UWB filter operating in the frequency band of 3.9-6.6 GHz (1-dB bandwidth) was designed and fabricated. The promising results of the simulation and measurement are presented in this paper. The selected substrate for these designs was Rogers RO4003 with a thickness of 20 mils, a common substrate in industrial projects. The compact size of the proposed filter is highly beneficial for applications which require very miniature hardware.
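The physical length of such a quarter-wavelength open-circuit stub follows from the guided wavelength. The sketch below assumes an effective permittivity of about 2.6 for a line on RO4003 (relative permittivity 3.38); that value and the choice of band centre are our illustrative assumptions, not figures from the paper:

```python
# Quarter-wavelength stub length from the guided wavelength:
# l = c / (4 * f * sqrt(eps_eff))
from math import sqrt

C = 299_792_458.0  # speed of light in vacuum, m/s

def quarter_wave_stub_mm(freq_hz: float, eps_eff: float) -> float:
    """Physical length of a quarter-wavelength stub, in millimetres."""
    guided_wavelength = C / (freq_hz * sqrt(eps_eff))
    return guided_wavelength / 4 * 1000

# Centre of the 3.9-6.6 GHz passband, assumed eps_eff ~ 2.6.
print(round(quarter_wave_stub_mm(5.25e9, 2.6), 2))
```

This gives a stub on the order of 9 mm, which illustrates why at these frequencies small placement errors (e.g. of a via) are a significant fraction of the structure.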

Keywords: band-pass filters, inter-digital filter, microstrip, via-less

Procedia PDF Downloads 155
581 Improving Fake News Detection Using K-means and Support Vector Machine Approaches

Authors: Kasra Majbouri Yazdi, Adel Majbouri Yazdi, Saeid Khodayi, Jingyu Hou, Wanlei Zhou, Saeed Saedy

Abstract:

Fake news and false information are big challenges for all types of media, especially social media. There is a lot of false information, fake likes, views, and duplicated accounts, as big social networks such as Facebook and Twitter have admitted. Much of the information appearing on social media is doubtful and in some cases misleading. Such content needs to be detected as soon as possible to avoid a negative impact on society. The dimensions of fake news datasets are growing rapidly, so to detect false information with less computation time and complexity, the dimensions need to be reduced. One of the best techniques for reducing data size is feature selection. The aim of this technique is to choose a feature subset from the original set to improve classification performance. In this paper, a feature selection method is proposed that integrates K-means clustering and Support Vector Machine (SVM) approaches and works in four steps. First, the similarities between all features are calculated. Then, features are divided into several clusters. Next, the final feature set is selected from all clusters, and finally, fake news is classified based on the final feature subset using the SVM method. The proposed method was evaluated by comparing its performance with other state-of-the-art methods on several benchmark datasets, and the outcome showed that our method classifies false information better. The detection performance was improved in two aspects: on the one hand, the detection runtime decreased, and on the other hand, the classification accuracy increased because of the elimination of redundant features and the reduction of dataset dimensions.
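The four steps above can be sketched as follows on synthetic data. Clustering the feature vectors (columns of the data matrix) as the similarity step, and keeping the feature closest to each centroid as the selection rule, are our assumptions rather than the authors' exact formulation:

```python
# A minimal sketch of the four-step pipeline: compute feature
# similarities, cluster the features, pick one representative per
# cluster, then classify with an SVM on the reduced feature set.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

X, y = make_classification(n_samples=300, n_features=40, n_informative=8,
                           random_state=0)

# Steps 1-2: group similar features by clustering the feature vectors
# (columns of X), so correlated features land in the same cluster.
k = 10
km = KMeans(n_clusters=k, n_init=10, random_state=0).fit(X.T)

# Step 3: keep one representative per cluster, namely the feature
# closest to its cluster centroid.
selected = []
for c in range(k):
    members = np.where(km.labels_ == c)[0]
    dists = np.linalg.norm(X.T[members] - km.cluster_centers_[c], axis=1)
    selected.append(members[np.argmin(dists)])

# Step 4: classify on the reduced feature set with an SVM.
X_tr, X_te, y_tr, y_te = train_test_split(X[:, selected], y, random_state=0)
clf = SVC(kernel="rbf").fit(X_tr, y_tr)
print(f"accuracy on {len(selected)} of 40 features:", clf.score(X_te, y_te))
```

The point of the sketch is the shape of the pipeline: the SVM only ever sees the 10 representative features, which is where the runtime saving described above comes from.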

Keywords: clustering, fake news detection, feature selection, machine learning, social media, support vector machine

Procedia PDF Downloads 176
580 Adaptive Certificate-Based Mutual Authentication Protocol for Mobile Grid Infrastructure

Authors: H. Parveen Begam, M. A. Maluk Mohamed

Abstract:

Mobile grid computing is an environment that allows sharing and coordinated use of diverse resources in a dynamic, heterogeneous, and distributed setting using different types of portable electronic devices. In a grid environment, security issues such as authentication, authorization, message protection, and delegation are handled by the Grid Security Infrastructure (GSI). Providing strong security between mobile devices and the grid infrastructure is a major issue because of the open nature of wireless networks and the heterogeneous, distributed environment. Although the individual computing devices in a mobile grid may be resource-limited in isolation, as an aggregate they have the potential to play a vital role within the mobile grid environment. An adaptive methodology is needed to address issues such as authentication of a base station, security of information flowing between a mobile user and a base station, prevention of attacks within a base station, hand-over of authentication information, the communication cost of establishing a session key between mobile user and base station, and the computational complexity of achieving authenticity and security. The sharing of device resources can be achieved only through trusted relationships between the mobile hosts (MHs), so before accessing a grid service, a mobile device must be proven authentic. This paper proposes a dynamic certificate-based mutual authentication protocol between two mobile hosts in a mobile grid environment. Certificates are generated by a Certificate Authority (CA) for all authenticated MHs. Security (through the validity period of the certificate) and dynamicity (through the transmission time) are achieved with secure service certificates. The authentication protocol is built on communication services to provide cryptographically secured mechanisms for verifying the identity of users and resources.
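
A minimal sketch of the certificate issue/verify cycle is shown below. For brevity an HMAC keyed by the CA stands in for a real digital signature, and the identity keys and session-key derivation are illustrative only, not the paper's protocol:

```python
import hmac, hashlib, os, time

CA_KEY = os.urandom(32)          # stand-in for the CA's signing key

def issue_certificate(host_id, host_key, lifetime_s=3600):
    """CA issues a certificate binding host_id to a key, with a validity period."""
    expiry = int(time.time()) + lifetime_s
    payload = f"{host_id}|{host_key.hex()}|{expiry}".encode()
    sig = hmac.new(CA_KEY, payload, hashlib.sha256).hexdigest()
    return {"payload": payload, "sig": sig}

def verify_certificate(cert):
    """Any party trusting the CA checks the signature and the expiry."""
    good_sig = hmac.new(CA_KEY, cert["payload"], hashlib.sha256).hexdigest()
    if not hmac.compare_digest(good_sig, cert["sig"]):
        return False
    expiry = int(cert["payload"].rsplit(b"|", 1)[1])
    return time.time() < expiry

# Mutual authentication between two mobile hosts MH-A and MH-B:
key_a, key_b = os.urandom(32), os.urandom(32)
cert_a = issue_certificate("MH-A", key_a)
cert_b = issue_certificate("MH-B", key_b)

# Each host verifies the other's certificate before deriving a session key
# (a real protocol would use a key-agreement scheme here, not concatenation).
assert verify_certificate(cert_a) and verify_certificate(cert_b)
session_key = hashlib.sha256(key_a + key_b).hexdigest()
print("mutual authentication OK, session key:", session_key[:16], "...")
```

The validity period embedded in the payload is what gives the abstract's "security through the validity period of the certificate": verification fails automatically once the certificate expires.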

Keywords: mobile grid computing, certificate authority (CA), SSL/TLS protocol, secured service certificates

Procedia PDF Downloads 305
579 Compliance of Systematic Reviews in Ophthalmology with the PRISMA Statement

Authors: Seon-Young Lee, Harkiran Sagoo, Reem Farwana, Katharine Whitehurst, Alex Fowler, Riaz Agha

Abstract:

Background/Aims: Systematic reviews and meta-analyses are an increasingly important way of summarizing research evidence. Research in ophthalmology may present further challenges because of the potential complexity of study designs. The aim of our study was to determine the compliance of systematic reviews and meta-analyses in ophthalmology with the PRISMA statement, by assessing articles published between 2010 and 2015 in five major journals with the highest impact factors. Methods: MEDLINE and EMBASE were used to search for systematic reviews published between January 2010 and December 2015 in five major ophthalmology journals: Progress in Retinal and Eye Research, Ophthalmology, Archives of Ophthalmology, American Journal of Ophthalmology, and Journal of the American Optometric Association. Screening, identification, and scoring of articles were performed independently by two teams, followed by statistical analysis including the median, range, and 95% CIs. Results: 115 articles were included. The median PRISMA score was 15 of 27 items (56%), with a range of 5-26 (19-96%) and 95% CI 13.9-16.1 (51-60%). Compliance was highest for items related to the description of rationale (item 3, 100%) and the inclusion of a structured summary in the abstract (item 2, 90%), and poorest for indication of a review protocol and registration (item 5, 9%), specification of risk of bias affecting the cumulative evidence (item 15, 24%), and description of clear objectives in the introduction (item 4, 26%). Conclusion: The reporting quality of systematic reviews and meta-analyses in ophthalmology needs significant improvement. While the use of the PRISMA criteria as a guideline before journal submission is recommended, additional research identifying potential barriers may be required to improve compliance with the PRISMA guidelines.
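
The summary statistics used here (median, range, 95% CI over per-article PRISMA scores out of 27) can be reproduced with a few lines. The score list below is hypothetical, since the study's per-article data are not included in the abstract, and the CI shown is a normal approximation for the mean rather than the study's CI for the median:

```python
import statistics, math

# Hypothetical per-article PRISMA scores (0-27 items met).
scores = [5, 9, 12, 13, 14, 15, 15, 16, 17, 19, 21, 26]

median = statistics.median(scores)
lo, hi = min(scores), max(scores)
mean = statistics.mean(scores)
sd = statistics.stdev(scores)
half = 1.96 * sd / math.sqrt(len(scores))   # normal-approximation 95% CI
print(f"median {median}/27 ({median / 27:.0%}), range {lo}-{hi}, "
      f"95% CI for the mean {mean - half:.1f}-{mean + half:.1f}")
```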

Keywords: systematic reviews, meta-analysis, research methodology, reporting quality, PRISMA, ophthalmology

Procedia PDF Downloads 262
578 The Regional Novel in India: Its Emergence and Trajectory

Authors: Aruna Bommareddi

Abstract:

The journey of the novel in India is well examined in Indian academia as an offshoot of the novel in English, and there have been many attempts to understand aspects of the early Indian novel that share a commonality with the English novel. The regional novel has had an entirely different trajectory, which is mapped in this paper. The main focus of the paper is the historical emergence of the genre of the regional novel in Indian literatures, with specific reference to Kannada, Hindi, and Bengali. The selection of these languages is guided not only by familiarity with them but also by the significance these languages enjoy in the subcontinent and by the emergence of the regional novel as a specific category in them. The regional novels under study are Phaneeswaranath Renu's Maila Anchal, Tarashankar Bandopadhyaya's Ganadevata, and Kuvempu's House of Kanuru, explored for the themes of their emergence and for the aspects of the regional novel they share with, and in which they differ from, one another. The paper explores the various movements that have shaped the genre of the regional novel in these literatures. Though Phaneeswaranath Renu's Maila Anchal was published in 1956, the novel is set in pre-Independence India and therefore shares a commonality of themes with the other two novels, House of Kanuru and Ganadevata. All three novels explore themes of superstition, ignorance, poverty, and the interventions of educated youth to salvage the crises of these backward regional worlds. In fact, it was Renu who assertively declared that he was going to write a regional novel; hence the title of the first regional novel in Hindi is Maila Anchal, meaning 'the soiled border'. In Hindi, anchal also means region; the title is therefore suggestive of a soiled region as well. The novel exposes the squalor, ignorance, and conflict-ridden life of the village, or region, as opposed to the rosy image of the village in literature.
With this, all novels depicting conflicts of the region came to be recognized as regional novels, even if they were written prior to Renu's declaration. All three novels under study succeed in bringing out the complexity of rural life at a given point in its history.

Keywords: bengali, hindi, kannada, regional novel, telugu

Procedia PDF Downloads 78
577 A Novel Method for Face Detection

Authors: H. Abas Nejad, A. R. Teymoori

Abstract:

Facial expression recognition is one of the open problems in computer vision. Robust neutral face recognition in real time is a major challenge for various supervised-learning-based facial expression recognition methods, because supervised methods cannot accommodate all appearance variability across faces with respect to race, pose, lighting, facial biases, etc., in a limited amount of training data. Moreover, processing each and every frame to classify emotions is not required, as the user stays neutral for the majority of the time in typical applications like video chat or photo album/web browsing. Detecting the neutral state at an early stage and bypassing those frames in emotion classification saves computational power. In this work, we propose a lightweight neutral-vs-emotion classification engine, which acts as a preprocessor to traditional supervised emotion classification approaches. It dynamically learns the neutral appearance at Key Emotion (KE) points using a textural statistical model constructed from a set of reference neutral frames for each user. The proposed method is made robust to various types of user head motion by accounting for affine distortions based on the textural statistical model. Robustness to dynamic shifts of the KE points is achieved by evaluating similarities on a subset of neighborhood patches around each KE point, using prior information about the directionality of the specific facial action units acting on that KE point. As a result, the proposed method improves emotion recognition (ER) accuracy and simultaneously reduces the computational complexity of the ER system, as validated on multiple databases.
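
A common concrete choice for such a textural statistical model is the Local Binary Pattern (LBP) histogram, listed in the keywords below. The sketch here shows an LBP histogram over a patch and a chi-square comparison against a reference; the patch sizes, random patches, and threshold-free comparison are illustrative, not the paper's exact model:

```python
import numpy as np

def lbp_histogram(patch):
    """8-neighbour Local Binary Pattern codes over a grayscale patch,
    returned as a normalised 256-bin histogram."""
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1), (1, 1), (1, 0), (1, -1), (0, -1)]
    codes = []
    for i in range(1, patch.shape[0] - 1):
        for j in range(1, patch.shape[1] - 1):
            code = 0
            for bit, (di, dj) in enumerate(offsets):
                if patch[i + di, j + dj] >= patch[i, j]:
                    code |= 1 << bit
            codes.append(code)
    hist, _ = np.histogram(codes, bins=256, range=(0, 256))
    return hist / hist.sum()

def chi_square(h1, h2, eps=1e-10):
    """Chi-square distance between two histograms; small = similar texture."""
    return 0.5 * np.sum((h1 - h2) ** 2 / (h1 + h2 + eps))

rng = np.random.default_rng(1)
neutral_ref = rng.integers(0, 256, (16, 16))                 # stand-in neutral KE patch
perturbed = neutral_ref + rng.integers(-5, 6, (16, 16))      # mild appearance change
unrelated = rng.integers(0, 256, (16, 16))                   # unrelated patch

d_same = chi_square(lbp_histogram(neutral_ref), lbp_histogram(perturbed))
d_other = chi_square(lbp_histogram(neutral_ref), lbp_histogram(unrelated))
print(f"chi-square distance: perturbed={d_same:.3f}, unrelated={d_other:.3f}")
```

In a neutral-vs-emotion preprocessor, a frame whose patch distances to the neutral reference stay below a learned threshold would be bypassed before the heavier supervised classifier runs.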

Keywords: neutral vs. emotion classification, Constrained Local Model, procrustes analysis, Local Binary Pattern Histogram, statistical model

Procedia PDF Downloads 336
576 Rights, Differences and Inclusion: The Role of Transdisciplinary Approach in the Education for Diversity

Authors: Ana Campina, Maria Manuela Magalhaes, Eusebio André Machado, Cristina Costa-Lobo

Abstract:

The inclusive school advocates respect for differences, equal opportunities, and a quality education for all, including students with special educational needs. In the pursuit of educational equity, guaranteeing equality in access and outcomes, it becomes the responsibility of the school to recognize students' needs, adapting to their various styles and rhythms of learning and ensuring the adequacy of curricula, strategies, and resources, both material and human. This paper presents a set of theoretical reflections at the disciplinary interface between the legal and education sciences and school administration and management, with the aim of understanding the real characteristics of inclusion in relation to inclusion policies and the need for an education in human rights, especially for diversity. Considering today's social complexity alongside the important educational instruments and strategies, most of them enshrined in policy, this paper aims to expose the existing contexts that stand in opposition to the laws, policies, and educational needs of inclusion. More than a single study, this research aims to develop a map of the reality and guidelines for implementing action. The results point to the usefulness and pertinence of a school in which educational managers, teachers, parents, and students are involved in the creation, implementation, and monitoring of flexible curricula adapted to students' educational needs, promoting collaborative work among teachers. We are then faced with a scenario that points to the need to reflect on the legislation and curricular management of inclusive classes and to operationalize the processes of elaborating curricular adaptation and differentiation in the classroom. The transdisciplinary approach is well suited to pedagogy and social education, using the human rights binomial of teaching and learning, supported by inclusion laws and aligned with the realistic needs of an effective, successful construction of society.

Keywords: rights, transdisciplinary, inclusion policies, education for diversity

Procedia PDF Downloads 387
575 A Machine Learning Approach for Detecting and Locating Hardware Trojans

Authors: Kaiwen Zheng, Wanting Zhou, Nan Tang, Lei Li, Yuanhang He

Abstract:

The integrated circuit industry has become a cornerstone of the information society, finding widespread application in areas such as industry, communication, medicine, and aerospace. However, with the increasing complexity of integrated circuits, Hardware Trojans (HTs) implanted by attackers have become a significant threat to their security. In this paper, we propose a hardware trojan detection method for large-scale circuits. Because HTs introduce additional redundant circuitry and thereby change physical characteristics such as structure, area, and power consumption, we propose a machine-learning-based detection method built on the physical characteristics of gate-level netlists. This method transforms hardware trojan detection into a binary machine-learning classification problem over physical characteristics, greatly improving detection speed. To address the problem of imbalanced data, where the number of trojan-free circuit samples is far smaller than the number of HT circuit samples, we used the SMOTETomek algorithm to expand the dataset and further improve classifier performance. We trained and validated three machine learning algorithms, K-Nearest Neighbors, Random Forest, and Support Vector Machine, on benchmark circuits from Trust-Hub, and all achieved good results. In case studies based on the AES encryption circuits provided by Trust-Hub, the test results showed the effectiveness of the proposed method. To further validate its effectiveness for detecting variant HTs, we designed variants using open-source HTs. The proposed method guarantees robust detection accuracy with millisecond-level detection times in IC and FPGA design flows, and shows good detection performance on library-variant HTs.
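
As a rough illustration of the classification step, the sketch below builds synthetic gate-level feature vectors, oversamples the minority class (naive random oversampling as a stand-in for SMOTETomek, which additionally synthesises minority points and cleans class boundaries), and applies one of the three classifiers mentioned, k-nearest neighbours. All features, class proportions, and numbers are invented for the example:

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic gate features: [fan-in, fan-out, est. switching activity, area].
# Here the minority class (label 1) sits in a low-activity region.
majority = rng.normal([3, 2, 0.5, 1.0], 0.1, size=(300, 4))
minority = rng.normal([3, 2, 0.05, 1.2], 0.1, size=(15, 4))
X = np.vstack([majority, minority])
y = np.array([0] * 300 + [1] * 15)

# Naive random oversampling of the minority class to balance the dataset.
idx = rng.choice(np.where(y == 1)[0], size=300 - 15)
Xb = np.vstack([X, X[idx]])
yb = np.concatenate([y, y[idx]])

def knn_predict(X_train, y_train, x, k=5):
    """Plain k-nearest-neighbours majority vote."""
    d = np.linalg.norm(X_train - x, axis=1)
    votes = y_train[np.argsort(d)[:k]]
    return int(votes.sum() > k // 2)

probe_trojan = np.array([3.0, 2.0, 0.04, 1.2])   # low switching activity
probe_normal = np.array([3.0, 2.0, 0.55, 1.0])
print("suspicious gate flagged:", knn_predict(Xb, yb, probe_trojan))
print("ordinary gate flagged:", knn_predict(Xb, yb, probe_normal))
```

Once the netlist features are extracted, each prediction is a single nearest-neighbour query, which is what makes millisecond-level detection per design plausible.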

Keywords: hardware trojans, physical properties, machine learning, hardware security

Procedia PDF Downloads 145
574 Control of a Quadcopter Using Genetic Algorithm Methods

Authors: Mostafa Mjahed

Abstract:

This paper concerns the control of a nonlinear system using two different methods: a reference model and a genetic algorithm. The quadcopter is a nonlinear, unstable system belonging to the family of aerial robots. It consists of four rotors placed at the ends of a cross, whose center is occupied by the control circuit. Its motion is governed by six degrees of freedom: three rotations around the axes (roll, pitch, and yaw) and three spatial translations. The control of such a system is complex because of the nonlinearity of its dynamic representation and the number of parameters involved. Numerous studies have been devoted to modeling and stabilizing such systems, and the classical PID and LQ correction methods are widely used. While these methods have the advantage of simplicity, being linear, they have the drawback of requiring a linear model for synthesis; this also complicates the resulting control laws, which must be extended over the whole flight envelope of the quadcopter. Note that, although classical design methods are widely used to control aeronautical systems, artificial intelligence methods such as the genetic algorithm technique have received little attention. In this paper, we compare two PID design methods. First, the parameters of the PID are calculated according to a reference model; in a second phase, these parameters are established using genetic algorithms. By reference model, we mean that the corrected system behaves according to a reference system imposed by certain specifications: settling time, zero overshoot, etc. Inspired by natural evolution and Darwin's theory of the survival of the fittest, John Holland developed this evolutionary algorithm. A genetic algorithm (GA) possesses three basic operators: selection, crossover, and mutation. Iterations start with an initial population, and each member of this population is evaluated through a fitness function.
Our purpose is to correct the behavior of the quadcopter around its three axes (roll, pitch, and yaw) with three PD controllers; for the altitude, we adopt a PID controller.
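
The GA loop described (selection, crossover, mutation, fitness) can be sketched on a toy single-axis model. The double-integrator plant, gain ranges, and GA rates below are illustrative choices, not the paper's quadcopter model or tuning:

```python
import random

random.seed(7)

def fitness(gains, setpoint=1.0, dt=0.01, steps=400):
    """Tracking cost of a PD controller on a toy double-integrator
    attitude model (angle'' = torque); lower is better."""
    kp, kd = gains
    angle, rate, cost = 0.0, 0.0, 0.0
    for _ in range(steps):
        err = setpoint - angle
        torque = kp * err - kd * rate
        rate += torque * dt
        angle += rate * dt
        cost += abs(err) * dt
    return cost

def evolve(pop_size=30, generations=40):
    """Selection (tournament), crossover (blend) and mutation on (Kp, Kd) pairs."""
    pop = [(random.uniform(0, 50), random.uniform(0, 20)) for _ in range(pop_size)]
    for _ in range(generations):
        def pick():                                  # binary tournament selection
            a, b = random.sample(pop, 2)
            return a if fitness(a) < fitness(b) else b
        children = []
        while len(children) < pop_size:
            p1, p2 = pick(), pick()
            w = random.random()                      # blend crossover
            child = [w * p1[i] + (1 - w) * p2[i] for i in range(2)]
            if random.random() < 0.2:                # gaussian mutation
                child[random.randrange(2)] += random.gauss(0, 2)
            children.append((max(child[0], 0.0), max(child[1], 0.0)))
        pop = children
    return min(pop, key=fitness)

best = evolve()
print(f"best gains Kp={best[0]:.1f}, Kd={best[1]:.1f}, cost={fitness(best):.3f}")
```

In the reference-model variant, the same fitness function would instead penalise the deviation of the corrected system's response from a reference response meeting the settling-time and overshoot specifications.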

Keywords: quadcopter, genetic algorithm, PID, fitness, model, control, nonlinear system

Procedia PDF Downloads 431
573 Assessing the Financial Impact of Federal Benefit Program Enrollment on Low-income Households

Authors: Timothy Scheinert, Eliza Wright

Abstract:

Background: Link Health is a Boston-based non-profit leveraging in-person and digital platforms to promote health equity. Its primary aim is to financially support low-income individuals through enrollment in federal benefit programs. This study examines the monetary impact of enrollment in several benefit programs. Methods: Approximately 17,000 individuals have been screened for eligibility via digital outreach, community events, and in-person clinics. Enrollment and financial distributions were evaluated across programs, including the Affordable Connectivity Program (ACP), Lifeline, LIHEAP, Transitional Aid to Families with Dependent Children (TAFDC), and the Supplemental Nutrition Assistance Program (SNAP). Major Findings: A total of 1,895 individuals successfully applied, collectively receiving an estimated $1,288,152 in aid. The largest contributors to this sum include: ACP: 1,149 enrollments, $413,640 distributed annually; Child Care Financial Assistance (CCFA): 15 enrollments, $240,000 distributed annually; Lifeline: 602 enrollments, $66,822 distributed annually; LIHEAP: 25 enrollments, $48,750 distributed annually; SNAP: 41 enrollments, $123,000 distributed annually; TAFDC: 21 enrollments, $341,760 distributed annually. Conclusions: These results highlight the role of targeted outreach and effective enrollment processes in promoting access to federal benefit programs. High enrollment in ACP and Lifeline demonstrates a considerable need for affordable broadband and internet services, while programs like CCFA and TAFDC, despite lower enrollment numbers, provide sizable support per individual. This analysis advocates continued funding of federal benefit programs. Future efforts could develop screening tools that identify eligibility for multiple programs at once and reduce the complexity of enrollment.
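
The per-program figures reported above can be cross-checked with a few lines. Note that the six itemised programs account for $1,233,972 of the estimated $1,288,152 total; the abstract lists them only as the "largest contributors", so the remainder presumably comes from programs not itemised:

```python
# Reported annual distributions per program, taken from the abstract.
programs = {
    "ACP":      {"enrollments": 1149, "annual_usd": 413_640},
    "CCFA":     {"enrollments":   15, "annual_usd": 240_000},
    "Lifeline": {"enrollments":  602, "annual_usd":  66_822},
    "LIHEAP":   {"enrollments":   25, "annual_usd":  48_750},
    "SNAP":     {"enrollments":   41, "annual_usd": 123_000},
    "TAFDC":    {"enrollments":   21, "annual_usd": 341_760},
}

listed_total = sum(p["annual_usd"] for p in programs.values())
for name, p in programs.items():
    per_head = p["annual_usd"] / p["enrollments"]
    print(f"{name:9s} ${per_head:>10,.2f} per enrollee per year")
print(f"listed programs total: ${listed_total:,}")
```

The per-enrollee view makes the abstract's point about CCFA and TAFDC quantitative: ACP works out to $360 per enrollee per year, while CCFA reaches $16,000.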

Keywords: benefits, childcare, connectivity, equity, nutrition

Procedia PDF Downloads 25
572 Decision Making in Medicine and Treatment Strategies

Authors: Kamran Yazdanbakhsh, Somayeh Mahmoudi

Abstract:

Three reasons argue for the use of decision theory in medicine: 1. The growth and complexity of medical knowledge make it difficult to use treatment information effectively without resorting to sophisticated analytical methods, especially when it comes to detecting errors and identifying treatment opportunities in large databases. 2. There is wide geographic variability in medical practice. In a context where medical costs are borne, at least in part, by the patient, these variations raise doubts about the relevance of the choices made by physicians. The differences are generally attributed to differing estimates of the probability of treatment success and differing assessments of the outcomes of success or failure; without explicit decision criteria, it is difficult to identify precisely the sources of these variations in treatment. 3. Beyond the principle of informed consent, patients need to be involved in decision-making, and for this, the decision process should be made explicit and broken down. A decision problem is to select the best option among a set of choices. The problem is what is meant by 'best option', that is, what criteria guide the choice; the purpose of decision theory is to answer this question. The systematic use of decision models allows us to better understand differences in medical practice and facilitates the search for consensus. Here, three types of situations arise: certain situations, risky situations, and uncertain situations: 1. In certain situations, the consequences of each decision are certain. 2. In risky situations, every decision can have several consequences, and the probability of each consequence is known. 3. In uncertain situations, each decision can have several consequences, but their probabilities are not known. Our aim in this article is to show how decision theory can usefully be mobilized to meet the needs of physicians.
Decision theory can make decisions more transparent: first, by clarifying the data systematically considered in the problem, and second, by asking which basic principles should guide the choice. Once the problem is clarified, decision theory provides operational tools to represent the available information and determine patient preferences, and thus to assist the patient and doctor in their choices.
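
For the "risky situation" case, where outcome probabilities are known, the standard operational tool is expected utility. The treatments, probabilities, and utilities below are invented for illustration, not clinical figures:

```python
# Each treatment has known (probability, utility) outcome pairs on a 0-100
# utility scale; all numbers are illustrative, not clinical data.
treatments = {
    "surgery":       [(0.70, 95), (0.25, 40), (0.05, 0)],
    "medication":    [(0.50, 80), (0.45, 60), (0.05, 30)],
    "watchful wait": [(0.30, 75), (0.70, 50)],
}

def expected_utility(outcomes):
    """Probability-weighted average utility of a treatment's outcomes."""
    assert abs(sum(p for p, _ in outcomes) - 1.0) < 1e-9  # probabilities sum to 1
    return sum(p * u for p, u in outcomes)

for name, outcomes in treatments.items():
    print(f"{name:13s} EU = {expected_utility(outcomes):.1f}")

best = max(treatments, key=lambda t: expected_utility(treatments[t]))
print("best option under expected utility:", best)
```

The transparency the text describes comes from exactly this decomposition: the probabilities encode the available evidence, while the utilities encode the patient's preferences, so the two can be discussed and revised separately.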

Keywords: decision making, medicine, treatment strategies, patient

Procedia PDF Downloads 578
571 An Unsupervised Domain-Knowledge Discovery Framework for Fake News Detection

Authors: Yulan Wu

Abstract:

With the rapid development of social media, the issue of fake news has gained considerable prominence, drawing the attention of both the public and governments. The widespread dissemination of false information poses a tangible threat across multiple domains of society, including politics, the economy, and health. However, much research has concentrated on supervised models trained within specific domains, and their effectiveness diminishes when applied to identifying fake news across multiple domains. To solve this problem, some approaches based on domain labels have been proposed: by assigning news to its specific area in advance, judgments within the corresponding field can be more accurate. However, these approaches disregard the fact that news records can pertain to multiple domains, resulting in a significant loss of valuable information; in addition, the datasets used for training must all be domain-labeled, which creates unnecessary complexity. To solve these problems, an unsupervised domain-knowledge discovery framework for fake news detection is proposed. First, to effectively retain the multi-domain knowledge of the text, a low-dimensional domain-embedding vector is generated for each news text. Subsequently, a feature extraction module utilizing the unsupervisedly discovered domain embeddings extracts comprehensive features of the news. Finally, a classifier determines the authenticity of the news. To verify the proposed framework, tests were conducted on existing widely used datasets, and the experimental results demonstrate that this method improves detection performance for fake news across multiple domains. Moreover, even on datasets that lack domain labels, the method can still effectively transfer domain knowledge, reducing the time consumed by tagging without sacrificing detection accuracy.
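
One simple way to obtain label-free, low-dimensional domain vectors of the kind described is a truncated SVD of a document-term matrix. The tiny matrix and the choice of SVD here are illustrative, not the framework's actual embedding module:

```python
import numpy as np

# Toy document-term counts (rows = news items, columns = vocabulary terms);
# a real system would build this from tokenised text.
counts = np.array([
    [4, 3, 0, 0, 1, 0],   # politics-heavy item
    [5, 2, 1, 0, 0, 0],
    [0, 0, 4, 5, 0, 1],   # health-heavy item
    [0, 1, 5, 3, 0, 0],
    [1, 0, 0, 1, 4, 5],   # economy-heavy item
    [0, 0, 1, 0, 5, 4],
], dtype=float)

# Unsupervised "domain embedding": truncated SVD keeps a k-dimensional vector
# per item without any domain labels. An item mixing several domains gets
# mixed coordinates rather than one hard label, which is the point above.
U, S, Vt = np.linalg.svd(counts, full_matrices=False)
k = 3
domain_embeddings = U[:, :k] * S[:k]        # one k-dim vector per news item
print(domain_embeddings.round(2))
```

These embeddings would then be concatenated with the other extracted text features and passed to the final real/fake classifier.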

Keywords: fake news, deep learning, natural language processing, multiple domains

Procedia PDF Downloads 96
570 Efficient Compact Micro Dielectric Barrier Discharge (DBD) Plasma Reactor for Ozone Generation for Industrial Application in Liquid and Gas Phase Systems

Authors: D. Kuvshinov, A. Siswanto, J. Lozano-Parada, W. Zimmerman

Abstract:

Ozone is well known as a powerful oxidant with fast reaction rates. Ozone-based processes leave no by-products, as unreacted ozone decays back to ordinary oxygen molecules; the application of ozone is therefore widely accepted as one of the main directions for the development of sustainable and clean technologies. A number of technologies require ozone to be delivered to specific points of a production network or reactor construction. Due to space constraints, high reactivity, and the short lifetime of ozone, the use of ozone generators, even at bench-top scale, is practically limited. This requires the development of a mini/micro-scale ozone generator that can be incorporated directly into production units. Our report presents a feasibility study of a new micro-scale reactor for ozone generation (MROG). Data on MROG calibration and indigo decomposition at different operating conditions are presented. At the selected operating conditions, with a residence time of 0.25 s, the process of ozone generation is not limited by reaction rate, and the amount of ozone produced is a function of the power applied. It was shown that the MROG is capable of producing ozone at voltage levels starting from 3.5 kV, with an ozone concentration of 5.28E-6 mol/L at 5 kV. This is in line with the data presented in a numerical investigation of the MROG. It was shown that, compared with a conventional ozone generator, the MROG has lower power consumption at low voltages and atmospheric pressure. The MROG construction makes it applicable to submerged (liquid-phase) and dry (gas-phase) systems. With a robust, compact design, the MROG can be used as an incorporated unit in production lines of high complexity.

Keywords: dielectric barrier discharge (DBD), micro reactor, ozone, plasma

Procedia PDF Downloads 335
569 Interrogating Bishwas: Reimagining a Christian Neighbourhood in Kolkata, India

Authors: Abhijit Dasgupta

Abstract:

This paper explores the everyday lives of the Christians residing in a Bengali Christian neighbourhood in Kolkata, termed here the larger Christian para (para meaning neighbourhood in Bengali). Through ethnography and a reading of secondary sources, the paper discerns how Christians across denominations (Protestants, Catholics, and Pentecostals) invoke the role of bishwas (faith and belief) in their interpersonal neighbourhood relations. The paper attempts to capture the role of bishwas in producing, transforming, and revising the meaning of 'neighbourhood' and 'neighbours', and puts forward the argument that the neighbourhood is a theological product. By interrogating and interpreting bishwas through everyday theological discussions and reflections, the paper examines the ways in which everyday theology becomes an essential source of power and knowledge for the Bengali Christians in reimagining their neighbourhood in comparison with the nearby Hindu neighbourhoods. Drawing on the literature on everyday theology, faith, and belief, the paper reads various interpretations of theological knowledge across denominations to probe the prominence of bishwas within the Christian community and its role in differentiating their place of dwelling. The paper argues that the meaning of the neighbourhood is revisited through prayers, sermons, and biblical verses. At the same time, divisions and fissures are seen between Protestants and Catholics, and also between native Bengali Protestants and non-native Protestant pastors, which tells us about the complexity of theology in constituting everyday life. The paper thus addresses theology's role in creating an ethical Christian neighbourhood amidst the everyday tensions and hostilities of diverse religious persuasions, while also looking into the processes through which multiple theological knowledges lead to schism and interdenominational hostilities.
In attempting to answer these questions, the paper brings out the Christians' negotiation with their neighbourhood.

Keywords: anthropology, bishwas, christianity, neighbourhood, theology

Procedia PDF Downloads 86
568 Characterization of Chest Pain in Patients Consulting to the Emergency Department of a Health Institution High Level of Complexity during 2014-2015, Medellin, Colombia

Authors: Jorge Iván Bañol-Betancur, Lina María Martínez-Sánchez, María de los Ángeles Rodríguez-Gázquez, Estefanía Bahamonde-Olaya, Ana María Gutiérrez-Tamayo, Laura Isabel Jaramillo-Jaramillo, Camilo Ruiz-Mejía, Natalia Morales-Quintero

Abstract:

Acute chest pain is a distressing sensation between the diaphragm and the base of the neck, and it represents a diagnostic challenge for any physician in the emergency department. Objective: To establish the main clinical and epidemiological characteristics of patients who presented with chest pain to the emergency department of a private clinic in the city of Medellin during 2014-2015. Methods: Cross-sectional retrospective observational study. The population and sample were patients who consulted for chest pain in the emergency department and met the eligibility criteria. The information was analyzed in SPSS v21; qualitative variables were described through relative frequencies, and quantitative variables through means and standard deviations or medians, according to their distribution in the study population. Results: A total of 231 patients were evaluated; the mean age was 49.5 ± 19.9 years, and 56.7% were female. The most frequent pathological antecedents were hypertension (35.5%), diabetes (10.8%), dyslipidemia (10.4%), and coronary disease (5.2%). Regarding pain features, in 40.3% of the patients the pain began abruptly, in 38.2% it had a precordial location, in 20% of cases physical activity acted as a trigger, and in 60.6% it was oppressive. Costochondritis was the most common cause of chest pain among patients with an established etiologic diagnosis, representing 18.2%. Conclusions: Although the reported clinical features of the pain coincide with the clinical presentation of an acute coronary syndrome, the most common cause of chest pain in the study population was instead costochondritis, indicating that it is a differential diagnosis in the approach to patients with acute chest pain.

Keywords: acute coronary syndrome, chest pain, epidemiology, osteochondritis

Procedia PDF Downloads 339
567 Comparison of Different Artificial Intelligence-Based Protein Secondary Structure Prediction Methods

Authors: Jamerson Felipe Pereira Lima, Jeane Cecília Bezerra de Melo

Abstract:

The difficulty and cost of obtaining protein tertiary structure information through experimental methods, such as X-ray crystallography or NMR spectroscopy, have driven the development of computational alternatives. One such approach is the prediction of the three-dimensional structure from the residue chain; however, this has been proved an NP-hard problem, owing to the complexity of the process, as illustrated by the Levinthal paradox. An alternative is the prediction of intermediary structures, such as the secondary structure of the protein. Artificial intelligence methods, including Bayesian statistics, artificial neural networks (ANN), and support vector machines (SVM), among others, have been used to predict protein secondary structure. Owing to their good results, artificial neural networks have become a standard method for this task. Recently published methods that use this technique generally achieve a Q3 accuracy between 75% and 83%, whereas the theoretical accuracy limit for protein secondary structure prediction is 88%. To achieve better results, prediction methods based on support vector machines have also been developed. The statistical evaluation of methods that use different AI techniques, such as ANNs and SVMs, is not a trivial problem, since different training sets, validation techniques, and other variables can influence the behavior of a prediction method. In this study, we propose a prediction method based on artificial neural networks, which is then compared with a selected SVM method: the one proposed by Huang in his work 'Extracting Physicochemical Features to Predict Protein Secondary Structure' (2013).
The developed ANN method follows the same training and testing process that Huang used to validate his method, comprising the use of the CB513 protein dataset and three-fold cross-validation, so that the comparative analysis can directly compare the statistical results of each method.
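
Two of the ingredients mentioned, the Q3 score and three-fold cross-validation, are easy to state precisely. The sequences and the modulo-based fold scheme below are illustrative, not the CB513 protocol itself:

```python
# Q3 accuracy: fraction of residues whose predicted state (H=helix, E=strand,
# C=coil) matches the observed one - the figure of merit quoted above.
def q3(predicted, observed):
    assert len(predicted) == len(observed)
    hits = sum(p == o for p, o in zip(predicted, observed))
    return hits / len(observed)

observed  = "CCHHHHHHCCEEEECCHHHC"   # toy 20-residue assignment
predicted = "CCHHHHHCCCEEEECCHHCC"
print(f"Q3 = {q3(predicted, observed):.0%}")

# Three-fold cross-validation: split the protein set into three folds,
# train on two, evaluate Q3 on the third, and average over folds.
def three_fold_indices(n):
    folds = [list(range(i, n, 3)) for i in range(3)]
    for i in range(3):
        test = folds[i]
        train = [j for k in range(3) if k != i for j in folds[k]]
        yield train, test

for fold, (train, test) in enumerate(three_fold_indices(9)):
    print(f"fold {fold}: train={train} test={test}")
```

Holding the fold assignments fixed across both methods, as the comparison above requires, ensures the ANN and the SVM are scored on exactly the same test proteins.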

Keywords: artificial neural networks, protein secondary structure, protein structure prediction, support vector machines

Procedia PDF Downloads 619
566 Comparison of Support Vector Machines and Artificial Neural Network Classifiers in Characterizing Threatened Tree Species Using Eight Bands of WorldView-2 Imagery in Dukuduku Landscape, South Africa

Authors: Galal Omer, Onisimo Mutanga, Elfatih M. Abdel-Rahman, Elhadi Adam

Abstract:

Threatened tree species (TTS) play a significant role in ecosystem functioning and services, land use dynamics, and other socio-economic aspects, including ecological, economic, livelihood, security-based, and well-being benefits. The development of techniques for mapping and monitoring TTS is thus critical for understanding the functioning of ecosystems. The advent of advanced imaging systems and supervised learning algorithms has provided an opportunity to classify TTS over a fragmenting landscape. Recently, vegetation maps have been produced using advanced imaging systems such as WorldView-2 (WV-2) and robust classification algorithms such as support vector machines (SVM) and artificial neural networks (ANN). However, the delineation of TTS in a fragmenting landscape using high-resolution imagery has largely remained elusive due to the complexity of the species structure and distribution. The objective of the current study was therefore to examine the utility of advanced WV-2 data for mapping TTS in the fragmenting Dukuduku indigenous forest of South Africa using the SVM and ANN classification algorithms. The results showed the robustness of the two machine learning algorithms, with an overall accuracy (OA) of 77.00% (total disagreement = 23.00%) for SVM and 75.00% (total disagreement = 25.00%) for ANN using all eight bands of WV-2 (8B). This study concludes that the SVM and ANN classification algorithms with WV-2 8B have the potential to classify TTS in the Dukuduku indigenous forest, and offers relatively accurate information that is important for forest managers making informed decisions regarding management and conservation protocols for TTS.
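
The OA and total-disagreement figures quoted are two halves of a simple decomposition of the classification confusion matrix. The matrix below is hypothetical, chosen only so that the numbers reproduce the 77%/23% split reported for the SVM:

```python
# Overall accuracy = correctly classified pixels / all pixels;
# total disagreement is simply its complement.
def overall_accuracy(confusion):
    correct = sum(confusion[i][i] for i in range(len(confusion)))
    total = sum(sum(row) for row in confusion)
    return correct / total

# Hypothetical 3-class confusion matrix (rows = reference, cols = mapped):
cm = [[40,  6,  4],
      [ 5, 35, 10],
      [ 3,  7, 40]]

oa = overall_accuracy(cm)
print(f"OA = {oa:.0%}, total disagreement = {1 - oa:.0%}")
```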

Keywords: artificial neural network, threatened tree species, indigenous forest, support vector machines

Procedia PDF Downloads 514