Search results for: architecture complexity
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 3252

1032 Development of Partial Discharge Defect Recognition and Status Diagnosis System with Adaptive Deep Learning

Authors: Chien-kuo Chang, Bo-wei Wu, Yi-yun Tang, Min-chiu Wu

Abstract:

This paper proposes a power equipment diagnosis system based on partial discharge (PD) measurement, characterized by improved readability of experimental data and convenience of operation. The system integrates a variety of analysis programs with different data formats and programming languages and establishes a set of interfaces that the structure can follow and extend, which also helps subsequent maintenance and innovation. This study shows a case of integrating a developed Convolutional Neural Network (CNN) with the system, using the designed model architecture to simplify the complex training process. It is expected that the simplified training process can be used to establish an adaptive deep learning experimental structure. By selecting different test data for repeated training, the accuracy of the identification system can be enhanced. On this platform, the measurement status and partial discharge pattern of each piece of equipment can be checked in real time, real-time identification can be enabled, and various trained models can be used to carry out real-time partial discharge insulation defect identification and insulation state diagnosis. When a piece of power equipment enters a dangerous period, it can be replaced early to avoid unexpected electrical accidents.
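The convolution-plus-activation step at the core of such a CNN can be illustrated with a minimal, framework-free sketch; the phase-resolved PD amplitude sequence and kernel values below are purely illustrative, not the paper's model.

```python
# A minimal sketch of the convolution building block of a CNN-based
# partial-discharge pattern classifier. Pure Python for clarity; a real
# system would use a deep-learning framework, and the kernel values here
# are illustrative assumptions.

def conv1d(signal, kernel):
    """Valid-mode 1D convolution (cross-correlation, as in most DL frameworks)."""
    k = len(kernel)
    return [sum(signal[i + j] * kernel[j] for j in range(k))
            for i in range(len(signal) - k + 1)]

def relu(xs):
    """Rectified linear activation applied element-wise."""
    return [max(0.0, x) for x in xs]

# A toy phase-resolved PD amplitude sequence and an edge-detecting kernel.
pd_sequence = [0.0, 0.1, 0.9, 0.8, 0.1, 0.0]
kernel = [-1.0, 0.0, 1.0]

feature_map = relu(conv1d(pd_sequence, kernel))
```

Stacking several such convolution and activation layers, followed by a classifier head, yields the kind of model the platform trains repeatedly on different test data.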

Keywords: partial discharge, convolutional neural network, partial discharge analysis platform, adaptive deep learning

Procedia PDF Downloads 65
1031 Platform Integration for High-Throughput Functional Screening Applications

Authors: Karolis Leonavičius, Dalius Kučiauskas, Dangiras Lukošius, Arnoldas Jasiūnas, Kostas Zdanys, Rokas Stanislovas, Emilis Gegevičius, Žana Kapustina, Juozas Nainys

Abstract:

Screening throughput is a common bottleneck in many research areas, including functional genomics, drug discovery, and directed evolution. High-throughput screening techniques can be classified into two main categories: (i) affinity-based screening and (ii) functional screening. The first one relies on binding assays that provide information about the affinity of a test molecule for a target binding site. Binding assays are relatively easy to establish; however, they reveal no functional activity. In contrast, functional assays show an effect triggered by the interaction of a ligand at a target binding site. Functional assays might be based on a broad range of readouts, such as cell proliferation, reporter gene expression, downstream signaling, and other effects that are a consequence of ligand binding. Screening of large cell or gene libraries based on direct activity rather than binding affinity is now a preferred strategy in many areas of research as functional assays more closely resemble the context where entities of interest are anticipated to act. Droplet sorting is the basis of high-throughput functional biological screening, yet its applicability is limited due to the technical complexity of integrating high-performance droplet analysis and manipulation systems. As a solution, the Droplet Genomics Styx platform enables custom droplet sorting workflows, which are necessary for the development of early-stage or complex biological therapeutics or industrially important biocatalysts. The poster will focus on the technical design considerations of Styx in the context of its application spectra.

Keywords: functional screening, droplet microfluidics, droplet sorting, dielectrophoresis

Procedia PDF Downloads 118
1030 Increased Reaction and Movement Times When Text Messaging during Simulated Driving

Authors: Adriana M. Duquette, Derek P. Bornath

Abstract:

Reaction Time (RT) and Movement Time (MT) are important components of everyday life that affect the way in which we move about our environment. These measures become even more crucial when an event can be caused (or avoided) in a fraction of a second, such as the RT and MT required while driving. The purpose of this study was to develop a simpler method of testing RT and MT during simulated driving with or without text messaging, in a university-aged population (n = 170). In the control condition, a randomly-delayed red light stimulus flashed on a computer interface after the participant began pressing the ‘gas’ pedal on a foot switch mat. Simple RT was defined as the time between the presentation of the light stimulus and the initiation of lifting the foot from the switch mat ‘gas’ pedal, while MT was defined as the time from the initiation of lifting the foot to the initiation of depressing the switch mat ‘brake’ pedal. In the texting condition, upon pressing the ‘gas’ pedal, a ‘text message’ appeared on the computer interface in a dialog box, which the participant typed on their cell phone while waiting for the light stimulus to turn red. In both conditions, the sequence was repeated 10 times, and average RT (seconds) and average MT (seconds) were recorded. Condition significantly impacted overall RTs (p < .001), as the texting condition (0.47 s) took longer than the no-texting (control) condition (0.34 s). Longer MTs were also recorded in the texting condition (0.28 s) than in the control condition (0.23 s), p = .001. The overall increase in Response Time (RT + MT) of 189 ms during the texting condition would equate to an additional 4.2 meters (to react to the stimulus and begin braking) if the participant had been driving an automobile at 80 km per hour.
In conclusion, increased task complexity due to the dual-task demand of text messaging during simulated driving caused significant increases in RT (41%), MT (23%), and Response Time (34%), further strengthening the mounting evidence against text messaging while driving.
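The extra-distance figure follows directly from the reported 189 ms response-time increase at 80 km/h; a quick sketch of the arithmetic:

```python
# Worked check of the abstract's distance figure: a 189 ms increase in
# total response time (RT + MT) at a driving speed of 80 km/h.

speed_kmh = 80
speed_ms = speed_kmh * 1000 / 3600   # convert to metres per second (~22.2 m/s)
extra_response_s = 0.189             # additional response time while texting

# Distance travelled during the extra response time, before braking begins.
extra_distance = speed_ms * extra_response_s
print(round(extra_distance, 1))      # ~4.2 m
```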

Keywords: simulated driving, text messaging, reaction time, movement time

Procedia PDF Downloads 516
1029 Low-Complex, High-Fidelity Two-Grades Cyclo-Olefin Copolymer (COC) Based Thermal Bonding Technique for Sealing a Thermoplastic Microfluidic Biosensor

Authors: Jorge Prada, Christina Cordes, Carsten Harms, Walter Lang

Abstract:

The development of microfluidic-based biosensors over the last years has shown an increasing use of thermoplastic polymers as constitutive material. Their low-cost production, high replication fidelity, biocompatibility, and optical-mechanical properties are sought after for the implementation of disposable yet functional lab-on-chip solutions. Among the range of thermoplastic materials in use, the Cyclo-Olefin Copolymer (COC) stands out due to its optical transparency, which makes it a frequent choice of manufacturing material for fluorescence-based biosensors. Moreover, several processing techniques for completing a closed COC microfluidic biosensor have been discussed in the literature. The reported techniques differ, however, in their implementation, and therefore add more or less complexity when used in a mass production process. This work introduces and reports results on the application of a purely thermal bonding process between COC substrates, which were produced by hot embossing, and COC foils containing screen-printed circuits. The proposed procedure takes advantage of the transition temperature difference between foils of two COC grades to accomplish the sealing of the microfluidic channels. Patterned heat injection to the COC foil through the COC substrate is applied, resulting in consistent channel geometry uniformity. Measurements of bond strength and bursting pressure are shown, suggesting that this purely thermal bonding process renders a technique which can be easily adapted into the thermoplastic microfluidic chip production workflow, while enabling a low-cost as well as high-quality COC biosensor manufacturing process.

Keywords: biosensor, cyclo-olefin copolymer, hot embossing, thermal bonding, thermoplastics

Procedia PDF Downloads 231
1028 Maintaining Experimental Consistency in Geomechanical Studies of Methane Hydrate Bearing Soils

Authors: Lior Rake, Shmulik Pinkert

Abstract:

Methane hydrate has been found in significant quantities in soils offshore within continental margins and in permafrost within arctic regions where low temperature and high pressure are present. The mechanical parameters for geotechnical engineering are commonly evaluated in geomechanical laboratories adapted to simulate the environmental conditions of methane hydrate-bearing sediments (MHBS). Due to the complexity and high cost of natural MHBS sampling, most laboratory investigations are conducted on artificially formed samples. MHBS artificial samples can be formed using different hydrate formation methods in the laboratory, where methane gas and water are supplied into the soil pore space under the methane hydrate phase conditions. The most commonly used formation method is the excess gas method which is considered a relatively simple, time-saving, and repeatable testing method. However, there are several differences in the procedures and techniques used to produce the hydrate using the excess gas method. As a result of the difference between the test facilities and the experimental approaches that were carried out in previous studies, different measurement criteria and analyses were proposed for MHBS geomechanics. The lack of uniformity among the various experimental investigations may adversely impact the reliability of integrating different data sets for unified mechanical model development. In this work, we address some fundamental aspects relevant to reliable MHBS geomechanical investigations, such as hydrate homogeneity in the sample, the hydrate formation duration criterion, the hydrate-saturation evaluation method, and the effect of temperature measurement accuracy. Finally, a set of recommendations for repeatable and reliable MHBS formation will be suggested for future standardization of MHBS geomechanical investigation.

Keywords: experimental study, laboratory investigation, excess gas, hydrate formation, standardization, methane hydrate-bearing sediment

Procedia PDF Downloads 46
1027 A Cloud-Based Mobile Auditing Tools for Muslim-Friendly Hospitality Services

Authors: Mohd Iskandar Illyas Tan, Zuhra Junaida Mohamad Husny, Farawahida Mohd Yusof

Abstract:

The potential of Muslim-friendly hospitality services brings huge opportunities to operators (hoteliers, tourist guides, and travel agents), especially among the Muslim countries. In order to provide guidelines that facilitate operations among these operators, standards and manuals have been developed by the authorities. Among the challenges are the applicability and complexity of the standard to be adopted in the real world. Mobile digital technology can be implemented to overcome those challenges. A prototype has been developed to help operators and authorities assess their readiness in complying with MS2610:2015. This study analyzes the mobile digital technology characteristics that are suitable for users conducting a shariah-compliant hospitality audit. A focus group study was conducted in the state of Penang, Malaysia, involving operators (hoteliers, tourist guides, and travel agents) as well as agencies (Islamic Tourism Center, Penang Islamic Affairs Department, Malaysian Standard) directly involved in the implementation of the certification. Both groups were given three weeks to test the mobile application and provide feedback on its usability for auditing their readiness towards the Muslim-friendly hospitality services standard developed by the Malaysian Standard. The feedback was analyzed, and the overall results show that three criteria (ease of use, completeness, and speed of completion) received the highest responses from both groups for the mobile application. This study provides evidence that mobile application development has huge potential to be implemented by Muslim-friendly hospitality service operators and agencies.

Keywords: hospitality, innovation, audit, compliance, mobile application

Procedia PDF Downloads 124
1026 Evolution of Approaches to Cost Calculation in the Conditions of the Modern Russian Economy

Authors: Elena Tkachenko, Vladimir Kokh, Alina Osipenko, Vladislav Surkov

Abstract:

The modern period of development of Russian economy is fraught with a number of problems related to limitations in the use of traditional planning and financial management tools. Restrictions in the use of foreign software when performing an order of the Russian Government, on the one hand, and sanctions limiting the support of the major ERP and MRP II systems in the Russian Federation, on the other hand, entail the necessity to appeal to the basics of developing budgeting and analysis systems for industrial enterprises. Thus, cost calculation theory becomes the theoretical foundation for the development of industrial cost management systems. Based on the foregoing, it would be fair to make an assumption that the development of a working managerial accounting model on an industrial enterprise using an automated enterprise resource management system should rest upon the concept of the inevitability of alterations of business processes. On the other hand, optimized business processes make the architecture of financial analytics more transparent and permit the use of all the benefits of data cubes. The metrics and indicator slices provide online assessment of the state of key business processes at a given moment of time, which improves the quality of managerial decisions considerably. Therefore, the bilateral sanctions situation boosted the development of corporate business analytics and took industrial companies to the next level of understanding of business processes.

Keywords: cost calculation, ERP, OLAP, modern Russian economy

Procedia PDF Downloads 209
1025 Video Object Segmentation for Automatic Image Annotation of Ethernet Connectors with Environment Mapping and 3D Projection

Authors: Marrone Silverio Melo Dantas, Pedro Henrique Dreyer, Gabriel Fonseca Reis de Souza, Daniel Bezerra, Ricardo Souza, Silvia Lins, Judith Kelner, Djamel Fawzi Hadj Sadok

Abstract:

The creation of a dataset is time-consuming and often discourages researchers from pursuing their goals. To overcome this problem, we present and discuss two solutions adopted for the automation of this process. Both optimize valuable user time and resources and support video object segmentation with object tracking and 3D projection. In our scenario, we acquire images from a moving robotic arm and, for each approach, generate distinct annotated datasets. We evaluated the precision of the annotations by comparing these with a manually annotated dataset, as well as their efficiency in the context of detection and classification problems. For detection support, we used YOLO and obtained, for the projection dataset, F1-Score, accuracy, and mAP values of 0.846, 0.924, and 0.875, respectively. Concerning the tracking dataset, we achieved an F1-Score of 0.861 and an accuracy of 0.932, whereas mAP reached 0.894. In order to evaluate the quality of the annotated images used for classification problems, we employed deep learning architectures. We adopted the metrics accuracy and F1-Score for VGG, DenseNet, MobileNet, Inception, and ResNet. The VGG architecture outperformed the others for both the projection and tracking datasets. It reached an accuracy and F1-Score of 0.997 and 0.993, respectively. Similarly, for the tracking dataset, it achieved an accuracy of 0.991 and an F1-Score of 0.981.
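For readers unfamiliar with how the reported F1-Scores relate to raw detection counts, a minimal sketch follows; the true/false positive and negative counts are hypothetical, not the paper's data.

```python
# A minimal sketch of deriving precision, recall, and F1-Score from raw
# detection counts. The counts below are illustrative assumptions only.

def f1_score(precision, recall):
    """Harmonic mean of precision and recall."""
    return 2 * precision * recall / (precision + recall)

tp, fp, fn = 846, 120, 160          # hypothetical detection counts
precision = tp / (tp + fp)          # fraction of detections that are correct
recall = tp / (tp + fn)             # fraction of ground-truth objects found

print(round(f1_score(precision, recall), 3))
```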

Keywords: RJ45, automatic annotation, object tracking, 3D projection

Procedia PDF Downloads 153
1024 Service Interactions Coordination Using a Declarative Approach: Focuses on Deontic Rule from Semantics of Business Vocabulary and Rules Models

Authors: Nurulhuda A. Manaf, Nor Najihah Zainal Abidin, Nur Amalina Jamaludin

Abstract:

Coordinating service interactions is a vital part of developing distributed applications that are built up as networks of autonomous participants, e.g., software components, web services, and online resources, and involve collaboration between a diverse number of participant services on different providers. The complexity of coordinating service interactions reflects how important techniques and approaches are for designing and coordinating the interaction between participant services to ensure the overall goal of a collaboration is achieved. The objective of this research is to develop the capability to steer a complex service interaction towards a desired outcome. Therefore, an efficient technique for modelling, generating, and verifying the coordination of service interactions is developed. The developed model describes service interactions using the service choreographies approach and focuses on a declarative approach, advocating an Object Management Group (OMG) standard, Semantics of Business Vocabulary and Rules (SBVR). This model, namely the SBVR model for service choreographies, focuses on declarative deontic rules expressing both obligation and prohibition, which can be more useful in coordinating service interactions. The generated SBVR model is then formulated and transformed into an Alloy model, and the Alloy Analyzer is used to verify it. The transformation of SBVR into Alloy makes it possible to automatically generate the corresponding coordination of service interactions (service choreography), hence producing an immediate instance of execution that satisfies the constraints of the specification, and to verify whether a specific request can be realised in the generated choreography.
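To make the obligation/prohibition distinction concrete, here is a minimal sketch of checking deontic rules against a service-interaction trace; the rule and trace shapes are illustrative assumptions, and the paper itself works with SBVR models verified through Alloy, not Python.

```python
# A minimal sketch of deontic-rule checking over a service-interaction
# trace: obligations must occur, prohibitions must not. Rule and trace
# contents are illustrative, not from the paper's SBVR models.

rules = [
    ("obligation",  "payment_service",  "send_receipt"),
    ("prohibition", "shipping_service", "ship_before_payment"),
]

def violates(trace, rules):
    """Return the rules violated by a trace of (participant, action) events."""
    events = set(trace)
    broken = []
    for modality, participant, action in rules:
        happened = (participant, action) in events
        if modality == "obligation" and not happened:
            broken.append((modality, participant, action))
        if modality == "prohibition" and happened:
            broken.append((modality, participant, action))
    return broken

trace = [("payment_service", "charge_card"),
         ("shipping_service", "ship_before_payment")]
print(violates(trace, rules))   # both rules are violated by this trace
```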

Keywords: service choreography, service coordination, behavioural modelling, complex interactions, declarative specification, verification, model transformation, semantics of business vocabulary and rules, SBVR

Procedia PDF Downloads 137
1023 Design and Implementation of Agricultural Machinery Equipment Scheduling Platform Based On Case-Based Reasoning

Authors: Wen Li, Zhengyu Bai, Qi Zhang

Abstract:

The demand for smart scheduling platforms in agriculture, particularly for the scheduling of machinery equipment, is high. With the continuous development of agricultural machinery technology, a large number of agricultural machinery equipment and agricultural machinery cooperative service organizations continue to appear in China. The large area of cultivated land and the large number of agricultural activities in the central and western regions of China have made the demand for smart and efficient agricultural machinery scheduling platforms more intense. In this study, we design and implement a platform for agricultural machinery equipment scheduling to allocate agricultural machinery resources reasonably. Taking the agricultural machinery equipment scheduling platform as the research object, we discuss its research significance and value, use service blueprint technology to analyze and characterize the agricultural machinery scheduling workflow, use the analytic network process (ANP) to obtain the platform's functional requirements, and divide the platform functions through a function division diagram. Simultaneously, based on the case-based reasoning (CBR) algorithm, the equipment scheduling module of the platform is realized; finally, a design scheme of the platform architecture is provided, and the visualization interface of the platform is established via the VB programming language. This work provides design ideas and theoretical support for the construction of a modern agricultural equipment information scheduling platform.
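The retrieval step of a CBR scheduler can be sketched as a weighted nearest-case lookup; the feature names, weights, and cases below are illustrative assumptions, not the platform's actual schema (which is implemented in VB).

```python
# A minimal sketch of case-based reasoning (CBR) retrieval: find the most
# similar past scheduling case and reuse its plan. Feature names, weights,
# and case data are hypothetical.

def similarity(query, case, weights):
    """Weighted similarity over numeric features normalized to [0, 1]."""
    score = sum(w * (1 - abs(query[f] - case[f])) for f, w in weights.items())
    return score / sum(weights.values())

cases = [
    {"area": 0.8, "distance": 0.2, "urgency": 0.9, "plan": "harvester_team_A"},
    {"area": 0.3, "distance": 0.7, "urgency": 0.4, "plan": "harvester_team_B"},
]
weights = {"area": 0.5, "distance": 0.2, "urgency": 0.3}

# A new scheduling request; the most similar stored case supplies the plan.
query = {"area": 0.7, "distance": 0.3, "urgency": 0.8}
best = max(cases, key=lambda c: similarity(query, c, weights))
print(best["plan"])
```

In a full CBR cycle, the retrieved plan would then be adapted to the new request and retained as a fresh case.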

Keywords: case-based reasoning, service blueprint, system design, ANP, VB programming language

Procedia PDF Downloads 161
1022 A Comparative Analysis of Carbon Footprints of Households in Different Housing Types and Seasons

Authors: Taehyun Kim

Abstract:

As a result of rapid urbanization, energy demands for lighting, heating, and cooling of households have been concentrated in metropolitan areas. The energy resources for housing in urban areas are dominantly fossil fuels, whose use contributes to an increased cost of living and carbon dioxide (CO2) emissions. To achieve environmentally and economically sustainable residential development, it is important to know how energy use and the cost of living can be reduced through planning and design. The purpose of this study is to examine which type of building requires less energy for housing. To do so, a carbon footprint (CF) quiz survey was employed, which estimates the amount of carbon dioxide required to support a household's energy use for housing. The housing carbon footprints (HCF) of 500 households in Seoul, Korea, in summer and winter were estimated and compared across three major types of housing: single-family (detached), row house, and apartment. In addition, differences in HCF were estimated between tower and flat types of apartment. The results of t-tests and analysis of variance (ANOVA) provide statistical evidence that housing type is related to housing energy use. The average HCF of detached houses was higher than that of the other housing types. Between the two types of apartment, the tower type shows a higher HCF than the flat type in winter. These findings may provide new perspectives on CF application in sustainable architecture and urban design.
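The one-way ANOVA underlying the housing-type comparison reduces to an F-statistic of between-group to within-group variance; the sketch below uses hypothetical HCF values, not the study's measurements.

```python
# A minimal sketch of the one-way ANOVA F-statistic used to compare mean
# housing carbon footprints (HCF) across housing types. Data values are
# illustrative, not the study's.

def f_statistic(groups):
    """One-way ANOVA F = mean square between groups / mean square within."""
    all_vals = [x for g in groups for x in g]
    grand_mean = sum(all_vals) / len(all_vals)
    k, n = len(groups), len(all_vals)
    ss_between = sum(len(g) * (sum(g) / len(g) - grand_mean) ** 2
                     for g in groups)
    ss_within = sum((x - sum(g) / len(g)) ** 2 for g in groups for x in g)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

detached  = [5.1, 5.4, 5.0, 5.6]   # hypothetical HCF values per housing type
row_house = [4.2, 4.0, 4.4, 4.1]
apartment = [3.8, 3.9, 3.6, 4.0]

F = f_statistic([detached, row_house, apartment])
```

A large F relative to the F-distribution's critical value (for k-1 and n-k degrees of freedom) indicates that mean HCF genuinely differs across housing types.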

Keywords: analysis of variance, carbon footprint, energy use, housing type

Procedia PDF Downloads 490
1021 The Effect of Principled Human Resource Management and Training Based on Existing Standards in Order to Improve the Quality of Construction Projects

Authors: Arsalan Salahi

Abstract:

Today, the number of changes in the construction industry and urban mass housing is increasing, which calls for more attention to targeted planning for human resource management and training. The human resources working in the construction industry face various problems and deficiencies, and in order to solve these problems, there is a need for sound management and training of these people in order to lower construction costs and increase the quality of projects, especially mass housing projects. The success of any project in reaching short- and long-term professional goals depends on the efficient combination of work tools, financial resources, raw materials, and, most importantly, human resources. Today, due to the complexity and diversity of each project, specialized management fields have emerged to maximize the potential benefits of each component of that project. Human resources are known as the most important resource for the successful implementation of construction projects, but unfortunately, due to the low cost of labor compared to other resources, such as materials and machinery, little attention is paid to them. With the correct management and training of human resources, which depends on correct planning and development, it is possible to improve the performance of construction projects. In this article, the training and motivation of construction industry workers and their effects on the effectiveness of projects in this industry have been researched. In this regard, some barriers to the training and motivation of construction workers and personnel have been identified, and solutions have been provided for construction companies. The impact of unskilled workers on the efficiency of construction projects is also investigated.
The results of the above research show that by increasing the use of correct and basic training for human resources, we will see positive results and effects on the performance of construction projects.

Keywords: human resources, construction industry, principled training, skilled and unskilled workers

Procedia PDF Downloads 75
1020 Ultrathin NaA Zeolite Membrane in Solvent Recovery: Preparation and Application

Authors: Eng Toon Saw, Kun Liang Ang, Wei He, Xuecheng Dong, Seeram Ramakrishna

Abstract:

Solvent recovery processes have received utmost attention in recent years due to the scarcity of natural resources and consciousness of the circular economy in chemical and pharmaceutical manufacturing. Solvent dehydration is one of the important processes for recovering and purifying solvent for reuse. The complexity of the solvent waste or wastewater effluent produced in the pharmaceutical industry makes the wastewater treatment process complicated; thus, an alternative solution is to recover the valuable solvent from the solvent waste. To treat solvent waste and upgrade solvent purity, membrane pervaporation is shown to be a promising technology owing to its energy efficiency and low footprint. A ceramic membrane is adopted as the solvent dehydration membrane owing to its chemical and thermal stability compared to polymeric membranes. NaA zeolite membranes are generally used for solvent dehydration because of their narrow and distinct pore size and high hydrophilicity. NaA zeolite membranes have mainly been applied to alcohol dehydration in fermentation processes. At this stage, the membrane performance exhibits a high separation factor with low flux using tubular ceramic membranes. Thus, defect-free and ultrathin NaA membranes should be developed to increase water flux. Herein, we report a simple preparation protocol for an ultrathin NaA zeolite membrane supported on a tubular ceramic membrane, achieved by controlling the synthesized seed size, the seeding methods and conditions, the selection of the ceramic substrate surface pore size, and the secondary growth conditions. The microstructure and morphology of the NaA zeolite membrane will be examined and reported. Moreover, the membrane separation performance and stability will also be reported for isopropanol dehydration, ketone dehydration, and ester dehydration, particularly for applications in the pharmaceutical industry.

Keywords: ceramic membrane, NaA zeolite, pharmaceutical industry, solvent recovery

Procedia PDF Downloads 234
1019 Clinical and Sleep Features in an Australian Population Diagnosed with Mild Cognitive Impairment

Authors: Sadie Khorramnia, Asha Bonney, Kate Galloway, Andrew Kyoong

Abstract:

Sleep plays a pivotal role in the registration and consolidation of memory. Multiple observational studies have demonstrated that self-reported sleep duration and sleep quality are associated with cognitive performance. The Montreal Cognitive Assessment (MoCA) questionnaire is a screening tool for mild cognitive impairment (MCI) with a 90% diagnostic sensitivity. In the current study, we used the MoCA to identify MCI in patients who underwent a sleep study in our sleep department. We then looked at the clinical risk factors and sleep-related parameters in subjects found to have mild cognitive impairment but without a diagnosis of sleep-disordered breathing. Clinical risk factors, including physician-diagnosed hypertension, diabetes, and depression, and sleep-related parameters measured during the sleep study, including the percentage of time in each sleep stage, total sleep time, awakenings, sleep efficiency, apnoea-hypopnoea index, and oxygen saturation, were evaluated. A total of 90 subjects who underwent a sleep study between March 2019 and October 2019 were included. Currently, there is no pharmacotherapy available for MCI; therefore, identifying the risk factors and attempting to reverse or mitigate their effect is pivotal in slowing down the rate of cognitive deterioration. Further characterization of sleep parameters in this group of patients could open up opportunities for potentially beneficial interventions.
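Two of the sleep parameters named above, sleep efficiency and the apnoea-hypopnoea index, are simple ratios; a minimal sketch with illustrative values (not data from this study):

```python
# A minimal sketch of two sleep-study parameters mentioned in the abstract.
# The input values are illustrative assumptions only.

def sleep_efficiency(total_sleep_min, time_in_bed_min):
    """Percentage of time in bed actually spent asleep."""
    return 100 * total_sleep_min / time_in_bed_min

def ahi(apnoeas, hypopnoeas, total_sleep_min):
    """Apnoea-hypopnoea index: respiratory events per hour of sleep."""
    return (apnoeas + hypopnoeas) / (total_sleep_min / 60)

eff = sleep_efficiency(380, 450)   # e.g. 380 min asleep of 450 min in bed
index = ahi(12, 30, 380)           # e.g. 12 apnoeas + 30 hypopnoeas
```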

Keywords: apnoea hypopnea index, mild cognitive impairment, sleep architecture, sleep study

Procedia PDF Downloads 134
1018 Digital Design and Fabrication: A Review of Trend and Its Impact in the African Context

Authors: Mohamed Al Araby, Amany Salman, Mostafa Amin, Mohamed Madbully, Dalia Keraa, Mariam Ali, Marah Abdelfatah, Mariam Ahmed, Ahmed Hassab

Abstract:

In recent years, the architecture, engineering, and construction (A.E.C.) industry has been exposed to important innovations, most notably the global integration of digital design and fabrication (D.D.F.) processes into the industry's workflow. Despite this evolution in the sector, Africa has been excluded from examinations of this development. The reason behind this exclusion is the preconceived view of Africa as a developing region that still employs traditional methods of construction. The primary objective of this review is to investigate the trend of digital construction (D.C.) in the African context and the difficulties in its regular utilization. This objective can be attained by recognizing the notion of digital construction in Africa and evaluating the impact of the projects deploying this technology on both their immediate and broader contexts. The paper's methodology begins with the collection of data from 224 initiatives throughout Africa. Then, 50 of these projects were selected based on the criteria of recency, typology variety, and location diversity. After that, a literature-based comparative analysis was undertaken. This study's findings reveal a pattern of motivation for applying digital fabrication processes. Moreover, it is essential to evaluate the socio-economic effects of these projects on the population living near the analyzed subject. The last step in this study is identifying the influence on neighboring nations.

Keywords: Africa, digital construction, digital design, fabrication

Procedia PDF Downloads 151
1017 Water Re-Use Optimization in a Sugar Platform Biorefinery Using Municipal Solid Waste

Authors: Leo Paul Vaurs, Sonia Heaven, Charles Banks

Abstract:

Municipal solid waste (MSW) is a virtually unlimited source of lignocellulosic material in the form of a waste paper/cardboard mixture, which can be converted into fermentable sugars via cellulolytic enzyme hydrolysis in a biorefinery. The extraction of the lignocellulosic fraction and its preparation, however, are energy- and water-demanding processes. The wastewater generated is a rich organic liquor with a high Chemical Oxygen Demand that can be partially cleaned while generating biogas in an Upflow Anaerobic Sludge Blanket bioreactor and then further re-used in the process. In this work, an experiment was designed to determine the critical contaminant concentrations in water affecting either anaerobic digestion or enzymatic hydrolysis by simulating multiple water re-circulations. It was found that re-using the same water more than 16.5 times could decrease the hydrolysis yield by up to 65% and lead to complete granule disaggregation. Due to the complexity of the water stream, the contaminant(s) responsible for the performance decrease could not be identified, but the causes were suspected to be sodium, potassium, and lipid accumulation for the anaerobic digestion (AD) process, and heavy-metal build-up for enzymatic hydrolysis. The experimental data were incorporated into a Water Pinch technology based model that was used to optimize water re-utilization in the modelled system, reducing the fresh water requirement and wastewater generation while ensuring all processes performed at an optimal level. Multiple scenarios were modelled in which sub-process requirements were evaluated in terms of importance, operational costs, and impact on the CAPEX. The best compromise between water usage, AD yield, and enzymatic hydrolysis yield was determined for each assumed contaminant degradation by anaerobic granules. Results from the model will be used to build the first MSW-based biorefinery in the USA.

Keywords: anaerobic digestion, enzymatic hydrolysis, municipal solid waste, water optimization

Procedia PDF Downloads 308
1016 Artificial Intelligence Approach to Water Treatment Processes: Case Study of Daspoort Treatment Plant, South Africa

Authors: Olumuyiwa Ojo, Masengo Ilunga

Abstract:

Artificial neural networks (ANNs) have broken the bounds of conventional programming, which is essentially a garbage-in, garbage-out function, through their ability to mimic the human brain. Their ability to adopt, adapt, adjust, evaluate, learn and recognize the relationships, behaviour and patterns in a series of data sets administered to them is modelled on human reasoning and learning mechanisms. The study therefore aimed at modelling the wastewater treatment process in order to accurately diagnose water control problems for effective treatment. A staged ANN model development and evaluation methodology was employed: the source-data analysis stage involved a statistical analysis of the data used for modelling, and in the model development stage candidate ANN architectures were developed and then evaluated against a historical data set. The model was developed using historical data obtained from the Daspoort Wastewater Treatment Plant, South Africa. The resultant design dimensions and model for the wastewater treatment plant provided good results. Parameters considered were temperature, pH value, colour, turbidity, amount of solids and acidity, together with total hardness, Ca hardness, Mg hardness and chloride. This enables the ANN to handle and represent more complex problems that conventional programming is incapable of performing.
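As a minimal illustration of the ANN idea, the sketch below trains a single logistic neuron on synthetic data using two of the listed inputs (pH deviation and turbidity). The plant's actual data, network architecture and decision thresholds are not published, so the labelling rule and all numbers here are invented stand-ins.

```python
import math, random

random.seed(0)

def make_sample():
    # Synthetic water-quality reading: |pH - 7| and turbidity (NTU).
    ph_dev = random.uniform(0.0, 3.0)
    turbidity = random.uniform(0.0, 10.0)
    # Invented rule: flag a treatment problem at high pH deviation or turbidity.
    label = 1 if (ph_dev > 1.5 or turbidity > 6.0) else 0
    return (ph_dev, turbidity), label

data = [make_sample() for _ in range(400)]

# Single logistic neuron trained by stochastic gradient descent.
w, b, lr = [0.0, 0.0], 0.0, 0.1
for _ in range(300):
    for (x1, x2), y in data:
        z = max(-30.0, min(30.0, w[0] * x1 + w[1] * x2 + b))  # clamp for exp()
        p = 1.0 / (1.0 + math.exp(-z))       # sigmoid activation
        err = p - y                          # gradient of the log-loss
        w[0] -= lr * err * x1
        w[1] -= lr * err * x2
        b -= lr * err

def predict(x1, x2):
    return 1 if (w[0] * x1 + w[1] * x2 + b) > 0 else 0

accuracy = sum(predict(*x) == y for x, y in data) / len(data)
```

A real treatment-plant model would use all ten listed parameters and at least one hidden layer; the single neuron here only shows the learn-from-examples mechanism the abstract contrasts with conventional programming.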

Keywords: ANN, artificial neural network, wastewater treatment, model, development

Procedia PDF Downloads 140
1015 An Analysis on Clustering Based Gene Selection and Classification for Gene Expression Data

Authors: K. Sathishkumar, V. Thiagarasu

Abstract:

Due to recent advances in DNA microarray technology, it is now feasible to obtain gene expression profiles of tissue samples at relatively low cost. Many scientists around the world take advantage of this gene profiling to characterize complex biological circumstances and diseases. Microarray techniques used in genome-wide gene expression and genome mutation analysis help scientists and physicians understand pathophysiological mechanisms, make diagnoses and prognoses, and choose treatment plans. DNA microarray technology has now made it possible to simultaneously monitor the expression levels of thousands of genes during important biological processes and across collections of related samples. Elucidating the patterns hidden in gene expression data offers a tremendous opportunity for an enhanced understanding of functional genomics. However, the large number of genes and the complexity of biological networks greatly increase the challenges of comprehending and interpreting the resulting mass of data, which often consists of millions of measurements. A first step toward addressing this challenge is the use of clustering techniques, which are essential in the data mining process for revealing natural structures and identifying interesting patterns in the underlying data. This work presents an analysis of several clustering algorithms proposed to deal with gene expression data effectively. Existing algorithms such as the Support Vector Machine (SVM), the k-means algorithm and evolutionary algorithms are analyzed thoroughly to identify their advantages and limitations. A performance evaluation of the existing algorithms is carried out to determine the best approach. In order to improve the classification performance of the best approach in terms of accuracy, convergence behaviour and processing time, a hybrid clustering-based optimization approach is proposed.
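A minimal sketch of the k-means step the survey discusses, run on synthetic two-condition expression profiles. Real microarray data is far higher-dimensional; all values here are synthetic, and the centroids are seeded deterministically with two distant profiles for brevity rather than via a scheme like k-means++.

```python
import random

random.seed(1)

def dist2(a, b):
    """Squared Euclidean distance between two expression profiles."""
    return sum((x - y) ** 2 for x, y in zip(a, b))

def kmeans(points, centroids, iters=20):
    centroids = list(centroids)
    clusters = []
    for _ in range(iters):
        # Assignment step: each profile joins its nearest centroid.
        clusters = [[] for _ in centroids]
        for p in points:
            i = min(range(len(centroids)), key=lambda j: dist2(p, centroids[j]))
            clusters[i].append(p)
        # Update step: recompute each centroid as its cluster mean.
        for i, c in enumerate(clusters):
            if c:
                centroids[i] = tuple(sum(col) / len(c) for col in zip(*c))
    return centroids, clusters

# Synthetic profiles: 20 "up-regulated" and 20 "down-regulated" genes.
up = [(random.gauss(5, 0.3), random.gauss(5, 0.3)) for _ in range(20)]
down = [(random.gauss(-5, 0.3), random.gauss(-5, 0.3)) for _ in range(20)]
genes = up + down
centroids, clusters = kmeans(genes, [genes[0], genes[-1]])
```

The hybrid approach the abstract proposes would replace the fixed seeding and iteration count with an optimization loop; this block only shows the basic assignment/update cycle being optimized.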

Keywords: microarray technology, gene expression data, clustering, gene selection

Procedia PDF Downloads 313
1014 Schedule Risk Management for Complex Projects: The Royal Research Ship: Sir David Attenborough Case Study

Authors: Chatelier Charlene, Oyegoke Adekunle, Ajayi Saheed, Jeffries Andrew

Abstract:

This study seeks to understand schedule risk assessment as a precursor to better performance while exploring the strategies employed to deliver complex projects such as the new polar research ship. This high-profile vessel was delivered to the Natural Environment Research Council and the British Antarctic Survey (BAS) by Cammell Laird Shipbuilders. The research ship was designed to support science in extreme environments and is expected to provide a wide range of specialist scientific facilities, instruments and laboratories for research across multiple disciplines. Aim: The focus is to understand the allocation and management of schedule risk on such a major project, hypothesising that effective management of schedule risk could be the most critical factor in determining whether the intended benefits are delivered within time and cost constraints. Objective 1: Firstly, the study seeks to understand the allocation and management of schedule risk in major projects. Objective 2: Secondly, it explores effective management of schedule risk as the most critical factor determining the delivery of intended benefits. Methodology: This study takes a retrospective review of schedule risk management and how it influenced project performance, using a case study approach on the RRS (Royal Research Ship) Sir David Attenborough. Research Contribution: The outcomes of this study will contribute to a better understanding of project performance while building on its under-researched relationship with schedule risk management for complex projects. The outcomes of this paper will guide further research on project performance and enable an understanding of how risk-based estimates over time impact the overall risk management of the project.
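Schedule risk assessments of the kind discussed are commonly run as Monte Carlo simulations over three-point duration estimates. The sketch below uses invented work packages and durations, not the vessel's actual schedule data, to show how risk-based estimates translate into a contingency allowance.

```python
import random

random.seed(42)

# Invented three-point estimates (optimistic, most likely, pessimistic), in weeks.
activities = [
    (10, 14, 22),   # e.g. block construction
    (6, 8, 14),     # e.g. outfitting
    (4, 5, 9),      # e.g. trials and handover
]

def simulate_totals(n_runs=10_000):
    """Sample each activity from a triangular distribution and sum the chain."""
    totals = [sum(random.triangular(lo, hi, mode) for lo, mode, hi in activities)
              for _ in range(n_runs)]
    return sorted(totals)

totals = simulate_totals()
p50 = totals[len(totals) // 2]          # median schedule outcome
p80 = totals[int(len(totals) * 0.8)]    # a commonly used contingency level
risk_allowance = p80 - p50              # buffer a risk-based plan would carry
```

Note that `random.triangular` takes its arguments as (low, high, mode). In practice the activity network would also contain parallel paths and correlated risks, which this sequential sum deliberately ignores.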

Keywords: complexity, major projects, performance management, schedule risk management, uncertainty

Procedia PDF Downloads 84
1013 Ectopic Pregnancy: A Case of Consecutive Occurrences of Different Types

Authors: Wania Mohammad Akram, Swetha Kannan, Urooj Shahid, Aisha Sajjad

Abstract:

Ovarian ectopic pregnancy, a rare manifestation of ectopic gestation, involves the implantation of a fertilized egg on the ovarian surface. This condition poses diagnostic challenges and is associated with significant maternal morbidity if not promptly managed. This report presents the case of a 33-year-old nulliparous woman with a history of polycystic ovary syndrome (PCOS) undergoing ovulation induction therapy. Following her first conception in October 2021, she presented with symptoms of per vaginal spotting and low back pain, prompting a diagnosis of left adnexal ectopic pregnancy confirmed by transvaginal ultrasound and serum beta-human chorionic gonadotropin (B-HCG) levels. Medical management with methotrexate was initiated successfully. In August 2022, the patient conceived again, with subsequent ultrasound revealing a large pelvic collection suggestive of a complex ectopic pregnancy involving both ovaries. Despite initial stability, she developed abdominal pain necessitating emergency laparoscopy, which revealed an ovarian ectopic pregnancy with hemoperitoneum. Laparotomy was performed due to the complexity of the presentation, and histopathology confirmed viable chorionic villi within ovarian tissue. This case underscores the clinical management challenges posed by ovarian ectopic pregnancies, particularly in patients with previous ectopic pregnancies. The discussion reviews current literature on diagnostic modalities, treatment strategies, and outcomes associated with ovarian ectopic pregnancies, emphasizing the role of surgical intervention in cases refractory to conservative management. Tailored approaches considering individual patient factors are crucial to optimize outcomes and preserve fertility in such complex scenarios.

Keywords: obgyn, ovarian ectopic pregnancy, laparoscopy, pcos

Procedia PDF Downloads 15
1012 Examining the Relationship between Concussion and Neurodegenerative Disorders: A Review on Amyotrophic Lateral Sclerosis and Alzheimer’s Disease

Authors: Edward Poluyi, Eghosa Morgan, Charles Poluyi, Chibuikem Ikwuegbuenyi, Grace Imaguezegie

Abstract:

Background: Current epidemiological studies have examined the associations between moderate and severe traumatic brain injury (TBI) and the risk of developing neurodegenerative diseases. Concussion, also known as mild TBI (mTBI), is however quite distinct from moderate or severe TBI. Only a few studies in this burgeoning area have examined concussion, especially repetitive episodes, in relation to neurodegenerative diseases; thus, no definite relationship has been established between them. Objectives: This review discusses the available literature linking concussion with amyotrophic lateral sclerosis (ALS) and Alzheimer's disease (AD). Materials and Methods: Given the complexity of this subject, a realist review methodology was selected, which includes clarifying the scope and developing a theoretical framework, developing a search strategy, selection and appraisal, data extraction, and synthesis. A detailed literature matrix was set out in order to capture relevant and recent findings on this topic. Results: At present, there is no objective clinical test for the diagnosis of concussion because the features are less obvious on physical examination. The absence of an objective diagnostic test sometimes leads to skepticism when confirming the presence or absence of concussion. Intriguingly, several explanations have been proposed for the pathological mechanisms linking concussion to the development of some neurodegenerative disorders (such as ALS and AD), but the two major events are the deposition of tau proteins (abnormal microtubule-associated proteins) and neuroinflammation, which ranges from glutamate excitotoxicity pathways and inflammatory pathways (which raise the metabolic demands of microglial cells and neurons) to impaired mitochondrial function via oxidative pathways.

Keywords: amyotrophic lateral sclerosis, Alzheimer's disease, mild traumatic brain injury, neurodegeneration

Procedia PDF Downloads 79
1011 Radiation Risks for Nurses: The Unrecognized Consequences of ERCP Procedures

Authors: Ava Zarif Sanayei, Sedigheh Sina

Abstract:

Despite the advancement of radiation-free interventions in the gastrointestinal and hepatobiliary fields, endoscopy and endoscopic retrograde cholangiopancreatography (ERCP) remain indispensable procedures that necessitate radiation exposure. ERCP, in particular, relies heavily on radiation-guided imaging to ensure precise delivery of therapy. Meanwhile, interventional radiology (IR) procedures also utilize imaging modalities such as X-rays and CT scans to guide therapy, often under local anesthesia via small needle insertion. However, the complexity of these procedures raises concerns about radiation exposure to healthcare professionals, including nurses, who play a crucial role in these interventions. This study aims to assess the radiation exposure to the hands and fingers of two nurses directly involved in ERCP procedures, using thermoluminescent (TLD-100) dosimeters, at the Gastrointestinal Endoscopy department of a clinic in Shiraz, Iran. The dosimeters were initially calibrated using various phantoms; a set was then prepared and worn over a two-month period. For personal equivalent dose measurement, two TLD chips were mounted on a finger ring to monitor exposure of the hands and fingers. Upon completion of the monitoring period, the TLDs were analyzed using a TLD reader, showing that Nurse 1 received an equivalent dose of 298.26 µSv and Nurse 2 an equivalent dose of 195.39 µSv. The investigation revealed that the total radiation exposure to the nurses did not exceed the annual limit for occupational exposure. Nevertheless, it is essential to prioritize radiation protection measures to prevent potential harm. The study showed that the positioning of the staff, with the two nurses standing at similar locations, contributed to broadly comparable doses. To reduce exposure further, we suggest providing education and training on radiation safety principles, particularly for technologists.
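A simple linear annualisation of the reported two-month readings puts both nurses far below the extremity dose limit. The 500 mSv/year hand limit used below is the ICRP occupational recommendation for extremities; the clinic's applicable national limit may differ, and linear extrapolation assumes a constant workload over the year.

```python
# Linear extrapolation of the two-month TLD readings to a full year.
EXTREMITY_LIMIT_USV = 500_000.0   # 500 mSv/year for hands (ICRP recommendation)

def annualised_dose_usv(measured_usv, monitoring_months=2):
    """Scale a short monitoring-period dose to an annual equivalent."""
    return measured_usv * (12.0 / monitoring_months)

nurse1_year = annualised_dose_usv(298.26)   # ~1.79 mSv/year
nurse2_year = annualised_dose_usv(195.39)   # ~1.17 mSv/year
margin = EXTREMITY_LIMIT_USV / nurse1_year  # how far below the limit Nurse 1 sits
```

Even the higher reading annualises to well under 1% of the extremity limit, which is consistent with the study's conclusion while still supporting its call for routine protection measures.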

Keywords: dose measurement, ERCP, interventional radiology, medical imaging

Procedia PDF Downloads 18
1010 A Compact Via-less Ultra-Wideband Microstrip Filter by Utilizing Open-Circuit Quarter Wavelength Stubs

Authors: Muhammad Yasir Wadood, Fatemeh Babaeian

Abstract:

With the development of ultra-wideband (UWB) systems, there is high demand for UWB filters with low insertion loss and wide bandwidth that have a planar structure compatible with the other components of a UWB system. A microstrip interdigital filter is a great option for designing UWB filters. However, the presence of via holes in this structure creates difficulties in the fabrication of the filter. Especially in the higher frequency band, any misalignment of a drilled via hole with the microstrip stubs causes large discrepancies between the measured and desired results. Moreover, in high-frequency designs the line widths of the stubs are very narrow, so highly precise small via holes are required, which increases the cost of fabrication significantly and carries a risk of fabrication errors. To combat this issue, this paper proposes a via-less UWB microstrip filter designed as a modification of a conventional interdigital bandpass filter. The novel approaches in this filter design are 1) replacement of each via hole with a quarter-wavelength open-circuit stub to avoid manufacturing complexity, 2) use of a bend structure to reduce unwanted coupling effects and 3) minimisation of the size. Using the proposed structure, a UWB filter operating in the frequency band of 3.9-6.6 GHz (1-dB bandwidth) was designed and fabricated. The promising simulation and measurement results are presented in this paper. The selected substrate for these designs was Rogers RO4003 with a thickness of 20 mils, a common substrate in industrial projects. The compact size of the proposed filter is highly beneficial for applications that require very miniature hardware.
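The replacement stubs are a quarter of the guided wavelength long, which fixes their approximate physical length once the substrate is known. The effective permittivity below is an assumption for a 20-mil RO4003 microstrip line (RO4003 has a relative permittivity of about 3.38); the paper does not quote one, so the result is only indicative.

```python
import math

C = 299_792_458.0   # speed of light in vacuum, m/s

def quarter_wave_stub_mm(f_hz, eps_eff):
    """Quarter of the guided wavelength: c / (4 * f * sqrt(eps_eff)), in mm."""
    return C / (4.0 * f_hz * math.sqrt(eps_eff)) * 1e3

# Band centre of the reported 3.9-6.6 GHz passband.
f_centre = (3.9e9 + 6.6e9) / 2.0
stub_len = quarter_wave_stub_mm(f_centre, eps_eff=2.6)   # roughly 9 mm
```

A stub of roughly 9 mm with sub-millimetre width makes clear why drilling and aligning a via of comparable scale is the dominant fabrication risk the via-less design removes.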

Keywords: band-pass filters, inter-digital filter, microstrip, via-less

Procedia PDF Downloads 144
1009 Improving Fake News Detection Using K-means and Support Vector Machine Approaches

Authors: Kasra Majbouri Yazdi, Adel Majbouri Yazdi, Saeid Khodayi, Jingyu Hou, Wanlei Zhou, Saeed Saedy

Abstract:

Fake news and false information are major challenges for all types of media, especially social media. There is a lot of false information, fake likes, fake views and duplicated accounts, as big social networks such as Facebook and Twitter have admitted. Much of the information appearing on social media is doubtful and in some cases misleading, and it needs to be detected as soon as possible to avoid a negative impact on society. The dimensions of fake news datasets are growing rapidly, so to detect false information with less computation time and complexity, the dimensionality needs to be reduced. One of the best techniques for reducing data size is feature selection, whose aim is to choose a feature subset from the original set that improves classification performance. In this paper, a feature selection method is proposed based on the integration of k-means clustering and Support Vector Machine (SVM) approaches, which works in four steps. First, the similarities between all features are calculated. Then, features are divided into several clusters. Next, the final feature set is selected from all clusters, and finally, fake news is classified based on the final feature subset using the SVM method. The proposed method was evaluated by comparing its performance with other state-of-the-art methods on several benchmark datasets, and the outcome showed better classification of false information for our work. The detection performance was improved in two respects: the detection runtime decreased, and the classification accuracy increased because of the elimination of redundant features and the reduction of dataset dimensions.
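The first three steps (feature similarity, feature clustering, representative selection) can be sketched on synthetic data as below. The final SVM step is not reproduced here; a library SVM (such as scikit-learn's `SVC`) would then be trained on the reduced feature set. The similarity measure, clustering rule and threshold are all invented stand-ins for the paper's unpublished choices.

```python
import random

random.seed(3)

def pearson(a, b):
    """Pearson correlation between two feature columns."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    sa = sum((x - ma) ** 2 for x in a) ** 0.5
    sb = sum((y - mb) ** 2 for y in b) ** 0.5
    return cov / (sa * sb) if sa and sb else 0.0

# Six features over 50 samples; features 0-2 are near-duplicates of one
# signal and features 3-5 of another, mimicking redundant features.
base1 = [random.gauss(0, 1) for _ in range(50)]
base2 = [random.gauss(0, 1) for _ in range(50)]
features = [[v + random.gauss(0, 0.05) for v in base1] for _ in range(3)] + \
           [[v + random.gauss(0, 0.05) for v in base2] for _ in range(3)]

# Greedy stand-in for the clustering step: a feature joins the first
# cluster whose seed it correlates with strongly, else starts a new one.
clusters = []
for i, f in enumerate(features):
    for c in clusters:
        if abs(pearson(f, features[c[0]])) > 0.9:
            c.append(i)
            break
    else:
        clusters.append([i])

selected = [c[0] for c in clusters]   # one representative per cluster
```

Keeping one representative per cluster is what shrinks the dataset's dimensionality before classification, which is where the runtime and accuracy gains the abstract reports come from.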

Keywords: clustering, fake news detection, feature selection, machine learning, social media, support vector machine

Procedia PDF Downloads 166
1008 Adaptive Certificate-Based Mutual Authentication Protocol for Mobile Grid Infrastructure

Authors: H. Parveen Begam, M. A. Maluk Mohamed

Abstract:

Mobile grid computing is an environment that allows sharing and coordinated use of diverse resources in dynamic, heterogeneous and distributed environments using different types of portable electronic devices. In a grid environment, security issues such as authentication, authorization, message protection and delegation are handled by the GSI (Grid Security Infrastructure). Providing strong security between mobile devices and the grid infrastructure is a major issue because of the open nature of wireless networks and the heterogeneous, distributed environment. In a mobile grid environment, individual computing devices may be resource-limited in isolation, but as an aggregate they have the potential to play a vital role. Some adaptive methodology or solution is needed to address issues such as authentication of a base station, security of information flowing between a mobile user and a base station, prevention of attacks within a base station, hand-over of authentication information, the communication cost of establishing a session key between mobile user and base station, and the computational complexity of achieving authenticity and security. The sharing of device resources can be achieved only through trusted relationships between the mobile hosts (MHs); before accessing a grid service, mobile devices must prove themselves authentic. This paper proposes a dynamic certificate-based mutual authentication protocol between two mobile hosts in a mobile grid environment. The certificate generation process is carried out by a CA (Certificate Authority) for all authenticated MHs. Security (through the validity period of the certificate) and dynamicity (through the transmission time) can be achieved with secure service certificates. The authentication protocol is built on communication services to provide cryptographically secured mechanisms for verifying the identity of users and resources.
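The shape of a mutual challenge-response handshake between two mobile hosts can be sketched as below. This is emphatically not the paper's protocol: a real certificate-based scheme would use CA-signed X.509 certificates and public-key signatures, whereas symmetric HMAC over a pre-shared key stands in here purely to keep the sketch self-contained.

```python
import hmac, hashlib, os

class MobileHost:
    """Toy mobile host that proves possession of a shared key."""

    def __init__(self, name, shared_key):
        self.name = name
        self.key = shared_key

    def challenge(self):
        self.nonce = os.urandom(16)          # fresh nonce per handshake
        return self.nonce

    def respond(self, peer_nonce):
        # Prove key possession by MACing the peer's nonce plus our identity.
        return hmac.new(self.key, peer_nonce + self.name.encode(),
                        hashlib.sha256).digest()

    def verify(self, response, peer_name):
        expected = hmac.new(self.key, self.nonce + peer_name.encode(),
                            hashlib.sha256).digest()
        return hmac.compare_digest(response, expected)

key = os.urandom(32)
a, b = MobileHost("MH-A", key), MobileHost("MH-B", key)

# Each side challenges the other: mutual, not one-way, authentication.
ok_ab = a.verify(b.respond(a.challenge()), "MH-B")
ok_ba = b.verify(a.respond(b.challenge()), "MH-A")
```

In the certificate-based version, `respond` would sign the nonce with the host's private key and `verify` would check the signature against the public key in a CA-issued certificate, which also carries the validity period the abstract relies on.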

Keywords: mobile grid computing, certificate authority (CA), SSL/TLS protocol, secured service certificates

Procedia PDF Downloads 295
1007 Compliance of Systematic Reviews in Ophthalmology with the PRISMA Statement

Authors: Seon-Young Lee, Harkiran Sagoo, Reem Farwana, Katharine Whitehurst, Alex Fowler, Riaz Agha

Abstract:

Background/Aims: Systematic reviews and meta-analyses are becoming an increasingly important way of summarizing research evidence. Research in ophthalmology may present further challenges due to its potential complexity in study design. The aim of our study was to determine the reporting quality of systematic reviews and meta-analyses in ophthalmology against the PRISMA statement, by assessing articles published between 2010 and 2015 in five major journals with the highest impact factors. Methods: MEDLINE and EMBASE were used to search for systematic reviews published between January 2010 and December 2015 in five major ophthalmology journals: Progress in Retinal and Eye Research, Ophthalmology, Archives of Ophthalmology, American Journal of Ophthalmology and Journal of the American Optometric Association. Screening, identification and scoring of articles were performed independently by two teams, followed by statistical analysis including the median, range and 95% CIs. Results: 115 articles were included. The median PRISMA score was 15 of 27 items (56%), with a range of 5-26 (19-96%) and a 95% CI of 13.9-16.1 (51-60%). Compliance was highest for items related to the description of the rationale (item 3, 100%) and the inclusion of a structured summary in the abstract (item 2, 90%), and poorest for indication of a review protocol and registration (item 5, 9%), specification of risks of bias affecting the cumulative evidence (item 15, 24%) and description of clear objectives in the introduction (item 4, 26%). Conclusion: The reporting quality of systematic reviews and meta-analyses in ophthalmology needs significant improvement. While the use of the PRISMA criteria as a guideline before journal submission is recommended, additional research identifying potential barriers may be required to improve compliance with the PRISMA guidelines.
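The summary-statistics step reduces to computing a median, a compliance percentage against the 27 PRISMA items, and a range. The per-article scores below are invented, since the study does not publish its raw data; they are chosen only so the arithmetic mirrors the reported summary.

```python
import statistics

# Invented per-article PRISMA scores (out of 27 checklist items).
scores = [5, 9, 12, 13, 14, 15, 15, 16, 17, 18, 20, 23, 26]

median_score = statistics.median(scores)
compliance_pct = round(100 * median_score / 27)   # median compliance, percent
score_range = (min(scores), max(scores))
```

The reported 95% CI on the median would additionally require a bootstrap or an order-statistic interval over the full set of 115 scores, which this fragment omits.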

Keywords: systematic reviews, meta-analysis, research methodology, reporting quality, PRISMA, ophthalmology

Procedia PDF Downloads 255
1006 The Regional Novel in India: Its Emergence and Trajectory

Authors: Aruna Bommareddi

Abstract:

The journey of the novel is well examined in Indian academia as an offshoot of the novel in English. There have been many attempts to understand aspects of the early novel in India which shared a commonality with the English novel. The regional novel has had an entirely different trajectory, which is mapped in this paper. The main focus is the historical emergence of the genre of the regional novel in Indian literatures, with specific reference to Kannada, Hindi and Bengali. The selection of these languages is guided not only by familiarity with them but also by the significance they enjoy in the subcontinent and by their importance for the emergence of the regional novel as a distinct category. The regional novels under study are Phaneeswaranath Renu's Maila Anchal, Tarashankar Bandopadhyaya's Ganadevata and Kuvempu's House of Kanuru, chosen to explore the themes surrounding the genre's emergence as well as the aspects these regional novels share and those in which they differ. The paper also explores the various movements that shaped the genre of the regional novel in these literatures. Though Renu's Maila Anchal was published in 1956, the novel is set in pre-Independence India and therefore shares a commonality of themes with the other two novels, House of Kanuru and Ganadevata. All three novels explore themes of superstition, ignorance and poverty, and the interventions of educated youth to salvage the crises in these backward regional worlds. In fact, it was Renu who assertively declared that he was going to write a regional novel; hence the title of the first regional novel in Hindi is Maila Anchal, meaning "the soiled border". In Hindi, anchal also means "region"; the title is therefore suggestive of a soiled region as well. The novel exposes the squalor, ignorance and conflict-ridden life of the village, or region, as opposed to the rosy image of the village in literature.
With this, all novels depicting the conflicts of a region came to be recognized as regional novels, even those written prior to Renu's declaration. All three novels under study succeed in bringing out the complexity of rural life at a given point in its history.

Keywords: bengali, hindi, kannada, regional novel, telugu

Procedia PDF Downloads 72
1005 Strategic Leadership and Sustainable Project Management in Enugu, Nigeria

Authors: Nnadi Ezekiel Ejiofor

Abstract:

In Enugu, Nigeria, this study investigates the connection between strategic leadership and project management sustainability, with an emphasis on building projects in the state. The study set out to accomplish two specific goals: first, to establish a link between creative project management and resource efficiency in construction projects in Enugu State, Nigeria; and second, to establish a link between innovative thinking and waste minimization in those same projects. A structured questionnaire was used to collect primary data from 45 registered construction enterprises in the study area as part of the study's descriptive research approach. Due to the nonparametric nature of the data, Spearman rank-order correlation was used to evaluate the acquired data. The findings demonstrate that creative project management had a significant positive impact on resource efficiency in construction projects carried out by architecture firms in Enugu State, Nigeria (r = .849; p < .001), and that innovative thinking had a significant impact on waste reduction in those same projects (r = .849; p < .001). It was determined that strategic leadership had a significant impact on the sustainability of project management, and it was thus advised that project managers should foresee, prepare for, and effectively communicate present and future developments to project staff in order to ensure that the objective of sustainable initiatives, such as recycling and reuse, is implemented in construction projects.
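The Spearman rank-order correlation used above can be computed with the standard no-ties shortcut, sketched below on invented ratings (not the survey data, which is unpublished). The shortcut formula is only exact when no ranks are tied; real questionnaire data with ties needs the tie-corrected form or a library routine such as `scipy.stats.spearmanr`.

```python
def spearman_rho(x, y):
    """Spearman rank-order correlation, no-ties shortcut:
    rho = 1 - 6 * sum(d_i^2) / (n * (n^2 - 1))."""
    def ranks(v):
        order = sorted(range(len(v)), key=lambda i: v[i])
        r = [0] * len(v)
        for rank, i in enumerate(order, start=1):
            r[i] = rank
        return r

    rx, ry = ranks(x), ranks(y)
    n = len(x)
    d2 = sum((a - b) ** 2 for a, b in zip(rx, ry))
    return 1 - 6 * d2 / (n * (n ** 2 - 1))

# Invented firm-level scores: creative project management vs resource
# efficiency for 8 hypothetical firms (distinct values, so no ties).
cpm = [3, 5, 2, 8, 1, 7, 4, 6]
eff = [2, 5, 3, 8, 1, 7, 4, 6]
rho = spearman_rho(cpm, eff)
```

Because Spearman works on ranks rather than raw values, it needs no normality assumption, which is why it suits the nonparametric questionnaire data the study describes.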

Keywords: construction, project management, strategic leadership, sustainability, waste reduction

Procedia PDF Downloads 39
1004 Climatic and Environmental Factors Affecting Human Comfort Evaluation: Case Study of Shiraz Iran

Authors: Hamid Yazdani, Fatemeh Abbasi

Abstract:

Understanding natural potentials forms the basis for human activities, environmental planning, and land use. Regional characteristics and the spatial distribution of the dominant climatic elements play a determining role in shaping human behaviour and the environment, and human bioclimatic studies now underpin urban planning, settlement design, architecture, tourism, and related fields. In this study, thermal comfort, or its absence, in Shiraz was examined using the Baker and Terjung models together with the effective temperature index and the Tourism Climate Index (TCI), based on 39 years of data, and the best times for environmental and tourism activities in the city were identified. The results showed that the indices used can distinguish periods of comfort and discomfort in Shiraz and, despite minor differences, portray a relatively homogeneous comfort climate for the city. The analysis revealed considerable variation over the year: conditions fall well outside the bioclimatic comfort zone during the cold of winter and the heat of summer, while in the transitional seasons, from cold to warm in spring (April) and from warm to cold in autumn (November), the climate of Shiraz approaches human comfort. Overall, spring offers the most favourable human comfort conditions and is the best season for environmental and tourism activities in Shiraz.
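The TCI referred to in the abstract is Mieczkowski's Tourism Climate Index. Each sub-index is first rated on a 0-5 scale from the raw climate data (that rating step, which depends on lookup tables, is omitted here): CId is daytime comfort, CIa is daily comfort, P is precipitation, S is sunshine and W is wind.

```python
# Mieczkowski's Tourism Climate Index from pre-rated sub-indices (0-5 each).
def tourism_climate_index(cid, cia, p, s, w):
    """TCI = 2 * (4*CId + CIa + 2*P + 2*S + W); the maximum is 100."""
    return 2 * (4 * cid + cia + 2 * p + 2 * s + w)

# Illustrative spring-like month with good but not perfect ratings.
april_like = tourism_climate_index(5, 4, 4, 4, 3)   # scores 86 of 100
```

The 4x weight on daytime comfort is what makes thermal comfort dominate the index, consistent with the abstract's finding that the transitional seasons score best in Shiraz.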

Keywords: bioclimatic comfort, Terjung, Baker, effective temperature, TCI

Procedia PDF Downloads 335
1003 A Novel Method for Face Detection

Authors: H. Abas Nejad, A. R. Teymoori

Abstract:

Facial expression recognition is one of the open problems in computer vision. Robust neutral face recognition in real time is a major challenge for various supervised-learning-based facial expression recognition methods. This is due to the fact that supervised methods cannot accommodate all appearance variability across faces with respect to race, pose, lighting, facial biases, etc., in a limited amount of training data. Moreover, processing each and every frame to classify emotions is not required, as the user stays neutral for the majority of the time in typical applications like video chat or photo album/web browsing. Detecting the neutral state at an early stage, and thereby bypassing those frames during emotion classification, saves computational power. In this work, we propose a light-weight neutral vs. emotion classification engine, which acts as a preprocessor to traditional supervised emotion classification approaches. It dynamically learns neutral appearance at Key Emotion (KE) points using a textural statistical model constructed from a set of reference neutral frames for each user. The proposed method is made robust to various types of user head motion by accounting for affine distortions based on the textural statistical model. Robustness to dynamic shifts of the KE points is achieved by evaluating similarities on a subset of neighborhood patches around each KE point, using prior information about the directionality of the specific facial action units acting on the respective KE point. The proposed method, as a result, improves emotion recognition (ER) accuracy and simultaneously reduces the computational complexity of the ER system, as validated on multiple databases.
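The "Local Binary Pattern Histogram" in the keywords refers to a texture descriptor whose core operation is sketched below for a single pixel. A full pipeline would compute a code for every pixel in a patch around each KE point and histogram the codes; note that this sketch uses a strict greater-than comparison (some formulations use greater-or-equal, which maps flat patches to 255 instead of 0).

```python
# Minimal 8-neighbour Local Binary Pattern for one interior pixel.
def lbp_code(img, r, c):
    """Each neighbour strictly brighter than the centre contributes one
    bit; neighbours are read clockwise starting at the top-left."""
    center = img[r][c]
    neighbours = [img[r-1][c-1], img[r-1][c], img[r-1][c+1],
                  img[r][c+1], img[r+1][c+1], img[r+1][c],
                  img[r+1][c-1], img[r][c-1]]
    code = 0
    for bit, n in enumerate(neighbours):
        if n > center:            # strict: a flat patch maps to code 0
            code |= 1 << bit
    return code

patch = [[10, 10, 10],
         [10, 10, 50],
         [10, 90, 10]]
code = lbp_code(patch, 1, 1)
```

Because the code depends only on intensity ordering, not absolute values, the resulting histograms are insensitive to monotonic lighting changes, which suits the per-user neutral-appearance model the abstract describes.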

Keywords: neutral vs. emotion classification, Constrained Local Model, procrustes analysis, Local Binary Pattern Histogram, statistical model

Procedia PDF Downloads 331