Search results for: diagnostic accuracy
755 Evaluation and Analysis of ZigBee-Based Wireless Sensor Network: Home Monitoring as Case Study
Authors: Omojokun G. Aju, Adedayo O. Sule
Abstract:
ZigBee wireless sensor and control networking is one of the most widely deployed wireless technologies of recent years. This is because ZigBee is an open-standard, lightweight, low-cost, low-speed, low-power protocol that allows true interoperability between systems. It is built on the existing IEEE 802.15.4 protocol and therefore combines the IEEE 802.15.4 features with newly added features to meet the required functionalities, thereby finding applications in a wide variety of wireless networked systems. ZigBee's current focus is on embedded applications of general-purpose, inexpensive, self-organising networks which require low to medium data rates, a high number of nodes and very low power consumption, such as home/industrial automation, embedded sensing, medical data collection, smart lighting, safety and security sensor networks, and monitoring systems. Although the ZigBee design specification includes security features to protect data communication confidentiality and integrity, security is normally traded off when simplicity and low cost are the goals. A great deal of research has been carried out on ZigBee technology, with emphasis placed mainly on ZigBee network performance characteristics such as energy efficiency, throughput, robustness, packet delay and delivery ratio in different scenarios and applications. This paper investigates and analyses the data accuracy, network implementation difficulties and security challenges of ZigBee network applications in star-based and mesh-based topologies, with emphasis on its home monitoring application, using ZigBee ProBee ZE-10 development boards for the network setup. The paper also exposes some factors that need to be considered when designing ZigBee network applications and suggests ways in which a ZigBee network can be designed to be more resilient to network attacks. Keywords: home monitoring, IEEE 802.15.4, topology, wireless security, wireless sensor network (WSN), ZigBee
Procedia PDF Downloads 381
754 A Damage Level Assessment Model for Extra High Voltage Transmission Towers
Authors: Huan-Chieh Chiu, Hung-Shuo Wu, Chien-Hao Wang, Yu-Cheng Yang, Ching-Ya Tseng, Joe-Air Jiang
Abstract:
Power failure resulting from tower collapse due to violent seismic events might bring enormous and inestimable losses. The Chi-Chi earthquake, for example, strongly struck Taiwan and caused huge damage to the power system on September 21, 1999. Nearly 10% of extra high voltage (EHV) transmission towers were damaged in the earthquake. Therefore, seismic hazards of EHV transmission towers should be monitored and evaluated. The ultimate goal of this study is to establish a damage level assessment model for EHV transmission towers. The data of earthquakes provided by Taiwan Central Weather Bureau serve as a reference and then lay the foundation for earthquake simulations and analyses afterward. Some parameters related to the damage level of each point of an EHV tower are simulated and analyzed by the data from monitoring stations once an earthquake occurs. Through the Fourier transform, the seismic wave is then analyzed and transformed into different wave frequencies, and the data would be shown through a response spectrum. With this method, the seismic frequency which damages EHV towers the most is clearly identified. An estimation model is built to determine the damage level caused by a future seismic event. Finally, instead of relying on visual observation done by inspectors, the proposed model can provide a power company with the damage information of a transmission tower. Using the model, manpower required by visual observation can be reduced, and the accuracy of the damage level estimation can be substantially improved. Such a model is greatly useful for health and construction monitoring because of the advantages of long-term evaluation of structural characteristics and long-term damage detection.Keywords: damage level monitoring, drift ratio, fragility curve, smart grid, transmission tower
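A minimal sketch of the frequency-domain step described above, transforming a ground-acceleration record with the FFT and reading off the dominant frequencies, is given below. The sampling rate, the synthetic two-component signal and the function name are illustrative assumptions, not values or code from the study.

```python
import numpy as np

def dominant_frequencies(acc, dt, top_k=3):
    """Return the top_k frequencies (Hz) with the largest amplitude in the
    spectrum of a ground-acceleration record sampled every dt seconds."""
    spectrum = np.abs(np.fft.rfft(acc))            # amplitude spectrum
    freqs = np.fft.rfftfreq(len(acc), d=dt)
    order = np.argsort(spectrum)[::-1]
    return freqs[order[:top_k]]

# Illustrative record: 1.5 Hz and 4 Hz components plus noise, sampled at 100 Hz (assumed values)
dt = 0.01
t = np.arange(0, 60, dt)
acc = 0.8 * np.sin(2 * np.pi * 1.5 * t) + 0.3 * np.sin(2 * np.pi * 4.0 * t)
acc += 0.05 * np.random.default_rng(0).standard_normal(t.size)

print(dominant_frequencies(acc, dt, top_k=2))      # expected to print ~1.5 Hz and ~4.0 Hz
```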
Procedia PDF Downloads 298
753 Alternating Expectation-Maximization Algorithm for a Bilinear Model in Isoform Quantification from RNA-Seq Data
Authors: Wenjiang Deng, Tian Mou, Yudi Pawitan, Trung Nghia Vu
Abstract:
Estimation of isoform-level gene expression from RNA-seq data depends on simplifying assumptions, such as uniform reads distribution, that are easily violated in real data. Such violations typically lead to biased estimates. Most existing methods provide a bias correction step(s), which is based on biological considerations, such as GC content–and applied in single samples separately. The main problem is that not all biases are known. For example, new technologies such as single-cell RNA-seq (scRNA-seq) may introduce new sources of bias not seen in bulk-cell data. This study introduces a method called XAEM based on a more flexible and robust statistical model. Existing methods are essentially based on a linear model Xβ, where the design matrix X is known and derived based on the simplifying assumptions. In contrast, XAEM considers Xβ as a bilinear model with both X and β unknown. Joint estimation of X and β is made possible by simultaneous analysis of multi-sample RNA-seq data. Compared to existing methods, XAEM automatically performs empirical correction of potentially unknown biases. XAEM implements an alternating expectation-maximization (AEM) algorithm, alternating between estimation of X and β. For speed XAEM utilizes quasi-mapping for read alignment, thus leading to a fast algorithm. Overall XAEM performs favorably compared to other recent advanced methods. For simulated datasets, XAEM obtains higher accuracy for multiple-isoform genes, particularly for paralogs. In a differential-expression analysis of a real scRNA-seq dataset, XAEM achieves substantially greater rediscovery rates in an independent validation set.Keywords: alternating EM algorithm, bias correction, bilinear model, gene expression, RNA-seq
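The core idea of XAEM, treating Xβ as bilinear with both factors unknown and alternating between updates, can be illustrated with a stripped-down alternating least-squares loop on simulated multi-sample data. This is only a schematic of the alternation; the real method works on read-count likelihoods with an EM formulation and quasi-mapping, and the matrix sizes and noise model below are assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated multi-sample data: Y (read classes x samples) generated from an unknown
# design X (read classes x isoforms) and unknown per-sample expression B (isoforms x samples).
n_classes, n_iso, n_samples = 30, 4, 20
X_true = rng.dirichlet(np.ones(n_classes), size=n_iso).T        # column-stochastic design
B_true = rng.gamma(2.0, 50.0, size=(n_iso, n_samples))          # isoform expression per sample
Y = X_true @ B_true + rng.normal(0, 1.0, size=(n_classes, n_samples))

# Alternating estimation: fix X, solve for B; fix B, update X; repeat.
X = np.full((n_classes, n_iso), 1.0 / n_classes)                # naive initial design
for it in range(200):
    B, *_ = np.linalg.lstsq(X, Y, rcond=None)                   # update expression
    B = np.clip(B, 0, None)
    Xt, *_ = np.linalg.lstsq(B.T, Y.T, rcond=None)              # update design
    X = np.clip(Xt.T, 1e-8, None)
    X /= X.sum(axis=0, keepdims=True)                           # keep columns on the simplex

print("relative residual:", np.linalg.norm(Y - X @ B) / np.linalg.norm(Y))
```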
Procedia PDF Downloads 140
752 Loss Function Optimization for CNN-Based Fingerprint Anti-Spoofing
Authors: Yehjune Heo
Abstract:
As biometric systems become widely deployed, identification systems can easily be attacked with various spoof materials. This paper contributes to finding a reliable and practical anti-spoofing method using Convolutional Neural Networks (CNNs), based on the choice of loss functions and optimizers. The CNNs used in this paper include AlexNet, VGGNet, and ResNet. By using various loss functions, including Cross-Entropy, Center Loss, Cosine Proximity, and Hinge Loss, and various optimizers, including Adam, SGD, RMSProp, Adadelta, Adagrad, and Nadam, we obtained significant performance changes. We find that choosing the correct loss function for each model is crucial, since different loss functions lead to different errors on the same evaluation. We validate our approach and compare generalization power using a subset of the LivDet 2017 database; the same subset is used across all training and testing for each model, so performance on unseen data can be compared across all models. The best CNN (AlexNet), with the appropriate loss function and optimizer, results in a performance gain of more than 3% over the other CNN models with the default loss function and optimizer. In addition to the highest generalization performance, this paper also reports model accuracy together with parameter counts and mean average error rates, in order to find the model that consumes the least memory and computation time for training and testing. Although AlexNet is less complex than the other CNN models, it proves to be very efficient. For practical anti-spoofing systems, the deployed version should use a small amount of memory and run very fast with high anti-spoofing performance. For our deployed version on smartphones, additional processing steps, such as quantization and pruning algorithms, have been applied to our final model. Keywords: anti-spoofing, CNN, fingerprint recognition, loss function, optimizer
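The loss-function and optimizer sweep described above amounts to retraining the same architecture while swapping the criterion and the optimizer. A hedged PyTorch-style sketch of that sweep follows; the tiny random tensors, the two-class setup and the small placeholder model stand in for the fingerprint data and the AlexNet/VGGNet/ResNet backbones, and only standard PyTorch losses are shown (Center Loss and Cosine Proximity would need custom implementations).

```python
import torch
import torch.nn as nn

def make_model():
    # Placeholder CNN standing in for AlexNet/VGGNet/ResNet in the sweep.
    return nn.Sequential(
        nn.Conv2d(1, 8, 3, padding=1), nn.ReLU(), nn.AdaptiveAvgPool2d(4),
        nn.Flatten(), nn.Linear(8 * 16, 2),
    )

losses = {"cross_entropy": nn.CrossEntropyLoss(), "hinge": nn.MultiMarginLoss()}
optimizers = {
    "adam": torch.optim.Adam, "sgd": torch.optim.SGD, "rmsprop": torch.optim.RMSprop,
    "adadelta": torch.optim.Adadelta, "adagrad": torch.optim.Adagrad, "nadam": torch.optim.NAdam,
}

x = torch.randn(64, 1, 32, 32)            # fake fingerprint patches (assumed size)
y = torch.randint(0, 2, (64,))            # live = 0, spoof = 1

for loss_name, criterion in losses.items():
    for opt_name, opt_cls in optimizers.items():
        model = make_model()
        opt = opt_cls(model.parameters(), lr=1e-3)
        for _ in range(5):                # a few steps only, for illustration
            opt.zero_grad()
            loss = criterion(model(x), y)
            loss.backward()
            opt.step()
        print(f"{loss_name:>13} + {opt_name:<8} final loss {loss.item():.3f}")
```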
Procedia PDF Downloads 136
751 The Utility of Sonographic Features of Lymph Nodes during EBUS-TBNA for Predicting Malignancy
Authors: Atefeh Abedini, Fatemeh Razavi, Mihan Pourabdollah Toutkaboni, Hossein Mehravaran, Arda Kiani
Abstract:
In countries with the highest prevalence of tuberculosis, such as Iran, the differentiation of malignant tumors from non-malignant is very important. In this study, which was conducted for the first time among the Iranian population, the utility of the ultrasonographic morphological characteristics in patients undergoing EBUS was used to distinguish the non-malignant versus malignant lymph nodes. The morphological characteristics of lymph nodes, which consist of size, shape, vascular pattern, echogenicity, margin, coagulation necrosis sign, calcification, and central hilar structure, were obtained during Endobronchial Ultrasound-Guided Trans-Bronchial Needle Aspiration and were compared with the final pathology results. During this study period, a total of 253 lymph nodes were evaluated in 93 cases. Round shape, non-hilar vascular pattern, heterogeneous echogenicity, hyperechogenicity, distinct margin, and the presence of necrosis sign were significantly higher in malignant nodes. On the other hand, the presence of calcification and also central hilar structure were significantly higher in the benign nodes (p-value ˂ 0.05). Multivariate logistic regression showed that size>1 cm, heterogeneous echogenicity, hyperechogenicity, the presence of necrosis signs and, the absence of central hilar structure are independent predictive factors for malignancy. The accuracy of each of the aforementioned factors is 42.29 %, 71.54 %, 71.90 %, 73.51 %, and 65.61 %, respectively. Of 74 malignant lymph nodes, 100% had at least one of these independent factors. According to our results, the morphological characteristics of lymph nodes based on Endobronchial Ultrasound-Guided Trans-Bronchial Needle Aspiration can play a role in the prediction of malignancy.Keywords: EBUS-TBNA, malignancy, nodal characteristics, pathology
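The multivariate logistic regression step, combining the independent sonographic predictors into a malignancy probability, could be sketched as follows. The synthetic feature table, the coefficients and the example node are placeholders rather than the study's data; only the modelling pattern (binary predictors in, malignancy probability and odds ratios out) reflects the abstract.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Binary predictors per lymph node (assumed encoding, 1 = feature present):
# size > 1 cm, heterogeneous echogenicity, hyperechogenicity, necrosis sign, absent central hilar structure
X = rng.integers(0, 2, size=(253, 5))
# Synthetic labels: malignancy more likely when several predictors are present
logit = X @ np.array([0.8, 1.1, 1.0, 1.4, 0.9]) - 2.0
y = rng.binomial(1, 1 / (1 + np.exp(-logit)))

model = LogisticRegression().fit(X, y)
print("odds ratios:", np.round(np.exp(model.coef_[0]), 2))

new_node = np.array([[1, 1, 0, 1, 1]])     # a node showing four of the five predictors
print("predicted malignancy probability:", round(model.predict_proba(new_node)[0, 1], 2))
```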
Procedia PDF Downloads 134
750 The Use of Unmanned Aerial System (UAS) in Improving the Measurement System on the Example of Textile Heaps
Authors: Arkadiusz Zurek
Abstract:
The potential of drones is visible in many areas of logistics, especially for monitoring and controlling processes. Technologies implemented in the last decade open new possibilities for companies in areas they had previously not even considered, such as warehouse inventories. Unmanned aerial vehicles are no longer seen as a revolutionary tool for Industry 4.0, but rather as tools in the daily work of factories and logistics operators. The research problem is to develop a method for measuring, by drone, the weight of goods in a selected link of the clothing supply chain. The purpose of this article is to analyse the causes of errors in traditional measurements and then to identify adverse events related to the use of drones for the inventory of a heap of textiles intended for production purposes. On this basis, it will be possible to develop guidelines for eliminating the causes of these events in the drone-based measurement process. Work was carried out in a real environment to determine the volume and weight of textiles, including weighing a textile sample to determine the average density of the assortment, establishing a local geodetic network, terrestrial laser scanning, and a photogrammetric flight using an unmanned aerial vehicle. As a result of the analysis of the measurement data obtained in the facility, the volume and weight of the assortment and the accuracy of their determination were established. The article presents how such heaps are currently surveyed and what adverse events occur, describes the photogrammetric techniques of this type so far performed by external drones for the inventory of wind farms or station construction, and compares them with the measurement system applied to the aforementioned textile heap inside a large-format facility. Keywords: drones, unmanned aerial system, UAS, indoor system, security, process automation, cost optimization, photogrammetry, risk elimination, industry 4.0
Procedia PDF Downloads 84
749 Efficacy of Phonological Awareness Intervention for People with Language Impairment
Authors: I. Wardana Ketut, I. Suparwa Nyoman
Abstract:
This study investigated the form and characteristics of the speech sounds produced by three Balinese subjects recovering from aphasia, and intervened in their language impairment from both linguistic and neuronal points of view. The subjects' failure to judge speech sounds was caused by impairment of the motor cortex, indicating lesions in the left-hemisphere language zone. The sound articulation phenomena took the form of phoneme deletion, replacement or assimilation in individual words, and of meaning building in anomic aphasia. Balinese sound patterns were therefore elicited by showing pictures to the subjects and recorded, in order to recognise which individual consonants or vowels they produced unclearly and to find out how the sound disorder occurred. The physiology of sound production by the subjects' speech organs revealed not only the accuracy of articulation but also the severity of the lesion they suffered from. The subjects' speech sounds were investigated, classified and analysed to establish how impaired the lingual units were, and observed to clarify the weaknesses of sound characters in either place or manner of articulation. Many fricative and stop consonants were replaced by glottal or palatal sounds because cranial nerves such as the facial, trigeminal and hypoglossal nerves were impaired after the stroke. The phonological intervention was applied through a technique called phonemic articulation drill, and an examination was conducted to determine whether any change had been obtained. The findings show that some weak articulations turned into clearer sounds and that simple language meaning was conveyed. The hierarchy of functional parts of the brain played an important role in language formulation and processing. From this finding, it can be emphasised that this study supports the view that the role of the right hemisphere in recovery from aphasia is associated with functional brain reorganisation. Keywords: aphasia, intervention, phonology, stroke
Procedia PDF Downloads 194
748 Reliability Analysis of Construction Schedule Plan Based on Building Information Modelling
Authors: Lu Ren, You-Liang Fang, Yan-Gang Zhao
Abstract:
In recent years, the application of BIM (Building Information Modelling) to construction schedule planning has been the focus of more and more researchers. In order to assess the reasonableness of a BIM-based construction schedule plan, that is, whether the schedule can be completed on time, some researchers have introduced reliability theory for its evaluation. In the evaluation process, the uncertain factors affecting the construction schedule plan are regarded as random variables, and the probability distributions of the random variables are assumed to be normal, determined by two parameters evaluated from the mean and standard deviation of statistical data. However, in practical engineering, most of the uncertain influencing factors are not normal random variables, so the evaluation results of the construction schedule plan will be unreasonable under the assumption that the random variables follow the normal distribution. Therefore, in order to obtain a more reasonable evaluation result, it is necessary to describe the distribution of the random variables more comprehensively. For this purpose, the cubic normal distribution is introduced in this paper to describe the distribution of arbitrary random variables; it is determined by the first four moments (mean, standard deviation, skewness and kurtosis). In this paper, the BIM model is first built according to the design information of the structure and the construction schedule plan is made based on BIM; the cubic normal distribution is then used to describe the distribution of the random variables from the collected statistical data of the random factors influencing the construction schedule plan. Next, the reliability analysis of the BIM-based construction schedule plan can be carried out more reasonably. Finally, more accurate evaluation results can be given, providing a reference for the implementation of the actual construction schedule plan. In the last part of this paper, the improved efficiency and accuracy of the proposed methodology for the reliability analysis of the BIM-based construction schedule plan are demonstrated through a practical engineering case. Keywords: BIM, construction schedule plan, cubic normal distribution, reliability analysis
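The cubic normal idea is that an arbitrary random variable is written as a third-order polynomial of a standard normal variable, with coefficients fitted from the first four moments. A minimal Monte Carlo sketch of how such a representation could feed a schedule-reliability estimate is shown below; the coefficients, the three activities and the 90-day target are invented for illustration and are not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(7)

def cubic_normal(a, n):
    """Sample a cubic-normal variable X = a0 + a1*U + a2*U^2 + a3*U^3 with U ~ N(0, 1).
    In practice a0..a3 would be fitted to the variable's first four moments."""
    u = rng.standard_normal(n)
    return a[0] + a[1] * u + a[2] * u ** 2 + a[3] * u ** 3

n = 200_000
# Assumed durations (days) of three uncertain activities on the critical path
d1 = cubic_normal([30.0, 3.0, 0.6, 0.05], n)    # right-skewed
d2 = cubic_normal([25.0, 2.0, 0.2, 0.02], n)
d3 = cubic_normal([28.0, 2.5, 0.4, 0.03], n)
total = d1 + d2 + d3

target = 90.0                                    # planned duration (days), assumed
reliability = np.mean(total <= target)
print(f"P(schedule completed within {target:.0f} days) = {reliability:.3f}")
print("sample skewness of total duration:",
      round(float(((total - total.mean()) ** 3).mean() / total.std() ** 3), 3))
```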
Procedia PDF Downloads 146
747 Microfungi on Sandy Beaches: Potential Threats for People Enjoying Lakeside Recreation
Authors: Tomasz Balabanski, Anna Biedunkiewicz
Abstract:
Research on basic bacteriological and physicochemical parameters conducted by state institutions (Provincial Sanitary and Epidemiological Station and District Sanitary and Epidemiological Station) are limited to bathing waters under constant sanitary and epidemiological supervision. Unfortunately, no routine or monitoring tests are carried out for the presence of microfungi. This also applies to beach sand used for recreational purposes. The purpose of the planned own research was to determine the diversity of the mycobiota present on supervised and unsupervised sandy beaches, on the shores of lakes, of municipal baths used for recreation. The research material consisted of microfungi isolated from April to October 2019 from sandy beaches of supervised and unsupervised lakes located within the administrative boundaries of the city of Olsztyn (North-Eastern Poland, Europe). Four lakes, out of the fifteen available (Tyrsko, Kortowskie, Skanda, and Ukiel), whose bathing waters are subjected to routine bacteriological tests, were selected for testing. To compare the diversity of the mycobiota composition on the surface and below the sand mixing layer, samples were taken from two depths (10 cm and 50 cm), using a soil auger. Micro-fungi from sand samples were obtained by surface inoculation on an RBC medium from the 1st dilution (1:10). After incubation at 25°C for 96-144 h, the average number of CFU/dm³ was counted. Morphologically differing yeast colonies were passaged into Sabouraud agar slants with gentamicin and incubated again. For detailed laboratory analyses, culture methods (macro- and micro-cultures) and identification methods recommended in diagnostic mycological laboratories were used. The conducted research allowed obtaining 140 yeast isolates. The total average population ranged from 1.37 × 10⁻² CFU/dm³ before the bathing season (April 2019), 1.64 × 10⁻³ CFU/dm³ in the season (May-September 2019), and 1.60 × 10⁻² CFU/dm³ after the end of the season (October 2019). More microfungi were obtained from the surface layer of sand (100 isolates) than from the deeper layer (40 isolates). Reported microfungi may circulate seasonally between individual elements of the lake ecosystem. From the sand/soil from the catchment area beaches, they can get into bathing waters, stopping periodically on the coastal phyllosphere. The sand of the beaches and the phyllosphere are a kind of filter for the water reservoir. The presence of microfungi with various pathogenicity potential in these places is of major epidemiological importance. Therefore, full monitoring of not only recreational waters but also sandy beaches should be treated as an element of constant control by appropriate supervisory institutions, allowing recreational areas for public use so that the use of these places does not involve the risk of infection. Acknowledgment: 'Development Program of the University of Warmia and Mazury in Olsztyn', POWR.03.05.00-00-Z310/17, co-financed by the European Union under the European Social Fund from the Operational Program Knowledge Education Development. Tomasz Bałabański is a recipient of a scholarship from the Programme Interdisciplinary Doctoral Studies in Biology and Biotechnology (POWR.03.05.00-00-Z310/17), which is funded by the 'European Social Fund'.Keywords: beach, microfungi, sand, yeasts
Procedia PDF Downloads 102
746 Evaluation of Automated Analyzers of Polycyclic Aromatic Hydrocarbons and Black Carbon in a Coke Oven Plant by Comparison with Analytical Methods
Authors: L. Angiuli, L. Trizio, R. Giua, A. Digilio, M. Tutino, P. Dambruoso, F. Mazzone, C. M. Placentino
Abstract:
In the winter of 2014, a series of measurements was performed to evaluate the behaviour of real-time PAH and black carbon analyzers in a coke oven plant located in Taranto, a city in Southern Italy. Data were collected both inside and outside the plant, at air quality monitoring sites, and simultaneous measurements of PM2.5 and PM1 were performed. Particle-bound PAHs were measured by two methods: (1) aerosol photoionization using an Ecochem PAS 2000 analyzer, and (2) PM2.5 and PM1 quartz filter collection and analysis by gas chromatography/mass spectrometry (GC/MS). Black carbon was determined both in real time by a Magee Aethalometer AE22 analyzer and by a semi-continuous Sunset Lab EC/OC instrument. Detected PM2.5 and PM1 levels were higher inside than outside the plant, while real-time PAH values were higher outside than inside. As regards PAHs, inside the plant the Ecochem PAS 2000 gave concentrations not significantly different from those determined on the filter on low-pollution days, but at increasing concentrations the automated instrument underestimated PAH levels. At the external site, Ecochem PAS 2000 real-time concentrations were consistently higher than those on the filter. Similarly, real-time black carbon values were consistently lower than the EC concentrations obtained by the Sunset EC/OC instrument at the inner site, while outside the plant real-time values were comparable to the Sunset EC values. The results show that, in a coke plant, real-time analyzers of PAHs and black carbon in the factory configuration provide only qualitative information, with poor accuracy, and lead to underestimation of the concentrations. A site-specific calibration is needed for these instruments before their installation at highly polluted sites. Keywords: black carbon, coke oven plant, PAH, PAS, aethalometer
Procedia PDF Downloads 343
745 Evaluating Structural Crack Propagation Induced by Soundless Chemical Demolition Agent Using an Energy Release Rate Approach
Authors: Shyaka Eugene
Abstract:
The efficient and safe demolition of structures is a critical challenge in civil engineering and construction. This study focuses on the development of optimal demolition strategies by investigating the crack propagation behavior in beams induced by soundless cracking agents. It is commonly used in controlled demolition and has gained prominence due to its non-explosive and environmentally friendly nature. This research employs a comprehensive experimental and computational approach to analyze the crack initiation, propagation, and eventual failure in beams subjected to soundless cracking agents. Experimental testing involves the application of various cracking agents under controlled conditions to understand their effects on the structural integrity of beams. High-resolution imaging and strain measurements are used to capture the crack propagation process. In parallel, numerical simulations are conducted using advanced finite element analysis (FEA) techniques to model crack propagation in beams, considering various parameters such as cracking agent composition, loading conditions, and beam properties. The FEA models are validated against experimental results, ensuring their accuracy in predicting crack propagation patterns. The findings of this study provide valuable insights into optimizing demolition strategies, allowing engineers and demolition experts to make informed decisions regarding the selection of cracking agents, their application techniques, and structural reinforcement methods. Ultimately, this research contributes to enhancing the safety, efficiency, and sustainability of demolition practices in the construction industry, reducing environmental impact and ensuring the protection of adjacent structures and the surrounding environment.Keywords: expansion pressure, energy release rate, soundless chemical demolition agent, crack propagation
Procedia PDF Downloads 61
744 Finite Element Analysis for Earing Prediction Incorporating the BBC2003 Material Model with Fully Implicit Integration Method: Derivation and Numerical Algorithm
Authors: Sajjad Izadpanah, Seyed Hadi Ghaderi, Morteza Sayah Irani, Mahdi Gerdooei
Abstract:
In this research work, a sophisticated yield criterion known as BBC2003, capable of describing planar anisotropic behaviors of aluminum alloy sheets, was integrated into the commercial finite element code ABAQUS/Standard via a user subroutine. The complete formulation of the implementation process using a fully implicit integration scheme, i.e., the classic backward Euler method, is presented, and relevant aspects of the yield criterion are introduced. In order to solve nonlinear differential and algebraic equations, the line-search algorithm was adopted in the user-defined material subroutine (UMAT) to expand the convergence domain of the iterative Newton-Raphson method. The developed subroutine was used to simulate a challenging computational problem with complex stress states, i.e., deep drawing of an anisotropic aluminum alloy AA3105. The accuracy and stability of the developed subroutine were confirmed by comparing the numerically predicted earing and thickness variation profiles with the experimental results, which showed an excellent agreement between numerical and experimental earing and thickness profiles. The integration of the BBC2003 yield criterion into ABAQUS/Standard represents a significant contribution to the field of computational mechanics and provides a useful tool for analyzing the mechanical behavior of anisotropic materials subjected to complex loading conditions.Keywords: BBC2003 yield function, plastic anisotropy, fully implicit integration scheme, line search algorithm, explicit and implicit integration schemes
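The integration scheme described above is a backward-Euler return mapping solved by Newton-Raphson and stabilised with a line search. The sketch below shows that skeleton on a deliberately simple stand-in, one-dimensional von Mises plasticity with linear isotropic hardening, rather than the anisotropic BBC2003 criterion; the material constants are assumed, and only the algorithmic structure (trial state, yield check, damped Newton iteration on the consistency equation) mirrors the UMAT described in the abstract.

```python
E, sigma_y0, H = 70e3, 120.0, 1.5e3          # MPa: Young's modulus, yield stress, hardening (assumed)

def return_mapping(strain_inc, eps_p, alpha, sigma_old, tol=1e-10, max_iter=50):
    """Backward-Euler return mapping for 1D von Mises plasticity with linear hardening."""
    sigma_trial = sigma_old + E * strain_inc
    f_trial = abs(sigma_trial) - (sigma_y0 + H * alpha)
    if f_trial <= 0.0:                        # elastic step
        return sigma_trial, eps_p, alpha

    dgamma = 0.0
    for _ in range(max_iter):
        f = abs(sigma_trial) - E * dgamma - (sigma_y0 + H * (alpha + dgamma))
        if abs(f) < tol:
            break
        step = f / (E + H)                    # Newton step on the consistency equation
        # Backtracking line search: shrink the step until the residual decreases
        t = 1.0
        while abs(abs(sigma_trial) - E * (dgamma + t * step)
                  - (sigma_y0 + H * (alpha + dgamma + t * step))) > abs(f) and t > 1e-4:
            t *= 0.5
        dgamma += t * step

    sign = 1.0 if sigma_trial >= 0 else -1.0
    sigma_new = sigma_trial - E * dgamma * sign
    return sigma_new, eps_p + dgamma * sign, alpha + dgamma

# Drive a material point through a strain history and print the stress response
sigma, eps_p, alpha = 0.0, 0.0, 0.0
for d_eps in [0.001] * 5 + [-0.002] * 3:
    sigma, eps_p, alpha = return_mapping(d_eps, eps_p, alpha, sigma)
    print(f"d_eps={d_eps:+.4f}  sigma={sigma:8.2f} MPa  alpha={alpha:.5f}")
```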
Procedia PDF Downloads 73
743 Simulation-based Decision Making on Intra-hospital Patient Referral in a Collaborative Medical Alliance
Authors: Yuguang Gao, Mingtao Deng
Abstract:
The integration of independently operating hospitals into a unified healthcare service system has become a strategic imperative in the pursuit of hospitals’ high-quality development. Central to the concept of group governance over such transformation, exemplified by a collaborative medical alliance, is the delineation of shared value, vision, and goals. Given the inherent disparity in capabilities among hospitals within the alliance, particularly in the treatment of different diseases characterized by Disease Related Groups (DRG) in terms of effectiveness, efficiency and resource utilization, this study aims to address the centralized decision-making of intra-hospital patient referral within the medical alliance to enhance the overall production and quality of service provided. We first introduce the notion of production utility, where a higher production utility for a hospital implies better performance in treating patients diagnosed with that specific DRG group of diseases. Then, a Discrete-Event Simulation (DES) framework is established for patient referral among hospitals, where patient flow modeling incorporates a queueing system with fixed capacities for each hospital. The simulation study begins with a two-member alliance. The pivotal strategy examined is a "whether-to-refer" decision triggered when the bed usage rate surpasses a predefined threshold for either hospital. Then, the decision encompasses referring patients to the other hospital based on DRG groups’ production utility differentials as well as bed availability. The objective is to maximize the total production utility of the alliance while minimizing patients’ average length of stay and turnover rate. Thus the parameter under scrutiny is the bed usage rate threshold, influencing the efficacy of the referral strategy. Extending the study to a three-member alliance, which could readily be generalized to multi-member alliances, we maintain the core setup while introducing an additional “which-to-refer" decision that involves referring patients with specific DRG groups to the member hospital according to their respective production utility rankings. The overarching goal remains consistent, for which the bed usage rate threshold is once again a focal point for analysis. For the two-member alliance scenario, our simulation results indicate that the optimal bed usage rate threshold hinges on the discrepancy in the number of beds between member hospitals, the distribution of DRG groups among incoming patients, and variations in production utilities across hospitals. Transitioning to the three-member alliance, we observe similar dependencies on these parameters. Additionally, it becomes evident that an imbalanced distribution of DRG diagnoses and further disparity in production utilities among member hospitals may lead to an increase in the turnover rate. In general, it was found that the intra-hospital referral mechanism enhances the overall production utility of the medical alliance compared to individual hospitals without partnership. Patients’ average length of stay is also reduced, showcasing the positive impact of the collaborative approach. However, the turnover rate exhibits variability based on parameter setups, particularly when patients are redirected within the alliance. 
In conclusion, the re-structuring of diagnostic disease groups within the medical alliance proves instrumental in improving overall healthcare service outcomes, providing a compelling rationale for the government's promotion of patient referrals within collaborative medical alliances.Keywords: collaborative medical alliance, disease related group, patient referral, simulation
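The whether-to-refer rule, sending an arriving patient to the partner hospital once the home hospital's bed-usage rate crosses a threshold and the partner has both a free bed and a higher production utility for that DRG group, can be sketched as a small discrete-event loop. All numbers below (bed counts, arrival and stay distributions, utilities, the 0.85 threshold) are invented for illustration; only the decision logic follows the abstract.

```python
import heapq, random

random.seed(3)

BEDS = {"A": 40, "B": 25}                       # assumed bed capacities
UTIL = {"A": {"DRG1": 1.0, "DRG2": 0.6},        # assumed production utilities per DRG group
        "B": {"DRG1": 0.7, "DRG2": 0.9}}
THRESHOLD = 0.85                                 # bed-usage rate that triggers referral

occupied = {"A": 0, "B": 0}
discharges = []                                  # (time, hospital) min-heap
total_utility, referrals = 0.0, 0

def usage(h):
    return occupied[h] / BEDS[h]

t = 0.0
for _ in range(2000):                            # 2000 arrivals at hospital A
    t += random.expovariate(1 / 0.5)             # mean inter-arrival time 0.5 h
    while discharges and discharges[0][0] <= t:  # free beds whose stay has ended
        _, h = heapq.heappop(discharges)
        occupied[h] -= 1

    drg = random.choice(["DRG1", "DRG2"])
    dest = "A"
    if (usage("A") > THRESHOLD and occupied["B"] < BEDS["B"]
            and UTIL["B"][drg] > UTIL["A"][drg]):
        dest, referrals = "B", referrals + 1     # whether-to-refer decision

    if occupied[dest] < BEDS[dest]:
        occupied[dest] += 1
        heapq.heappush(discharges, (t + random.expovariate(1 / 24.0), dest))  # mean stay 24 h
        total_utility += UTIL[dest][drg]

print(f"referrals: {referrals}, total production utility: {total_utility:.1f}")
```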
Procedia PDF Downloads 57
742 Web Data Scraping Technology Using Term Frequency Inverse Document Frequency to Enhance the Big Data Quality on Sentiment Analysis
Authors: Sangita Pokhrel, Nalinda Somasiri, Rebecca Jeyavadhanam, Swathi Ganesan
Abstract:
Tourism is a booming industry with huge future potential for global wealth and employment. Countless data are generated over social media sites every day, creating numerous opportunities to bring more insights to decision-makers. The integration of Big Data technology into the tourism industry will allow companies to learn where their customers have been and what they like. This information can then be used by businesses, such as those managing visitor centers or hotels, and tourists can get a clear idea of places before visiting. On the technical side, natural language is processed by analysing the sentiment features of online reviews from tourists, and we then supply an enhanced long short-term memory (LSTM) framework for sentiment feature extraction from travel reviews. We constructed a web review database using a crawler and web scraping techniques for experimental validation to evaluate the effectiveness of our methodology. The sentences were first classified with the VADER and RoBERTa models to obtain the polarity of the reviews. In this paper, we study feature extraction methods such as count vectorization and TF-IDF vectorization, and implement a Convolutional Neural Network (CNN) classifier for sentiment analysis to decide whether the tourist's attitude towards a destination is positive, negative, or simply neutral, based on the review text posted online. The results demonstrate that, after pre-processing and cleaning the dataset, the CNN algorithm achieved an accuracy of 96.12% for positive and negative sentiment analysis. Keywords: count vectorization, convolutional neural network, crawler, data technology, long short-term memory, web scraping, sentiment analysis
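A minimal sketch of the vectorization comparison described above is given below, using scikit-learn's CountVectorizer and TfidfVectorizer on toy review snippets. A plain logistic regression stands in for the CNN classifier so the example stays short and self-contained; the reviews, labels and split are placeholders, not the scraped dataset.

```python
from sklearn.feature_extraction.text import CountVectorizer, TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# Toy review snippets with 1 = positive, 0 = negative (placeholders for scraped tourist reviews)
reviews = ["loved the beach and the friendly staff", "terrible food and a dirty room",
           "amazing views, will visit again", "overpriced and very noisy at night",
           "great location near the old town", "worst trip ever, avoid this place"] * 50
labels = [1, 0, 1, 0, 1, 0] * 50

for name, vec in [("count", CountVectorizer()), ("tfidf", TfidfVectorizer())]:
    X = vec.fit_transform(reviews)
    X_tr, X_te, y_tr, y_te = train_test_split(X, labels, test_size=0.3, random_state=0)
    clf = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)   # stand-in for the CNN classifier
    print(f"{name} vectorization accuracy: {accuracy_score(y_te, clf.predict(X_te)):.3f}")
```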
Procedia PDF Downloads 85
741 Classifying Affective States in Virtual Reality Environments Using Physiological Signals
Authors: Apostolos Kalatzis, Ashish Teotia, Vishnunarayan Girishan Prabhu, Laura Stanley
Abstract:
Emotions are functional behaviors influenced by thoughts, stimuli, and other factors that induce neurophysiological changes in the human body. Understanding and classifying emotions are challenging as individuals have varying perceptions of their environments. Therefore, it is crucial that there are publicly available databases and virtual reality (VR) based environments that have been scientifically validated for assessing emotional classification. This study utilized two commercially available VR applications (Guided Meditation VR™ and Richie’s Plank Experience™) to induce acute stress and calm state among participants. Subjective and objective measures were collected to create a validated multimodal dataset and classification scheme for affective state classification. Participants’ subjective measures included the use of the Self-Assessment Manikin, emotional cards and 9 point Visual Analogue Scale for perceived stress, collected using a Virtual Reality Assessment Tool developed by our team. Participants’ objective measures included Electrocardiogram and Respiration data that were collected from 25 participants (15 M, 10 F, Mean = 22.28 4.92). The features extracted from these data included heart rate variability components and respiration rate, both of which were used to train two machine learning models. Subjective responses validated the efficacy of the VR applications in eliciting the two desired affective states; for classifying the affective states, a logistic regression (LR) and a support vector machine (SVM) with a linear kernel algorithm were developed. The LR outperformed the SVM and achieved 93.8%, 96.2%, 93.8% leave one subject out cross-validation accuracy, precision and recall, respectively. The VR assessment tool and data collected in this study are publicly available for other researchers.Keywords: affective computing, biosignals, machine learning, stress database
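The classification step, leave-one-subject-out cross-validation of a logistic regression and a linear-kernel SVM on physiological features, can be sketched as follows. The synthetic features, the number of windows per participant and the effect sizes are assumptions; only the evaluation protocol mirrors the abstract.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.svm import SVC
from sklearn.model_selection import LeaveOneGroupOut, cross_val_score

rng = np.random.default_rng(42)

# Synthetic stand-ins: 25 participants x 20 windows, 4 physiological features
# (e.g. mean RR interval, RMSSD, LF/HF ratio, respiration rate); 1 = stress, 0 = calm
n_subj, n_win = 25, 20
groups = np.repeat(np.arange(n_subj), n_win)
y = np.tile(np.r_[np.zeros(n_win // 2), np.ones(n_win // 2)], n_subj).astype(int)
X = rng.normal(0, 1, size=(n_subj * n_win, 4)) + y[:, None] * [0.8, -0.9, 0.7, 1.0]

logo = LeaveOneGroupOut()
for name, clf in [("logistic regression", LogisticRegression(max_iter=1000)),
                  ("linear SVM", SVC(kernel="linear"))]:
    scores = cross_val_score(clf, X, y, cv=logo, groups=groups)
    print(f"{name}: leave-one-subject-out accuracy = {scores.mean():.3f}")
```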
Procedia PDF Downloads 140
740 Impacts of Urbanization on Forest and Agriculture Areas in Savannakhet Province, Lao People's Democratic Republic
Authors: Chittana Phompila
Abstract:
The current population increase pushes up demand for natural resources and living space. In Laos, urban areas have been expanding rapidly in recent years, and rapid urbanization can have negative impacts on landscapes, including forest and agricultural lands. The primary objectives of this research were 1) to map current urban areas in a large city in Savannakhet province, Laos, 2) to compare changes in urbanization between 1990 and 2018, and 3) to estimate the forest and agricultural areas lost due to expansion of urban areas during the last over twenty years within the study area. Landsat 8 data were used, and existing GIS data were collected, including spatial data on rivers, lakes, roads, vegetated areas and other land use/land covers. The GIS data were obtained from government sectors. An object-based classification (OBC) approach was applied in eCognition for image processing and analysis of urban areas. Historical data from other Landsat instruments (Landsat 5 and 7) allowed us to compare changes in urbanization in 1990, 2000, 2010 and 2018 in this study area. Only three main land cover classes were considered and classified, namely forest, agriculture and urban areas. A change detection approach was applied to illustrate changes in built-up areas over these periods. Our study shows that the overall accuracy of the map was 95%, with kappa ~ 0.8. It is found that there is ineffective control over forest and land-use conversion from forest and agriculture to urban areas in many main cities across the province. A large area of agriculture and forest has been lost through this conversion. Uncontrolled urban expansion and inappropriate land use planning can put pressure on resource utilisation and, as a consequence, can lead to food insecurity and national economic downturn in the long term. Keywords: urbanisation, forest cover, agriculture areas, Landsat 8 imagery
Procedia PDF Downloads 157
739 Load Forecasting in Microgrid Systems with R and Cortana Intelligence Suite
Authors: F. Lazzeri, I. Reiter
Abstract:
Energy production optimization has been traditionally very important for utilities in order to improve resource consumption. However, load forecasting is a challenging task, as there are a large number of relevant variables that must be considered, and several strategies have been used to deal with this complex problem. This is especially true also in microgrids where many elements have to adjust their performance depending on the future generation and consumption conditions. The goal of this paper is to present a solution for short-term load forecasting in microgrids, based on three machine learning experiments developed in R and web services built and deployed with different components of Cortana Intelligence Suite: Azure Machine Learning, a fully managed cloud service that enables to easily build, deploy, and share predictive analytics solutions; SQL database, a Microsoft database service for app developers; and PowerBI, a suite of business analytics tools to analyze data and share insights. Our results show that Boosted Decision Tree and Fast Forest Quantile regression methods can be very useful to predict hourly short-term consumption in microgrids; moreover, we found that for these types of forecasting models, weather data (temperature, wind, humidity and dew point) can play a crucial role in improving the accuracy of the forecasting solution. Data cleaning and feature engineering methods performed in R and different types of machine learning algorithms (Boosted Decision Tree, Fast Forest Quantile and ARIMA) will be presented, and results and performance metrics discussed.
Keywords: time-series, features engineering methods for forecasting, energy demand forecasting, Azure Machine Learning
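The authors' experiments are built in R and Azure Machine Learning; purely as an illustration of the same workflow (calendar and lag feature engineering followed by a boosted-tree regressor on hourly data), a Python sketch is given below. The synthetic load series, the column names and the one-week holdout are assumptions, not the paper's data or code.

```python
import numpy as np
import pandas as pd
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.metrics import mean_absolute_error

rng = np.random.default_rng(0)

# Synthetic hourly microgrid data: load driven by hour of day, temperature and noise (assumed)
idx = pd.date_range("2024-01-01", periods=24 * 120, freq="h")
temp = 10 + 8 * np.sin(2 * np.pi * (idx.dayofyear / 365)) + rng.normal(0, 1, len(idx))
load = 50 + 20 * np.sin(2 * np.pi * idx.hour / 24) + 0.8 * temp + rng.normal(0, 2, len(idx))
df = pd.DataFrame({"load": load, "temperature": temp}, index=idx)

# Feature engineering: calendar features plus lagged load, as in typical short-term forecasting
df["hour"] = df.index.hour
df["dayofweek"] = df.index.dayofweek
df["load_lag24"] = df["load"].shift(24)
df = df.dropna()

train, test = df.iloc[:-24 * 7], df.iloc[-24 * 7:]          # hold out the last week
features = ["temperature", "hour", "dayofweek", "load_lag24"]

model = GradientBoostingRegressor().fit(train[features], train["load"])
pred = model.predict(test[features])
print("MAE on held-out week:", round(mean_absolute_error(test["load"], pred), 2))
```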
Procedia PDF Downloads 295
738 Somatic Delusional Disorder Subsequent to Phantogeusia: A Case Report
Authors: Pedro Felgueiras, Ana Miguel, Nélson Almeida, Raquel Silva
Abstract:
Objective: Through the study of a clinical case of delusional somatic disorder secondary to phantogeusia, we aim to highlight the importance of considering psychosomatic conditions in differential diagnosis, as well as to emphasize the complexity of its comprehension, treatment, and respective impact on patients’ functioning. Methods: Bearing this in mind, we conducted a critical analysis of a case series based on patient observations, clinical data, and complementary diagnostic methods, as well as a non-systematic review of the literature on the subject. Results: A 61-year-old female patient with no history of psychiatric conditions. Family psychiatric history of mood disorder (depression), with psychotic features found in her mother. Medical history of many comorbidities affecting different organ systems (endocrine, gastrointestinal, genitourinary, ophthalmological). Documented neuroticism traits of personality. The patient’s family described a persistent concern about several physical symptoms across her life, with a continuous effort to obtain explanations about any sensation out of her normal perception. Since being subjected to endoscopy in 2018, she started complaints of persistent phantogeusia (acid taste) and developed excessive thoughts, feelings, and behaviors associated with this somatic symptom. The patient was evaluated by several medical specialties, and an extensive panel of medical exams was carried out, excluding any disease. Besides all the investigation and with no evidence of disease signs, acute anxiety, time, and energy dispended to this symptom culminated in severe psychosocial impairment. The patient was admitted to a psychiatric ward for investigation and treatment of this clinical picture, leading to the diagnosis of the delusional somatic disorder. In order to exclude the acute organic etiology of this psychotic disorder, an analytic panel was carried out with no abnormal results. In the context of a psychotic clinical picture, a CT scan was performed, which revealed a right cortical vascular lesion. Neuropsychological evaluation was made, with the description of cognitive functioning being globally normative. During treatment with an antipsychotic (pimozide), a complete remission of the somatic delusion was associated with the disappearance of gustative perception disturbance. In follow-up, a relapse of gustative sensation was documented, and her thoughts and speech were dominated by concerns about multiple somatic symptoms. Conclusion: In terms of abnormal bodily sensations, the oral cavity is one of the frequent sites of delusional disorder. Patients with these gustatory perception distortions complain about unusual sensations without corresponding abnormal findings in the oral area. Its pathophysiology has not been fully elucidated yet. In terms of its comprehensive psychopathology, this case was hypothesized as a paranoid development of a delusional somatic disorder triggered by a post-invasive procedure phantogeusia (which is described as a possible side effect of an endoscopy) in a patient with an anankastic personality. This case presents interesting psychopathology, reinforcing the complexity of psychosomatic disorders in terms of their etiopathogenesis, clinical treatment, and long-term prognosis.Keywords: psychosomatics, delusional somatic disorder, phantogeusia, paranoid development
Procedia PDF Downloads 127
737 Dermoscopy Compliance: Improving Melanoma Detection Pathways Through Quality Improvement
Authors: Max Butler
Abstract:
Melanoma accounts for 80% of skin cancer-related deaths globally. The poor prognosis and increasing incidence of melanoma impose a significant burden on global healthcare systems. Early detection, precise diagnosis, and preventative strategies are critical to improving patient outcomes. Dermoscopy is the gold standard for specialist assessment of pigmented skin lesions, as it can differentiate between benign and malignant growths with greater accuracy than visual inspection. In the United Kingdom, guidelines from the National Institute for Health and Care Excellence (NICE) state that dermoscopy should be used in all specialist assessments of pigmented skin lesions. Compliance with this guideline is low, resulting in missed and delayed melanoma diagnoses. To address this problem, a quality improvement project was initiated at Buckinghamshire Healthcare Trust (BHT) within the plastic surgery department. The target group was trainee and consultant plastic surgeons conducting outpatient skin cancer clinics. Analysis of clinic documentation over a one-month period found that only 62% (38/61) of patients referred with pigmented skin lesions were examined using dermoscopy. To increase dermoscopy rates, teaching was delivered to the department highlighting the national guidelines and the evidence base for dermoscopic examination. In addition, the clinic paperwork was redesigned to include a text box for dermoscopic examination. Re-auditing after the intervention found a significant increase in dermoscopy rates (52/61, p = 0.014). In conclusion, implementing a quality improvement project with targeted teaching and a redesigned documentation template successfully increased dermoscopy rates. This is a promising step toward improving early melanoma detection and patient outcomes. Keywords: melanoma, dermoscopy, plastic surgery, quality improvement
Procedia PDF Downloads 69
736 DMBR-Net: Deep Multiple-Resolution Bilateral Networks for Real-Time and Accurate Semantic Segmentation
Authors: Pengfei Meng, Shuangcheng Jia, Qian Li
Abstract:
We proposed a real-time high-precision semantic segmentation network based on a multi-resolution feature fusion module, the auxiliary feature extracting module, upsampling module, and atrous spatial pyramid pooling (ASPP) module. We designed a feature fusion structure, which is integrated with sufficient features of different resolutions. We also studied the effect of side-branch structure on the network and made discoveries. Based on the discoveries about the side-branch of the network structure, we used a side-branch auxiliary feature extraction layer in the network to improve the effectiveness of the network. We also designed upsampling module, which has better results than the original upsampling module. In addition, we also re-considered the locations and number of atrous spatial pyramid pooling (ASPP) modules and modified the network structure according to the experimental results to further improve the effectiveness of the network. The network presented in this paper takes the backbone network of Bisenetv2 as a basic network, based on which we constructed a network structure on which we made improvements. We named this network deep multiple-resolution bilateral networks for real-time, referred to as DMBR-Net. After experimental testing, our proposed DMBR-Net network achieved 81.2% mIoU at 119FPS on the Cityscapes validation dataset, 80.7% mIoU at 109FPS on the CamVid test dataset, 29.9% mIoU at 78FPS on the COCOStuff test dataset. Compared with all lightweight real-time semantic segmentation networks, our network achieves the highest accuracy at an appropriate speed.Keywords: multi-resolution feature fusion, atrous convolutional, bilateral networks, pyramid pooling
Procedia PDF Downloads 149
735 Assessing Denitrification-Disintegration Model's Efficacy in Simulating Greenhouse Gas Emissions, Crop Growth, Yield, and Soil Biochemical Processes in Moroccan Context
Authors: Mohamed Boullouz, Mohamed Louay Metougui
Abstract:
Accurate modeling of greenhouse gas (GHG) emissions, crop growth, soil productivity, and biochemical processes is crucial given escalating global concerns about climate change and the urgent need to improve agricultural sustainability. This study thoroughly investigates the application of the denitrification-disintegration (DNDC) model in the context of Morocco's unique agro-climate. Our main research hypothesis is that the DNDC model offers an effective and powerful tool for precisely simulating a wide range of significant parameters, including greenhouse gas emissions, crop growth, yield potential, and complex soil biogeochemical processes, consistent with the intricate features of Moroccan agricultural environments. To verify this hypothesis, a large body of field data covering Morocco's various agricultural regions and encompassing a range of soil types, climatic factors, and crop varieties must be gathered. These experimental data sets will serve as the foundation for careful model calibration and subsequent validation, ensuring the accuracy of the simulation results. In conclusion, the prospective research findings will add to the global conversation on climate-resilient agricultural practices while promoting sustainable agricultural models in Morocco. Recognition of the DNDC model as a potent simulation tool tailored to Moroccan conditions may strengthen the ability of policy makers and agricultural actors to make informed decisions that advance not only food security but also environmental stability. Keywords: greenhouse gas emissions, DNDC model, sustainable agriculture, Moroccan cropping systems
Procedia PDF Downloads 63
734 Hybridization of Manually Extracted and Convolutional Features for Classification of Chest X-Ray of COVID-19
Authors: M. Bilal Ishfaq, Adnan N. Qureshi
Abstract:
COVID-19 is the most infectious disease these days, it was first reported in Wuhan, the capital city of Hubei in China then it spread rapidly throughout the whole world. Later on 11 March 2020, the World Health Organisation (WHO) declared it a pandemic. Since COVID-19 is highly contagious, it has affected approximately 219M people worldwide and caused 4.55M deaths. It has brought the importance of accurate diagnosis of respiratory diseases such as pneumonia and COVID-19 to the forefront. In this paper, we propose a hybrid approach for the automated detection of COVID-19 using medical imaging. We have presented the hybridization of manually extracted and convolutional features. Our approach combines Haralick texture features and convolutional features extracted from chest X-rays and CT scans. We also employ a minimum redundancy maximum relevance (MRMR) feature selection algorithm to reduce computational complexity and enhance classification performance. The proposed model is evaluated on four publicly available datasets, including Chest X-ray Pneumonia, COVID-19 Pneumonia, COVID-19 CTMaster, and VinBig data. The results demonstrate high accuracy and effectiveness, with 0.9925 on the Chest X-ray pneumonia dataset, 0.9895 on the COVID-19, Pneumonia and Normal Chest X-ray dataset, 0.9806 on the Covid CTMaster dataset, and 0.9398 on the VinBig dataset. We further evaluate the effectiveness of the proposed model using ROC curves, where the AUC for the best-performing model reaches 0.96. Our proposed model provides a promising tool for the early detection and accurate diagnosis of COVID-19, which can assist healthcare professionals in making informed treatment decisions and improving patient outcomes. The results of the proposed model are quite plausible and the system can be deployed in a clinical or research setting to assist in the diagnosis of COVID-19.Keywords: COVID-19, feature engineering, artificial neural networks, radiology images
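A hedged sketch of the hybrid feature idea, concatenating Haralick-style GLCM texture statistics with CNN-derived features and then selecting the most informative ones, is given below. The random images, the stand-in CNN features and the greedy mutual-information ranking (a simplification of full MRMR) are assumptions for illustration only.

```python
import numpy as np
from skimage.feature import graycomatrix, graycoprops   # spelled 'greycomatrix' in older scikit-image
from sklearn.feature_selection import mutual_info_classif

rng = np.random.default_rng(5)

def texture_features(img):
    """Haralick-style GLCM statistics for one 8-bit grayscale image."""
    glcm = graycomatrix(img, distances=[1], angles=[0, np.pi / 2],
                        levels=256, symmetric=True, normed=True)
    props = ["contrast", "homogeneity", "energy", "correlation"]
    return np.hstack([graycoprops(glcm, p).ravel() for p in props])

# Placeholder data: 40 random 64x64 'X-rays' and binary labels (COVID vs. normal)
images = rng.integers(0, 256, size=(40, 64, 64), dtype=np.uint8)
labels = rng.integers(0, 2, size=40)

handcrafted = np.vstack([texture_features(im) for im in images])
cnn_features = rng.normal(size=(40, 16))                # stand-in for pooled CNN activations
X = np.hstack([handcrafted, cnn_features])              # hybrid feature vector

# Relevance-based ranking (a simplified stand-in for the full MRMR criterion)
relevance = mutual_info_classif(X, labels, random_state=0)
top = np.argsort(relevance)[::-1][:10]
print("selected feature indices:", top)
```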
Procedia PDF Downloads 75
733 An Investigation of the Effects of Gripping Systems in Geosynthetic Shear Testing
Authors: Charles Sikwanda
Abstract:
The use of geosynthetic materials in geotechnical engineering projects has rapidly increased over the past several years. These materials have resulted in improved performance and cost reduction of geotechnical structures as compared to the use of conventional materials. However, working with geosynthetics requires knowledge of interface parameters for design. These parameters are typically determined by the large direct shear device in accordance with ASTM-D5321 and ASTM-D6243 standards. Although these laboratory tests are standardized, the quality of the results can be largely affected by several factors that include; the shearing rate, applied normal stress, gripping mechanism, and type of the geosynthetic specimens tested. Amongst these factors, poor surface gripping of a specimen is the major source of the discrepancy. If the specimen is inadequately secured to the shearing blocks, it experiences progressive failure and shear strength that deviates from the true field performance of the tested material. This leads to inaccurate, unsafe, and cost ineffective designs of projects. Currently, the ASTM-D5321 and ASTM-D6243 standards do not provide a standardized gripping system for geosynthetic shear strength testing. Over the years, researchers have come up with different gripping systems that can be used such as; glue, metal textured surface, sandblasting, and sandpaper. However, these gripping systems are regularly not adequate to sufficiently secure the tested specimens to the shearing device. This has led to large variability in test results and difficulties in results interpretation. Therefore, this study was aimed at determining the effects of gripping systems in geosynthetic interface shear strength testing using a 300 x 300 mm direct shear box. The results of the research will contribute to easy data interpretation and increase result accuracy and reproducibility.Keywords: geosynthetics, shear strength parameters, gripping systems, gripping
Procedia PDF Downloads 202
732 Comparison of Agree Method and Shortest Path Method for Determining the Flow Direction in Basin Morphometric Analysis: Case Study of Lower Tapi Basin, Western India
Authors: Jaypalsinh Parmar, Pintu Nakrani, Bhaumik Shah
Abstract:
Digital Elevation Model (DEM) is elevation data of the virtual grid on the ground. DEM can be used in application in GIS such as hydrological modelling, flood forecasting, morphometrical analysis and surveying etc.. For morphometrical analysis the stream flow network plays a very important role. DEM lacks accuracy and cannot match field data as it should for accurate results of morphometrical analysis. The present study focuses on comparing the Agree method and the conventional Shortest path method for finding out morphometric parameters in the flat region of the Lower Tapi Basin which is located in the western India. For the present study, open source SRTM (Shuttle Radar Topography Mission with 1 arc resolution) and toposheets issued by Survey of India (SOI) were used to determine the morphometric linear aspect such as stream order, number of stream, stream length, bifurcation ratio, mean stream length, mean bifurcation ratio, stream length ratio, length of overland flow, constant of channel maintenance and aerial aspect such as drainage density, stream frequency, drainage texture, form factor, circularity ratio, elongation ratio, shape factor and relief aspect such as relief ratio, gradient ratio and basin relief for 53 catchments of Lower Tapi Basin. Stream network was digitized from the available toposheets. Agree DEM was created by using the SRTM and stream network from the toposheets. The results obtained were used to demonstrate a comparison between the two methods in the flat areas.Keywords: agree method, morphometric analysis, lower Tapi basin, shortest path method
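Once the stream network has been extracted from either DEM, the morphometric parameters listed above follow from standard formulas. The sketch below computes a few of them from assumed summary values for a single catchment; the numbers are illustrative, not from the Lower Tapi dataset.

```python
import math

# Assumed summary values for one catchment (not from the Lower Tapi dataset)
stream_counts = {1: 58, 2: 14, 3: 4, 4: 1}             # number of streams per order
stream_lengths = {1: 96.0, 2: 41.0, 3: 18.0, 4: 9.0}   # total stream length per order (km)
area, perimeter, basin_length = 210.0, 78.0, 24.0      # km^2, km, km
relief = 0.35                                          # km (max minus min elevation)

orders = sorted(stream_counts)
bifurcation = [stream_counts[u] / stream_counts[u + 1] for u in orders[:-1]]
total_length = sum(stream_lengths.values())

drainage_density = total_length / area                       # Dd = sum(L) / A
stream_frequency = sum(stream_counts.values()) / area        # Fs = sum(N) / A
form_factor = area / basin_length ** 2                       # Ff = A / Lb^2
circularity = 4 * math.pi * area / perimeter ** 2            # Rc = 4*pi*A / P^2
elongation = (2 / basin_length) * math.sqrt(area / math.pi)  # Re = 2*sqrt(A/pi) / Lb
overland_flow = 1 / (2 * drainage_density)                   # Lg = 1 / (2*Dd)
relief_ratio = relief / basin_length                         # Rh = H / Lb

print("mean bifurcation ratio:", round(sum(bifurcation) / len(bifurcation), 2))
print("drainage density (km/km^2):", round(drainage_density, 2))
print("stream frequency (per km^2):", round(stream_frequency, 2))
print("form factor:", round(form_factor, 2), "| circularity:", round(circularity, 2),
      "| elongation:", round(elongation, 2))
print("length of overland flow (km):", round(overland_flow, 2),
      "| relief ratio:", round(relief_ratio, 3))
```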
Procedia PDF Downloads 237
731 Simulation and Analysis of Passive Parameters of Building in eQuest: A Case Study in Istanbul, Turkey
Authors: Mahdiyeh Zafaranchi
Abstract:
With rapid development of urbanization and improvement of living standards in the world, energy consumption and carbon emissions of the building sector are expected to increase in the near future; because of that, energy-saving issues have become more important among the engineers. Besides, the building sector is a major contributor to energy consumption and carbon emissions. The concept of efficient building appeared as a response to the need for reducing energy demand in this sector which has the main purpose of shifting from standard buildings to low-energy buildings. Although energy-saving should happen in all steps of a building during the life cycle (material production, construction, demolition), the main concept of efficient energy building is saving energy during the life expectancy of a building by using passive and active systems, and should not sacrifice comfort and quality to reach these goals. The main aim of this study is to investigate passive strategies (do not need energy consumption or use renewable energy) to achieve energy-efficient buildings. Energy retrofit measures were explored by eQuest software using a case study as a base model. The study investigates predictive accuracy for the major factors like thermal transmittance (U-value) of the material, windows, shading devices, thermal insulation, rate of the exposed envelope, window/wall ration, lighting system in the energy consumption of the building. The base model was located in Istanbul, Turkey. The impact of eight passive parameters on energy consumption had been indicated. After analyzing the base model by eQuest, a final scenario was suggested which had a good energy performance. The results showed a decrease in the U-values of materials, the rate of exposing buildings, and windows had a significant effect on energy consumption. Finally, savings in electric consumption of about 10.5%, and gas consumption by about 8.37% in the suggested model were achieved annually.Keywords: efficient building, electric and gas consumption, eQuest, Passive parameters
Procedia PDF Downloads 110
730 Artificial Intelligence Based Method in Identifying Tumour Infiltrating Lymphocytes of Triple Negative Breast Cancer
Authors: Nurkhairul Bariyah Baharun, Afzan Adam, Reena Rahayu Md Zin
Abstract:
Tumor microenvironment (TME) in breast cancer is mainly composed of cancer cells, immune cells, and stromal cells. The interaction between cancer cells and their microenvironment plays an important role in tumor development, progression, and treatment response. The TME in breast cancer includes tumor-infiltrating lymphocytes (TILs) that are implicated in killing tumor cells. TILs can be found in tumor stroma (sTILs) and within the tumor (iTILs). TILs in triple negative breast cancer (TNBC) have been demonstrated to have prognostic and potentially predictive value. The international Immune-Oncology Biomarker Working Group (TIL-WG) had developed a guideline focus on the assessment of sTILs using hematoxylin and eosin (H&E)-stained slides. According to the guideline, the pathologists use “eye balling” method on the H&E stained- slide for sTILs assessment. This method has low precision, poor interobserver reproducibility, and is time-consuming for a comprehensive evaluation, besides only counted sTILs in their assessment. The TIL-WG has therefore recommended that any algorithm for computational assessment of TILs utilizing the guidelines provided to overcome the limitations of manual assessment, thus providing highly accurate and reliable TILs detection and classification for reproducible and quantitative measurement. This study is carried out to develop a TNBC digital whole slide image (WSI) dataset from H&E-stained slides and IHC (CD4+ and CD8+) stained slides. TNBC cases were retrieved from the database of the Department of Pathology, Hospital Canselor Tuanku Muhriz (HCTM). TNBC cases diagnosed between the year 2010 and 2021 with no history of other cancer and available block tissue were included in the study (n=58). Tissue blocks were sectioned approximately 4 µm for H&E and IHC stain. The H&E staining was performed according to a well-established protocol. Indirect IHC stain was also performed on the tissue sections using protocol from Diagnostic BioSystems PolyVue™ Plus Kit, USA. The slides were stained with rabbit monoclonal, CD8 antibody (SP16) and Rabbit monoclonal, CD4 antibody (EP204). The selected and quality-checked slides were then scanned using a high-resolution whole slide scanner (Pannoramic DESK II DW- slide scanner) to digitalize the tissue image with a pixel resolution of 20x magnification. A manual TILs (sTILs and iTILs) assessment was then carried out by the appointed pathologist (2 pathologists) for manual TILs scoring from the digital WSIs following the guideline developed by TIL-WG 2014, and the result displayed as the percentage of sTILs and iTILs per mm² stromal and tumour area on the tissue. Following this, we aimed to develop an automated digital image scoring framework that incorporates key elements of manual guidelines (including both sTILs and iTILs) using manually annotated data for robust and objective quantification of TILs in TNBC. From the study, we have developed a digital dataset of TNBC H&E and IHC (CD4+ and CD8+) stained slides. We hope that an automated based scoring method can provide quantitative and interpretable TILs scoring, which correlates with the manual pathologist-derived sTILs and iTILs scoring and thus has potential prognostic implications.Keywords: automated quantification, digital pathology, triple negative breast cancer, tumour infiltrating lymphocytes
Procedia PDF Downloads 114729 Differential Approach to Technology Aided English Language Teaching: A Case Study in a Multilingual Setting
Authors: Sweta Sinha
Abstract:
Rapid evolution of technology has changed language pedagogy as well as perspectives on language use, leading to strategic changes in discourse studies. We are now firmly embedded in a time when digital technologies have become an integral part of our daily lives. This has led to generalized approaches to English Language Teaching (ELT), which raise two concerns in linguistically diverse settings: a) the diverse linguistic background of the learner might interfere with the learning process, and b) the differential level of already acquired knowledge of the target language might make classroom practices too easy or too difficult for the target group of learners. ELT needs a more systematic and differential pedagogical approach for greater efficiency and accuracy. The present research analyses the need for identifying learner groups with different levels of target language proficiency, based on a longitudinal study of 150 undergraduate students. The learners were divided into five groups based on their performance on a twenty-point scale in Listening, Speaking, Reading and Writing (LSRW). The groups were then subjected to varying durations of technology-aided language learning sessions, and their performance was recorded again on the same scale. Identifying groups and introducing differential teaching and learning strategies led to better results than generalized teaching strategies. Language teaching includes different aspects: the organizational, the technological, the sociological, the psychological, the pedagogical and the linguistic, and a facilitator must account for all of these in a carefully devised differential approach that meets the challenge of learner diversity. Apart from justifying the formation of differential groups, the paper attempts to devise a framework accounting for all these aspects in order to make ELT in a multilingual setting much more effective.Keywords: differential groups, English language teaching, language pedagogy, multilingualism, technology aided language learning
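The grouping step described above is essentially a binning of learners by their total LSRW score. The sketch below shows one way this could be done; the five score bands and the group labels are assumptions for illustration, since the abstract does not report the actual cut-offs used.

```python
# A minimal sketch of forming proficiency groups from LSRW scores.
# The band edges below are hypothetical; the study only states that learners
# were split into five groups on a twenty-point LSRW scale.
from bisect import bisect_right

GROUP_BOUNDS = [4, 8, 12, 16]             # assumed band edges on a 0-20 scale
GROUP_LABELS = ["A", "B", "C", "D", "E"]  # A = lowest proficiency band

def assign_group(lsrw_score: float) -> str:
    """Map a total LSRW score (0-20) to one of five proficiency groups."""
    return GROUP_LABELS[bisect_right(GROUP_BOUNDS, lsrw_score)]

scores = {"student_01": 7.5, "student_02": 18.0, "student_03": 11.0}
groups = {name: assign_group(s) for name, s in scores.items()}
print(groups)   # {'student_01': 'B', 'student_02': 'E', 'student_03': 'C'}
```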
Procedia PDF Downloads 390728 Electrochemical Biosensor for the Detection of Botrytis spp. in Temperate Legume Crops
Authors: Marzia Bilkiss, Muhammad J. A. Shiddiky, Mostafa K. Masud, Prabhakaran Sambasivam, Ido Bar, Jeremy Brownlie, Rebecca Ford
Abstract:
Early diagnosis and quantitation of the causal pathogen species would be a significant step forward for Integrated Disease Management (IDM), enabling accurate and timely disease control and thereby preventing yield loss. This could significantly reduce costs to growers and reduce any flow-on impacts on the environment from excessive chemical spraying. The necrotrophic fungal disease botrytis grey mould, caused by Botrytis cinerea and Botrytis fabae, significantly reduces temperate legume yield and grain quality under favourable environmental conditions in Australia and worldwide. Several immunogenic and molecular probe-type protocols have been developed for their diagnosis, but these have varying levels of species specificity, sensitivity, and consequent usefulness within the paddock. To substantially improve speed, accuracy, and sensitivity, advanced nanoparticle-based biosensor approaches have been developed. For this, two sets of primers were designed, one each for Botrytis cinerea and Botrytis fabae, which showed species specificity with an initial sensitivity of two genomic copies/µl in pure fungal backgrounds using multiplexed quantitative PCR. During further validation, quantitative PCR detected 100 spores on artificially infected legume leaves. Simultaneously, an electro-catalytic assay was developed for both target fungal DNAs using functionalised magnetic nanoparticles. This was extremely sensitive, able to detect a single spore within a raw total plant nucleic acid extract background. We believe that the translation of this technology to the field will enable quantitative assessment of pathogen load for future accurate decision support of informed botrytis grey mould management.Keywords: biosensor, botrytis grey mould, sensitive, species specific
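For readers unfamiliar with how qPCR sensitivity figures such as "two genomic copies/µl" are arrived at, the sketch below shows the standard-curve arithmetic commonly used to convert a quantification cycle (Cq) into a copy-number estimate. The slope, intercept, and Cq values are illustrative assumptions, not figures from this study.

```python
# Standard-curve quantification: Cq = slope * log10(copies) + intercept.
# Slope and intercept are hypothetical values of the kind obtained by running
# a dilution series of known template copy numbers.
SLOPE = -3.32        # ~ -3.32 corresponds to ~100% amplification efficiency
INTERCEPT = 37.5     # Cq expected for a single copy of target DNA

def copies_from_cq(cq: float) -> float:
    """Estimate template copy number from an observed Cq value."""
    return 10 ** ((cq - INTERCEPT) / SLOPE)

def efficiency(slope: float) -> float:
    """Amplification efficiency implied by the standard-curve slope."""
    return 10 ** (-1.0 / slope) - 1.0

for cq in (36.5, 30.1, 24.8):   # hypothetical reactions
    print(f"Cq {cq:>5.1f} -> ~{copies_from_cq(cq):,.0f} copies "
          f"(efficiency {efficiency(SLOPE):.0%})")
```

Under these assumed curve parameters, a Cq near 36.5 corresponds to roughly two template copies, which is the order of sensitivity the abstract reports for the multiplexed assay.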
Procedia PDF Downloads 172727 Integration of Big Data to Predict Transportation for Smart Cities
Authors: Sun-Young Jang, Sung-Ah Kim, Dongyoun Shin
Abstract:
The intelligent transportation system is essential to building smarter cities. Machine learning-based transportation prediction is a highly promising approach, making otherwise invisible aspects of the network visible. In this context, this research aims to build a prototype model that predicts the transportation network by using big data and machine learning technology. Among urban transportation systems, this research focuses on the bus system. The research problem is that the existing headway model cannot respond to dynamic transportation conditions, so bus delays often occur. To overcome this problem, a prediction model is presented that finds patterns of bus delay by applying machine learning to the following data sets: traffic, weather, and bus status. This research presents a flexible headway model to predict bus delay and analyses the result. The prototype model is built from real-time bus data. The data are gathered through public data portals and real-time Application Program Interfaces (APIs) provided by the government. These data are the fundamental resources for organizing interval-pattern models of bus operations from traffic environment factors (road speeds, station conditions, weather, and real-time operating information of buses). The prototype model was designed with a machine learning tool (RapidMiner Studio), and tests were conducted for bus delay prediction. This research presents experiments to increase prediction accuracy for bus headway by analyzing urban big data. Big data analysis is important for predicting the future and finding correlations by processing huge amounts of data. Therefore, based on the analysis method, this research represents an effective use of machine learning and urban big data to understand urban dynamics.Keywords: big data, machine learning, smart city, social cost, transportation network
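The abstract builds its model in RapidMiner Studio; as a rough illustration of the same idea, the sketch below trains a regression model on traffic, weather, and bus-status features to predict delay in minutes. The feature names, toy data, and choice of gradient boosting are assumptions for illustration, not the study's actual pipeline.

```python
import pandas as pd
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_absolute_error

# Toy records standing in for the joined real-time feeds (road speed, weather,
# station load, scheduled headway); column names are hypothetical.
data = pd.DataFrame({
    "road_speed_kmh":        [42, 18, 35, 12, 50, 22, 30, 15],
    "rain_mm_per_h":         [0.0, 4.2, 0.0, 7.5, 0.0, 1.1, 0.3, 5.0],
    "passengers_at_stop":    [8, 25, 12, 30, 5, 20, 15, 28],
    "scheduled_headway_min": [10, 10, 15, 15, 10, 10, 15, 15],
    "delay_min":             [1.0, 6.5, 2.0, 9.0, 0.5, 5.0, 3.0, 8.0],
})

X = data.drop(columns="delay_min")
y = data["delay_min"]
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0)

# Fit a simple nonlinear regressor and report error on held-out trips.
model = GradientBoostingRegressor(random_state=0).fit(X_train, y_train)
pred = model.predict(X_test)
print(f"MAE on held-out trips: {mean_absolute_error(y_test, pred):.2f} min")
```

A flexible headway policy could then adjust dispatch intervals whenever the predicted delay for upcoming trips exceeds a chosen threshold.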
Procedia PDF Downloads 260726 The Importance of Artificial Intelligence in Various Healthcare Applications
Authors: Joshna Rani S., Ahmadi Banu
Abstract:
Artificial intelligence (AI) has a significant role to play in the healthcare offerings of the future. AI is the key capability behind the development of precision medicine, widely agreed to be a sorely needed advance in care. Although early efforts at providing diagnosis and treatment recommendations have proven challenging, we expect that AI will ultimately master that domain as well. Given the rapid advances in AI for imaging analysis, it seems likely that most radiology and pathology images will eventually be examined by a machine. Speech and text recognition are already used for tasks such as patient communication and capture of clinical notes, and their use will increase. The greatest challenge to AI in these healthcare domains is not whether the technologies will be capable enough to be useful, but rather ensuring their adoption in daily clinical practice. For widespread adoption to take place, AI systems must be approved by regulators, integrated with EHR systems, standardized to a sufficient degree that similar products work alike, taught to clinicians, paid for by public or private payer organizations, and updated over time in the field. These challenges will ultimately be overcome, but doing so will take much longer than it will take for the technologies themselves to mature. As a result, we expect to see limited use of AI in clinical practice within 5 years and more extensive use within 10 years. It also seems increasingly clear that AI systems will not replace human clinicians on a large scale, but rather will augment their efforts to care for patients. Over time, human clinicians may move toward tasks and job designs that draw on uniquely human skills such as empathy, persuasion, and big-picture integration. Perhaps the only healthcare providers who will risk their careers over time are those who refuse to work alongside AI.Keywords: artificial intelligence, health care, breast cancer, AI applications
Procedia PDF Downloads 181