Search results for: data combining
24840 A Data Mining Approach for Analysing and Predicting the Bank's Asset Liability Management Based on Basel III Norms
Authors: Nidhin Dani Abraham, T. K. Sri Shilpa
Abstract:
Asset liability management is an important aspect of the banking business. Moreover, today's banking is governed by Basel III, which strictly regulates counterparty default. This paper focuses on the prediction and analysis of counterparty default risk, a type of risk that occurs when customers fail to repay the amount owed to the lender (a bank or any financial institution). This paper proposes an approach to reduce the counterparty risk occurring in financial institutions using an appropriate data mining technique and thus predicts the occurrence of non-performing assets (NPAs). It also helps in building asset quality and restructuring. Liability management is very important in carrying out banking business. To understand and analyze the depth of a bank's liabilities, a suitable technique is required; here, a data mining technique is used to predict the dormant behaviour of various deposit customers. Various models are implemented, and the results for savings deposit customers are analyzed. All these data are drawn from the bank's data warehouse and cleaned using a data cleansing approach.
Keywords: data mining, asset liability management, Basel III, banking
Procedia PDF Downloads 550
24839 A Dynamic Ensemble Learning Approach for Online Anomaly Detection in Alibaba Datacenters
Authors: Wanyi Zhu, Xia Ming, Huafeng Wang, Junda Chen, Lu Liu, Jiangwei Jiang, Guohua Liu
Abstract:
Anomaly detection is the first and imperative step needed to respond to unexpected problems and to assure high performance and security in large data center management. This paper presents an online anomaly detection system built on an innovative combination of ensemble machine learning and adaptive differentiation algorithms, applied to performance data collected from a continuous monitoring system for multi-tier web applications running in Alibaba data centers. We evaluate the effectiveness and efficiency of this algorithm with production traffic data and compare it with traditional anomaly detection approaches such as static thresholds and other deviation-based detection techniques. The experimental results show that our algorithm correctly identifies the unexpected performance variances of any running application, with an acceptable false positive rate. This approach has already been deployed in real-time production environments to enhance the efficiency and stability of daily data center operations.
Keywords: Alibaba data centers, anomaly detection, big data computation, dynamic ensemble learning
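The abstract does not disclose the exact base detectors or the adaptive differentiation step, so the following is only a hedged sketch of the general idea: two simple detectors (a static threshold and a rolling z-score) vote on each sample, and their weights adapt online from feedback. All class names, thresholds, and the weight-update rule are illustrative assumptions, not the deployed Alibaba system.

```python
import numpy as np

class DynamicEnsembleDetector:
    """Toy online ensemble: weighted vote of two base detectors,
    with weights adapted from feedback (illustrative only)."""

    def __init__(self, window=50, z_thresh=3.0, static_thresh=0.9, lr=0.1):
        self.window, self.z_thresh = window, z_thresh
        self.static_thresh, self.lr = static_thresh, lr
        self.history = []
        self.weights = np.array([0.5, 0.5])  # [static, z-score]
        self._last_votes = np.zeros(2)

    def score(self, x):
        static_vote = 1.0 if x > self.static_thresh else 0.0
        z_vote = 0.0
        if len(self.history) >= self.window:
            recent = np.array(self.history[-self.window:])
            z = abs(x - recent.mean()) / (recent.std() + 1e-9)
            z_vote = 1.0 if z > self.z_thresh else 0.0
        self._last_votes = np.array([static_vote, z_vote])
        self.history.append(x)
        return float(self.weights @ self._last_votes)  # >0.5 => anomaly

    def feedback(self, is_anomaly):
        # Shift weight toward whichever detector agreed with the label.
        agreement = 1.0 - np.abs(self._last_votes - float(is_anomaly))
        self.weights = (1 - self.lr) * self.weights + self.lr * agreement
        self.weights /= self.weights.sum()

# Usage on synthetic latency data with one injected spike at the end.
rng = np.random.default_rng(0)
det = DynamicEnsembleDetector()
for t, x in enumerate(np.r_[rng.normal(0.5, 0.05, 200), [1.5]]):
    if det.score(x) > 0.5:
        print(f"t={t}: anomaly flagged at value {x:.2f}")
```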
Procedia PDF Downloads 198
24838 Formulation and Evaluation of Silver Nanoparticles as Drug Carrier for Cancer Therapy
Authors: Abdelhadi Adam Salih Denei
Abstract:
Silver nanoparticles (AgNPs) have been used in cancer therapy, and the area of nanomedicine has made unprecedented strides in recent years. A thorough summary of the development and assessment of AgNPs for their possible use in the fight against cancer is the goal of this review. Targeted delivery methods have been designed to optimise therapeutic efficacy by using AgNPs' distinct physicochemical features, such as their size, shape, and surface chemistry. Firstly, the study provides an overview of the several synthesis routes, both chemical and green, that are used to create AgNPs. Green synthesis techniques, which use natural extracts and biomolecules, are becoming increasingly popular since they are biocompatible and environmentally benign. Next, it describes how synthesis parameters affect the physicochemical properties of AgNPs, emphasising how crucial it is to tune these parameters for particular therapeutic uses. An extensive analysis is conducted on the anticancer potential of AgNPs, emphasising their capacity to trigger apoptosis, impede angiogenesis, and alter cellular signalling pathways. The analysis also investigates the potential benefits of combining AgNPs with currently used cancer treatment techniques, including radiation and chemotherapy. AgNPs' safety profile for use in clinical settings is clarified by a comprehensive evaluation of their cytotoxicity and biocompatibility.
Keywords: silver nanoparticles, cancer, nanocarrier system, targeted delivery
Procedia PDF Downloads 62
24837 Experimental Study of the Electrical Conductivity and Thermal Conductivity Property of Micro-based Al-Cu-Nb-Mo Alloy
Abstract:
Aluminum-based alloys with a certain compositional blend and manufacturing method have been reported to be excellent electrical conductors. In the current investigation, metal powders of aluminum (Al), copper (Cu), niobium (Nb), and molybdenum (Mo) were weighed in accordance with certain ratios and blended uniformly by mixing the powder particles. The metal particles were mixed using a tube mixer for 12 hours. The thermal conductivity of the mixed metal powders was evaluated using a portable Thermtest device before the mixture was poured into a 30 mm-diameter graphite mold, pre-pressed, and placed into a spark plasma sintering (SPS) furnace. An axial pressure of 50 MPa was applied at a heating rate of 50 °C/min, and a multi-stage heating procedure with a holding period of 10 min was used to sinter at temperatures between 300 °C and 480 °C. After being cooled to room temperature, the specimens were unmolded to produce the aluminum-copper-niobium-molybdenum alloy material. The HPS 2662 Precision Four-point Probe Meter was used to determine the electrical resistivity, and the values were used to calculate the electrical conductivity of the sintered alloy samples. Finally, the alloy with the best electrical and thermal conductivity was the one with the composition Al93.5Cu4Nb1.5Mo1, which also had a density of 3.23 g/cm3. It could be recommended for use in automobile radiator and electric transmission line components.
Keywords: Al-Cu-Nb-Mo, electrical conductivity, alloy, sintering, thermal conductivity
Procedia PDF Downloads 89
24836 Vaccine Development for Newcastle Disease Virus in Poultry
Authors: Muhammad Asif Rasheed
Abstract:
Newcastle disease virus (NDV), an avian orthoavulavirus, is the causative agent of Newcastle disease and can even cause epidemics when the disease is not controlled. Several vaccines based on attenuated and inactivated viruses have previously been reported, but they are rendered less effective over time due to changes in the viral genome. Therefore, we aimed to develop an effective multi-epitope vaccine against the haemagglutinin-neuraminidase (HN) protein of 26 NDV strains from Pakistan through modern immunoinformatic approaches. As a result, a vaccine chimaera was constructed by combining T-cell and B-cell epitopes with appropriate linkers and an adjuvant. The designed vaccine was highly immunogenic, non-allergenic, and antigenic; therefore, the 3D structure of the multi-epitope vaccine was constructed, refined, and validated. A molecular docking study of the multi-epitope vaccine candidate with the chicken Toll-like receptor 4 indicated successful binding. An in silico immunological simulation was used to evaluate the candidate vaccine's ability to elicit an effective immune response. According to the computational studies, the proposed multi-epitope vaccine is physically stable and may induce immune responses, suggesting it is a strong candidate against the 26 Newcastle disease virus strains from Pakistan. A wet-lab study is in progress to confirm the results.
Keywords: epitopes, Newcastle disease virus, paramyxovirus, vaccine
Procedia PDF Downloads 117
24835 Unsupervised Text Mining Approach to Early Warning System
Authors: Ichihan Tai, Bill Olson, Paul Blessner
Abstract:
Traditional early warning systems that alarm against crises are generally based on structured or numerical data; therefore, a system that can make predictions based on unstructured textual data, an uncorrelated data source, is a great complement to them. The Chicago Board Options Exchange (CBOE) Volatility Index (VIX), commonly referred to as the fear index, measures the cost of insurance against a market crash and spikes in the event of a crisis. In this study, news data are consumed to predict whether there will be a market-wide crisis by predicting the movement of the fear index, and historical references to similar events are presented in an unsupervised manner. Topic modeling-based prediction and representation are made using daily news data between 1990 and 2015 from The Wall Street Journal against VIX index data from the CBOE.
Keywords: early warning system, knowledge management, market prediction, topic modeling
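The abstract names topic modeling without fixing the exact model, so the following is a hedged sketch assuming an LDA-style pipeline: topics are learned from news text without supervision, and the resulting topic mixtures feed a classifier that predicts fear-index spikes. The corpus, the labels, and the two-topic setting are invented for illustration.

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation
from sklearn.linear_model import LogisticRegression

# Tiny stand-in corpus; the study used daily WSJ news, 1990-2015.
headlines = [
    "bank fails amid credit crisis and market panic",
    "tech stocks rally on strong earnings report",
    "housing market collapse triggers recession fears",
    "central bank cuts rates to calm volatile markets",
    "merger boosts shares as investors cheer deal",
    "sovereign debt default risk rattles global markets",
]
vix_spike = [1, 0, 1, 1, 0, 1]  # illustrative next-day VIX movement labels

vec = CountVectorizer(stop_words="english")
X = vec.fit_transform(headlines)

# Unsupervised step: learn topics from the news text alone.
lda = LatentDirichletAllocation(n_components=2, random_state=0)
topic_mix = lda.fit_transform(X)  # per-document topic proportions

# Supervised step: predict fear-index movement from the topic mixtures,
# while the topics themselves provide the historical references.
clf = LogisticRegression().fit(topic_mix, vix_spike)
new = vec.transform(["debt crisis deepens as markets panic"])
print("crisis predicted:", bool(clf.predict(lda.transform(new))[0]))
```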
Procedia PDF Downloads 335
24834 The Role of Synthetic Data in Aerial Object Detection
Authors: Ava Dodd, Jonathan Adams
Abstract:
The purpose of this study is to explore the characteristics of developing a machine learning application using synthetic data. The study is structured to develop the application with the aim of deploying the computer vision model. The findings discuss the realities of attempting to develop a computer vision model for a practical purpose and detail the processes, tools, and techniques that were used to meet the accuracy requirements. The research reveals that synthetic data represents another variable that can be adjusted to improve the performance of a computer vision model. Further, a suite of tools and tuning recommendations is provided.
Keywords: computer vision, machine learning, synthetic data, YOLOv4
Procedia PDF Downloads 221
24833 Perception-Oriented Model Driven Development for Designing Data Acquisition Process in Wireless Sensor Networks
Authors: K. Indra Gandhi
Abstract:
Wireless Sensor Networks (WSNs) have always been characterized by application-specific sensing, relaying, and collection of information for further analysis. However, software development was not considered a separate entity in this process of data collection, which has posed severe limitations on software development for WSNs. Software development for WSNs is a complex process since the components involved are data-driven, network-driven, and application-driven in nature. This implies a tremendous need for separation of concerns from the software development perspective. A layered approach for developing the data acquisition design based on Model Driven Development (MDD) is proposed, as the sensed data collection process itself varies depending upon the application under consideration. This work focuses on the layered view of the data acquisition process so as to ease software development. A metamodel is proposed that enables reusability and realization of the software as an adaptable component for WSN systems. Further, observed user perception indicates that the proposed model helps improve the programmer's productivity by realizing the collaborative system involved.
Keywords: data acquisition, model-driven development, separation of concern, wireless sensor networks
Procedia PDF Downloads 434
24832 Design and Implementation of Image Super-Resolution for Myocardial Image
Authors: M. V. Chidananda Murthy, M. Z. Kurian, H. S. Guruprasad
Abstract:
Super-resolution is the technique of intelligently upscaling images while avoiding artifacts or blurring, and it deals with the recovery of a high-resolution image from one or more low-resolution images. Single-image super-resolution is a process of obtaining a high-resolution image from a set of low-resolution observations by signal processing. While super-resolution has been demonstrated to improve image quality in scaled-down images in the image domain, its effect on Fourier-based techniques remains unknown. Super-resolution substantially improved the spatial resolution of patient LGE images by sharpening the edges of the heart and the scar. This paper investigates the effects of single-image super-resolution on Fourier-based and image-based methods of scale-up. First, a training phase uses pairs of low-resolution and high-resolution images to obtain a dictionary. In the test phase, patches are generated from the low-resolution input, together with the difference between the high-resolution image and the image interpolated from the low-resolution input. Next, a simulated image is obtained by applying a convolution method to the dictionary image and the extracted patches. Finally, the super-resolution image is obtained by combining the fused image with the difference between the high-resolution and interpolated images. Super-resolution reduces image errors and improves image quality.
Keywords: image dictionary creation, image super-resolution, LGE images, patch extraction
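The pipeline above is only outlined, so the sketch below is a minimal, hedged reconstruction of dictionary-based single-image super-resolution: training pairs each interpolated low-resolution patch with the high-frequency residual it lost, and the test phase adds back the residual of the nearest dictionary patch. Random arrays stand in for the LGE images, and the patch size, scale factor, and nearest-neighbour lookup are assumptions rather than the paper's exact method.

```python
import numpy as np
from scipy.ndimage import zoom
from sklearn.feature_extraction.image import (
    extract_patches_2d, reconstruct_from_patches_2d)
from sklearn.neighbors import NearestNeighbors

PATCH, SCALE = (5, 5), 2

def degrade(hr):
    # Simulate a low-resolution observation, then interpolate back up.
    return zoom(zoom(hr, 1 / SCALE), SCALE)

# Training phase: pair each interpolated-LR patch with the HR detail it lost.
rng = np.random.default_rng(0)
hr_train = rng.random((64, 64))          # stand-in for a training LGE image
lr_up = degrade(hr_train)
lr_patches = extract_patches_2d(lr_up, PATCH).reshape(-1, 25)
residual_patches = extract_patches_2d(hr_train - lr_up, PATCH)
dictionary = NearestNeighbors(n_neighbors=1).fit(lr_patches)

# Test phase: look up each patch's nearest neighbour and add its residual.
hr_test = rng.random((64, 64))           # stand-in for a test LGE image
test_up = degrade(hr_test)
test_patches = extract_patches_2d(test_up, PATCH).reshape(-1, 25)
_, idx = dictionary.kneighbors(test_patches)
detail = reconstruct_from_patches_2d(residual_patches[idx.ravel()], test_up.shape)
sr = test_up + detail                    # super-resolved estimate

# Smoke test only: with real images (not noise) SR should beat interpolation.
print("interpolation error:", np.abs(hr_test - test_up).mean().round(4))
print("super-resolution error:", np.abs(hr_test - sr).mean().round(4))
```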
Procedia PDF Downloads 373
24831 Comparative Analysis of Data Gathering Protocols with Multiple Mobile Elements for Wireless Sensor Network
Authors: Bhat Geetalaxmi Jairam, D. V. Ashoka
Abstract:
Wireless sensor networks are used in many applications to collect sensed data from different sources. Sensed data have to be delivered through the sensors' wireless interface using multi-hop communication towards the sink. Data collection in wireless sensor networks consumes energy, and energy consumption is the major constraint in WSNs. Reducing energy consumption while increasing the amount of generated data is a great challenge. In this paper, we have implemented two data gathering protocols with multiple mobile sinks/elements to collect data from sensor nodes. The first is Energy-Efficient Data Gathering with Tour Length-Constrained Mobile Elements in Wireless Sensor Networks (EEDG), in which mobile sinks use a vehicle routing protocol to collect data. The second is An Intelligent Agent-based Routing Structure for Mobile Sinks in WSNs (IAR), in which mobile sinks use Prim's algorithm to collect data. The authors have implemented the concepts common to both protocols, such as deployment of mobile sinks, generation of the visiting schedule, and collection of data from the cluster members, and have compared the performance of both protocols using statistics on performance parameters such as delay, packet drop, packet delivery ratio, energy available, and control overhead. The paper concludes that EEDG is more efficient than the IAR protocol, but with a few limitations that remain unaddressed, such as redundancy removal, idle listening, and the mobile sink's pause/wait state at a node. In future work, we plan to concentrate on these limitations to arrive at a new energy-efficient protocol that will help improve the lifetime of the WSN.
Keywords: aggregation, consumption, data gathering, efficiency
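Since IAR builds the mobile sinks' routing structure with Prim's algorithm, a minimal sketch of that step is shown below: Prim's algorithm grows a minimum spanning tree over the cluster-head positions, which a sink can then traverse. The coordinates and the complete Euclidean graph are illustrative assumptions; the protocols' actual tour construction involves further constraints.

```python
import heapq
import math

def prim_mst(points):
    """Prim's algorithm over the complete Euclidean graph of node positions.
    Returns the MST edges a mobile sink can follow when visiting clusters."""
    n, visited, edges = len(points), {0}, []
    heap = [(math.dist(points[0], points[j]), 0, j) for j in range(1, n)]
    heapq.heapify(heap)
    while len(visited) < n:
        w, i, j = heapq.heappop(heap)
        if j in visited:
            continue  # a shorter edge already reached this node
        visited.add(j)
        edges.append((i, j, round(w, 1)))
        for k in range(n):
            if k not in visited:
                heapq.heappush(heap, (math.dist(points[j], points[k]), j, k))
    return edges

# Illustrative cluster-head coordinates (metres) in the sensor field.
cluster_heads = [(0, 0), (40, 10), (15, 35), (60, 40), (30, 60)]
for i, j, w in prim_mst(cluster_heads):
    print(f"visit link {i} -> {j} (length {w} m)")
```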
Procedia PDF Downloads 497
24830 Capacity for Care: A Management Model for Increasing Animal Live Release Rates, Reducing Animal Intake and Euthanasia Rates in an Australian Open Admission Animal Shelter
Authors: Ann Enright
Abstract:
More than ever, animal shelters need to identify ways to reduce the number of animals entering shelter facilities and the incidence of euthanasia. Managing animal overpopulation using euthanasia can have detrimental health and emotional consequences for the shelter staff involved. There are also community expectations with moral and financial implications to consider. To achieve the goals of reducing animal intake and the incidence of euthanasia, shelter best practice involves combining programs, procedures and partnerships to increase live release rates (LRR) and reduce the incidence of disease, length of stay (LOS) and shelter intake, whilst overall remaining financially viable. Analysing daily processes, tracking outcomes and implementing simple strategies enabled shelter staff to focus their efforts more effectively and achieve remarkable results. The objective of this retrospective study was to assess the effect of implementing the capacity for care (C4C) management model. Data focusing on the average daily number of animals on site for a two-year period (2016-2017) were exported from a shelter management system, Customer Logic (CL) Vet, to Excel for manipulation and comparison. Following the implementation of C4C practices, the average daily number of animals on site was reduced by more than 50% (2016 average 103 compared to 2017 average 49), average LOS was reduced by 50% from 8 weeks to 4 weeks, and the incidence of disease fell from ≥70% to less than 2% of the cats on site at the completion of the study. The total number of stray cats entering the shelter due to council contracts fell by 50% (486 to 248). Improved cat outcomes were attributed to strategies that increased adoptions and reduced euthanasia of poorly socialized cats, including foster programs. To continue to achieve improvements in LRR and LOS, strategies to decrease intake further would be beneficial, for example, targeted sterilisation programs. In conclusion, the study highlighted the benefits of using C4C as a management tool, delivering a significant reduction in animal intake and euthanasia with positive emotional, financial and community outcomes.
Keywords: animal welfare, capacity for care, cat, euthanasia, length of stay, managed intake, shelter
Procedia PDF Downloads 137
24829 Status and Results from EXO-200
Authors: Ryan Maclellan
Abstract:
EXO-200 has provided one of the most sensitive searches for neutrinoless double-beta decay, utilizing 175 kg of enriched liquid xenon in an ultra-low background time projection chamber. This detector has demonstrated excellent energy resolution and background rejection capabilities. Using the first two years of data, EXO-200 has set a limit of 1.1×10^25 years at 90% C.L. on the neutrinoless double-beta decay half-life of Xe-136. The experiment has experienced a brief hiatus in data taking during a temporary shutdown of its host facility, the Waste Isolation Pilot Plant. EXO-200 expects to resume data taking in earnest this fall with upgraded detector electronics. Results from the analysis of EXO-200 data and an update on the current status of EXO-200 will be presented.
Keywords: double-beta, Majorana, neutrino, neutrinoless
Procedia PDF Downloads 413
24828 Combined Optical Coherence Microscopy and Spectrally Resolved Multiphoton Microscopy
Authors: Bjorn-Ole Meyer, Dominik Marti, Peter E. Andersen
Abstract:
A multimodal imaging system, combining spectrally resolved multiphoton microscopy (MPM) and optical coherence microscopy (OCM), is demonstrated. MPM and OCM are commonly integrated into multimodal imaging platforms to combine functional and morphological information. MPM signals, such as two-photon fluorescence emission (TPFE) and signals created by second harmonic generation (SHG), are biomarkers which exhibit information on functional biological features such as the ratio of pyridine nucleotide (NAD(P)H) to flavin adenine dinucleotide (FAD) in the classification of cancerous tissue. While spectrally resolved imaging allows for the study of biomarkers, using a spectrometer as a detector limits the imaging speed of the system significantly. To overcome those limitations, an OCM setup was added to the system, which allows for fast acquisition of structural information. Thus, after rapid imaging of larger specimens, navigation within the sample is possible. Subsequently, distinct features can be selected for further investigation using MPM. Additionally, by probing a different contrast, complementary information is obtained, and different biomarkers can be investigated. OCM images of tissue and cell samples are obtained, and distinctive features are evaluated using MPM to illustrate the benefits of the system.
Keywords: optical coherence microscopy, multiphoton microscopy, multimodal imaging, two-photon fluorescence emission
Procedia PDF Downloads 510
24827 Remaining Useful Life (RUL) Assessment Using Progressive Bearing Degradation Data and ANN Model
Authors: Amit R. Bhende, G. K. Awari
Abstract:
Remaining useful life (RUL) prediction is one of the key technologies for realizing prognostics and health management, which is widely applied in many industrial systems to ensure high availability over their life cycles. The present work proposes a data-driven method of RUL prediction based on multiple health state assessment for rolling element bearings. Bearing degradation data from run to failure at three different conditions are used, and a separate RUL prediction model is built for each condition. Feed-forward back-propagation neural network models are developed for prediction modeling.
Keywords: bearing degradation data, remaining useful life (RUL), back propagation, prognosis
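As a rough illustration of such a data-driven model, the hedged sketch below trains a feed-forward network on synthetic stand-ins for bearing degradation features and regresses a normalised RUL target. The feature definitions, network size, and the use of scikit-learn's MLPRegressor (trained by back-propagation) are assumptions, not the paper's exact setup.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler

# Synthetic stand-ins for run-to-failure features (e.g., vibration RMS and
# kurtosis) under one operating condition; a separate model per condition.
rng = np.random.default_rng(1)
n = 500
life = np.linspace(0, 1, n)                     # normalised elapsed life
rms = 0.2 + life ** 2 + rng.normal(0, 0.02, n)  # degradation trend
kurt = 3 + 4 * life ** 3 + rng.normal(0, 0.1, n)
X = np.column_stack([rms, kurt])
y = 1.0 - life                                  # remaining useful life target

scaler = StandardScaler().fit(X[:400])
# Feed-forward network trained by back-propagation ('adam' here; classic
# gradient descent would be solver='sgd').
model = MLPRegressor(hidden_layer_sizes=(16, 8), max_iter=2000, random_state=0)
model.fit(scaler.transform(X[:400]), y[:400])
pred = model.predict(scaler.transform(X[400:]))
print("predicted RUL near end of life:", pred[-5:].round(3))
```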
Procedia PDF Downloads 434
24826 Spatio-Temporal Data Mining with Association Rules for Lake Van
Authors: Tolga Aydin, M. Fatih Alaeddinoğlu
Abstract:
People, throughout history, have made estimates and inferences about the future by using their past experiences. Developing information technologies and improvements in database management systems make it possible to extract useful information from the knowledge at hand for strategic decisions, and different methods have been developed for this purpose. Data mining by association rule learning is one such method. The Apriori algorithm, one of the well-known association rule learning algorithms, is not commonly used on spatio-temporal data sets. However, it is possible to embed time and space features into the data sets and make the Apriori algorithm a suitable data mining technique for learning spatio-temporal association rules. Lake Van, the largest lake in Turkey, is a closed basin. This feature causes the volume of the lake to increase or decrease as a result of changes in the amount of water it holds. In this study, evaporation, humidity, lake altitude, amount of rainfall, and temperature parameters recorded in the Lake Van region throughout the years are used by the Apriori algorithm, and a spatio-temporal data mining application is developed to identify overflows and newly formed soil regions (underflows) occurring in the coastal parts of Lake Van. Identifying the possible reasons for overflows and underflows may be used to alert experts to take precautions and make the necessary investments.
Keywords: apriori algorithm, association rules, data mining, spatio-temporal data
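To make the embedding concrete, the hedged sketch below runs a minimal Apriori over transactions in which each (month, station) observation is discretized into categorical items. The parameter bins, support threshold, and simplified candidate generation are illustrative assumptions, not the study's configuration.

```python
from itertools import combinations

def apriori(transactions, min_support):
    """Minimal Apriori: returns frequent itemsets with their support."""
    n = len(transactions)
    k_sets = {frozenset([i]) for t in transactions for i in t}
    frequent = {}
    while k_sets:
        counts = {s: sum(s <= t for t in transactions) for s in k_sets}
        current = {s: c / n for s, c in counts.items() if c / n >= min_support}
        frequent.update(current)
        # Join frequent k-itemsets into (k+1)-itemset candidates.
        keys = list(current)
        k_sets = {a | b for a, b in combinations(keys, 2)
                  if len(a | b) == len(a) + 1}
    return frequent

# Each transaction is one (month, station) observation with the parameters
# discretized into items, mirroring how time and space can be embedded.
transactions = [frozenset(t) for t in [
    {"rain=high", "temp=low", "evap=low", "level=overflow"},
    {"rain=high", "temp=low", "evap=low", "level=overflow"},
    {"rain=low", "temp=high", "evap=high", "level=underflow"},
    {"rain=low", "temp=high", "evap=high", "level=underflow"},
    {"rain=high", "temp=low", "evap=high", "level=overflow"},
]]
for itemset, support in apriori(transactions, 0.4).items():
    print(sorted(itemset), round(support, 2))
```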
Procedia PDF Downloads 372
24825 Building Data Infrastructure for Public Use and Informed Decision Making in Developing Countries-Nigeria
Authors: Busayo Fashoto, Abdulhakeem Shaibu, Justice Agbadu, Samuel Aiyeoribe
Abstract:
Data has gone from just rows and columns to being an infrastructure itself. The traditional medium of data infrastructure has been managed by individuals in different industries and saved on personal work tools, such as the laptop. This hinders data sharing and works against Sustainable Development Goal (SDG) 9 on infrastructure sustainability across all countries and regions. However, there has been a constant demand for data across different agencies and ministries by investors and decision-makers. The rapid development and adoption of open-source technologies that promote the collection and processing of data in new ways and in ever-increasing volumes are creating new data infrastructure in sectors such as lands and health, among others. This paper examines the process of developing data infrastructure and, by extension, a data portal to provide baseline data for sustainable development and decision making in Nigeria. It employs the FAIR principles (Findable, Accessible, Interoperable, and Reusable) of data management, using open-source technology tools to develop data portals for public use. eHealth Africa, an organization that uses technology to drive public health interventions in Nigeria, developed a data portal, a typical data infrastructure that serves as a repository for various datasets on administrative boundaries, points of interest, settlements, social infrastructure, amenities, and others. This portal makes it possible for users to access datasets of interest at any point in time at no cost. A skeletal infrastructure of this data portal encompasses open-source technologies such as the Postgres database, GeoServer, GeoNetwork, and CKAN. These tools made the infrastructure sustainable, thus promoting the achievement of SDG 9 (Industry, Innovation, and Infrastructure). As of 6th August 2021, 8,192 user accounts had been created, 2,262 datasets had been downloaded, and 817 maps had been created on the platform. This paper shows the use of rapid development and adoption of technologies that facilitate data collection, processing, and publishing in new ways and in ever-increasing volumes. In addition, the paper is explicit about new data infrastructure in sectors such as health, social amenities, and agriculture. Furthermore, it reveals the importance of cross-sectional data infrastructures for planning and decision making, which in turn can form a central data repository for sustainable development across developing countries.
Keywords: data portal, data infrastructure, open source, sustainability
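Since CKAN underpins the portal, the hedged sketch below shows how a client could query such an instance through CKAN's standard JSON action API. The portal URL and dataset names are placeholders, not eHealth Africa's actual endpoint.

```python
import requests

# Hypothetical portal address; CKAN's action API is real, but the
# eHealth Africa portal's URL and dataset names are assumptions.
BASE = "https://data.example-portal.org.ng"

def search_datasets(query, rows=5):
    """Query a CKAN instance for public datasets matching `query`."""
    resp = requests.get(f"{BASE}/api/3/action/package_search",
                        params={"q": query, "rows": rows}, timeout=30)
    resp.raise_for_status()
    result = resp.json()["result"]
    print(f"{result['count']} datasets match '{query}'")
    return [(pkg["name"], (pkg.get("notes") or "")[:60])
            for pkg in result["results"]]

for name, notes in search_datasets("settlements"):
    print(name, "-", notes)
```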
Procedia PDF Downloads 97
24824 Process Data-Driven Representation of Abnormalities for Efficient Process Control
Authors: Hyun-Woo Cho
Abstract:
Unexpected operational events or abnormalities in industrial processes have a serious impact on the quality of the final product of interest. In terms of statistical process control, fault detection and diagnosis is one of the essential tasks needed to run a process safely. In this work, a nonlinear representation of process measurement data is presented and evaluated using a simulated process. The effect of using different representation methods on diagnosis performance is tested in terms of computational efficiency and data handling. The results show that the nonlinear representation technique produced more reliable diagnosis results and outperformed linear methods. The use of a data filtering step improved computational speed and diagnosis performance on the test data sets. The presented scheme differs from existing ones in that it attempts to extract the fault pattern in the reduced space, not in the original process variable space; this helps reduce the sensitivity of empirical models to noise.
Keywords: fault diagnosis, nonlinear technique, process data, reduced spaces
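The paper does not name its nonlinear representation, so the sketch below uses kernel PCA as one plausible stand-in: normal-operation data define a reduced kernel space, and a simple distance statistic in that space flags faulty samples. The data, kernel choice, and control limit are assumptions for illustration only.

```python
import numpy as np
from sklearn.decomposition import KernelPCA

# Synthetic process data: normal operation plus a faulty batch whose
# sensor means drift (names and magnitudes are illustrative).
rng = np.random.default_rng(0)
normal = rng.normal(size=(200, 10))
fault = rng.normal(size=(50, 10)) + np.linspace(0, 2, 10)

# Nonlinear representation: map measurements into a reduced kernel space.
kpca = KernelPCA(n_components=3, kernel="rbf", gamma=0.1).fit(normal)
scores_normal = kpca.transform(normal)
scores_fault = kpca.transform(fault)

# Simple monitoring statistic in the reduced space (T^2-like distance).
center = scores_normal.mean(axis=0)
spread = scores_normal.std(axis=0)

def t2(scores):
    return (((scores - center) / spread) ** 2).sum(axis=1)

limit = np.percentile(t2(scores_normal), 99)  # empirical control limit
flagged = (t2(scores_fault) > limit).mean()
print("share of faulty samples flagged:", round(float(flagged), 3))
```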
Procedia PDF Downloads 245
24823 Text-to-Speech in Azerbaijani Language via Transfer Learning in a Low Resource Environment
Authors: Dzhavidan Zeinalov, Bugra Sen, Firangiz Aslanova
Abstract:
Most text-to-speech models cannot operate well in low-resource languages and require a great amount of high-quality training data to be considered good enough. Yet, with the improvements made in ASR systems, it is now much easier than ever to collect data for designing custom text-to-speech models. This paper outlines our work on using an ASR model to collect data and build a viable text-to-speech system for one of the leading financial institutions of Azerbaijan. NVIDIA's implementation of the Tacotron 2 model was utilized along with the HiFiGAN vocoder. For training, the model was first trained with high-quality audio data collected from the Internet and then fine-tuned on the bank's single-speaker call center data. The results were evaluated by 50 different listeners and achieved a mean opinion score of 4.17, showing that our method is indeed viable. With this, we have successfully designed the first text-to-speech model in Azerbaijani and publicly shared 12 hours of audiobook data for everyone to use.
Keywords: Azerbaijani language, HiFiGAN, Tacotron 2, text-to-speech, transfer learning, whisper
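For orientation, the hedged sketch below runs inference with NVIDIA's published torch.hub entry points for Tacotron 2. The study pairs Tacotron 2 with HiFiGAN; WaveGlow appears here only because it ships in the same hub repository, so the vocoder choice, the sample text, and the checkpoint handling are assumptions, not the authors' pipeline. Fine-tuning on the bank's call-center audio would resume training from such a checkpoint; this sketch only synthesizes from the pretrained weights and requires a CUDA GPU.

```python
import torch
from scipy.io.wavfile import write

HUB = "NVIDIA/DeepLearningExamples:torchhub"
# Pretrained acoustic model and vocoder from NVIDIA's hub entries.
tacotron2 = torch.hub.load(HUB, "nvidia_tacotron2",
                           model_math="fp32").to("cuda").eval()
vocoder = torch.hub.load(HUB, "nvidia_waveglow",
                         model_math="fp32").to("cuda").eval()
utils = torch.hub.load(HUB, "nvidia_tts_utils")

# Azerbaijani sample, ASCII-folded because the pretrained English model's
# symbol set does not cover Azerbaijani letters (an illustrative shortcut).
text = "Salam, bankimiza xos gelmisiniz."
sequences, lengths = utils.prepare_input_sequence([text])
with torch.no_grad():
    mel, _, _ = tacotron2.infer(sequences, lengths)  # text -> mel spectrogram
    audio = vocoder.infer(mel)                       # mel -> waveform
write("sample.wav", 22050, audio[0].cpu().numpy())
```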
Procedia PDF Downloads 43
24822 Basic Study of Mammographic Image Magnification System with Eye-Detector and Simple EEG Scanner
Authors: Aika Umemuro, Mitsuru Sato, Mizuki Narita, Saya Hori, Saya Sakurai, Tomomi Nakayama, Ayano Nakazawa, Toshihiro Ogura
Abstract:
Mammography requires the detection of very small calcifications, and physicians search for microcalcifications by magnifying the images as they read them. A mouse is necessary to zoom in on the images, but this can be tiring and distracting when many images are read in a single day. Therefore, an image magnification system combining an eye-detector and a simple electroencephalograph (EEG) scanner was devised, and its operability was evaluated. Two experiments were conducted in this study: the measurement of eye-detection error using the eye-detector and the measurement of the time required for image magnification using the simple EEG scanner. Eye-detector validation showed that the mean eye-detection error ranged from 0.64 cm to 2.17 cm, with an overall mean of 1.24 ± 0.81 cm across the observers. These results show that the eye-detection error was small enough for the magnified area of the mammographic image. The average time required for point magnification in the verification of the simple EEG scanner ranged from 5.85 to 16.73 seconds, and individual differences were observed. The reason for this may be that the size of the simple EEG scanner used was not adjustable, so it did not fit well for some subjects; a size-adjustable EEG scanner would solve this problem. Therefore, the image magnification system using the eye-detector and the simple EEG scanner is useful.
Keywords: EEG scanner, eye-detector, mammography, observers
Procedia PDF Downloads 214
24821 An Empirical Evaluation of Performance of Machine Learning Techniques on Imbalanced Software Quality Data
Authors: Ruchika Malhotra, Megha Khanna
Abstract:
The development of change prediction models can help software practitioners plan testing and inspection resources in the early phases of software development. However, a major challenge faced during the training of any classification model is the imbalanced nature of software quality data. Data with very few instances of the minority outcome categories lead to an inefficient learning process, and a classification model developed from imbalanced data generally does not predict these minority categories correctly. Thus, for a given dataset, a minority of classes may be change-prone whereas the majority may be non-change-prone. This study explores various alternatives for adeptly handling imbalanced software quality data using different sampling methods and effective MetaCost learners. The study also analyzes and justifies the use of different performance metrics when dealing with imbalanced data. To empirically validate the different alternatives, the study uses change data from three application packages of an open-source Android data set and evaluates the performance of six different machine learning techniques. The results indicate extensive improvement in the performance of the classification models when using resampling methods and robust performance measures.
Keywords: change proneness, empirical validation, imbalanced learning, machine learning techniques, object-oriented metrics
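As a hedged illustration of one such resampling alternative, the sketch below applies SMOTE oversampling to the training fold only and scores the model with imbalance-robust measures. The synthetic data, the 95/5 imbalance, and the choice of a random forest are assumptions, not the study's exact packages or its six techniques.

```python
from collections import Counter
from imblearn.over_sampling import SMOTE
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import balanced_accuracy_score, roc_auc_score
from sklearn.model_selection import train_test_split

# Synthetic stand-in for OO-metric change data: ~5% change-prone classes.
X, y = make_classification(n_samples=1000, n_features=10, weights=[0.95],
                           random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)
print("training distribution before:", Counter(y_tr))

# Oversample the training fold only; the test fold keeps its true imbalance.
X_res, y_res = SMOTE(random_state=0).fit_resample(X_tr, y_tr)
print("training distribution after: ", Counter(y_res))

clf = RandomForestClassifier(random_state=0).fit(X_res, y_res)
proba = clf.predict_proba(X_te)[:, 1]
# Robust measures for imbalanced data, as the study recommends.
print("AUC:", round(roc_auc_score(y_te, proba), 3))
print("balanced accuracy:", round(balanced_accuracy_score(y_te, proba > 0.5), 3))
```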
Procedia PDF Downloads 418
24820 Variance-Aware Routing and Authentication Scheme for Harvesting Data in Cloud-Centric Wireless Sensor Networks
Authors: Olakanmi Oladayo Olufemi, Bamifewe Olusegun James, Badmus Yaya Opeyemi, Adegoke Kayode
Abstract:
The wireless sensor network (WSN) has made a significant contribution to the emergence of various intelligent services and cloud-based applications. Most of the time, these data are stored on a cloud platform for efficient management and sharing among different services or users. However, the sensitivity of the data makes it prone to various confidentiality and performance-related attacks during and after harvesting. Various security schemes have been developed to ensure the integrity and confidentiality of WSN data, but their specificity towards particular attacks, along with the resource constraints and heterogeneity of WSNs, makes most of these schemes imperfect. In this paper, we propose a secure variance-aware routing and authentication scheme with two-tier verification to collect, share, and manage WSN data. The scheme is capable of classifying a WSN into different subnets, detecting any attempted wormhole or black hole attack during harvesting, and enforcing access control on the harvested data stored in the cloud. The results of the analysis showed that the proposed scheme has more security functionality than other related schemes, solves most WSN and cloud security issues, prevents wormhole and black hole attacks, identifies attackers during data harvesting, and enforces access control on the harvested data stored in the cloud at low computational, storage, and communication overheads.
Keywords: data block, heterogeneous IoT network, data harvesting, wormhole attack, blackhole attack, access control
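The paper's concrete scheme is not spelled out in the abstract, so the sketch below only illustrates the general shape of two-tier verification with standard-library HMACs: a sensor's message is checked once at the gateway and again at the cloud. The keys, message layout, and roles are assumptions invented for illustration, not the authors' variance-aware protocol.

```python
import hashlib
import hmac
import os

SUBNET_KEY = os.urandom(16)  # shared within a WSN subnet (tier 1), assumed
CLOUD_KEY = os.urandom(16)   # shared between gateway and cloud (tier 2), assumed

def node_send(node_id: str, reading: bytes):
    tag1 = hmac.new(SUBNET_KEY, node_id.encode() + reading, hashlib.sha256)
    return node_id, reading, tag1.digest()

def gateway_forward(node_id, reading, tag1):
    # Tier 1: the gateway verifies the sensor's MAC before relaying.
    expect = hmac.new(SUBNET_KEY, node_id.encode() + reading, hashlib.sha256)
    if not hmac.compare_digest(expect.digest(), tag1):
        raise ValueError("tier-1 check failed: possible injected packet")
    tag2 = hmac.new(CLOUD_KEY, reading, hashlib.sha256).digest()
    return reading, tag2

def cloud_store(reading, tag2):
    # Tier 2: the cloud verifies the gateway's MAC before accepting the block.
    expect = hmac.new(CLOUD_KEY, reading, hashlib.sha256).digest()
    assert hmac.compare_digest(expect, tag2), "tier-2 check failed"
    return "stored"

print(cloud_store(*gateway_forward(*node_send("n17", b"temp=21.4"))))
```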
Procedia PDF Downloads 81
24819 Pulmonary Disease Identification Using Machine Learning and Deep Learning Techniques
Authors: Chandu Rathnayake, Isuri Anuradha
Abstract:
Early detection and accurate diagnosis of lung diseases play a crucial role in improving patient prognosis. However, conventional diagnostic methods rely heavily on subjective symptom assessments and medical imaging, often causing delays in diagnosis and treatment. To overcome this challenge, we propose a novel lung disease prediction system that integrates patient symptoms and X-ray images to provide a comprehensive and reliable diagnosis. In this project, we develop a mobile application specifically designed for detecting lung diseases. Our application leverages both patient symptoms and X-ray images to facilitate diagnosis; by combining these two sources of information, it delivers a more accurate and comprehensive assessment of the patient's condition, minimizing the risk of misdiagnosis. Our primary aim is to create a user-friendly and accessible tool, particularly important given the current circumstances where many patients face limitations in visiting healthcare facilities. To achieve this, we employ several state-of-the-art algorithms. Firstly, the decision tree algorithm is utilized for efficient symptom-based classification: it analyzes patient symptoms and creates a tree-like model to predict the presence of specific lung diseases. Secondly, we employ the random forest algorithm, which enhances predictive power by aggregating multiple decision trees; this ensemble technique improves the accuracy and robustness of the diagnosis. Furthermore, we incorporate a deep learning model using a Convolutional Neural Network (CNN) with the pre-trained ResNet50 model. CNNs are well suited for image analysis and feature extraction; by training the CNN on a large dataset of X-ray images, it learns to identify patterns and features indicative of lung diseases, and the ResNet50 architecture, known for its excellent performance in image recognition tasks, enhances the efficiency and accuracy of the deep learning model. By combining the outputs of the decision-tree-based algorithms and the deep learning model, our mobile application generates a comprehensive lung disease prediction. The application provides users with an intuitive interface to input their symptoms and upload X-ray images for analysis. The prediction generated by the system offers valuable insights into the likelihood of various lung diseases, enabling individuals to take appropriate action and seek timely medical attention. Our proposed mobile application has significant potential to address the rising prevalence of lung diseases, particularly among young individuals with smoking addictions. By providing a quick and user-friendly approach to assessing lung health, the application empowers individuals to monitor their well-being conveniently. This solution also offers immense value in the context of limited access to healthcare facilities, enabling timely detection and intervention. In conclusion, our research presents a comprehensive lung disease prediction system that combines patient symptoms and X-ray images using advanced algorithms, delivered through an accessible mobile application with the potential to significantly improve the early detection and management of lung diseases, benefiting both patients and healthcare providers.
Keywords: CNN, random forest, decision tree, machine learning, deep learning
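The abstract combines a symptom-based decision tree and random forest with a ResNet50 CNN's image score but does not state the fusion rule, so the hedged sketch below shows one plausible late-fusion scheme on synthetic symptom data. The CNN probabilities are simulated because no X-rays ship with the sketch, and the fusion weights are assumptions.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# Symptom branch: tabular symptom vectors (synthetic stand-ins).
X, y = make_classification(n_samples=400, n_features=12, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
tree = DecisionTreeClassifier(max_depth=4, random_state=0).fit(X_tr, y_tr)
forest = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)

# Imaging branch: a ResNet50-based CNN would emit its own probability per
# X-ray; simulated here because no images accompany this sketch.
rng = np.random.default_rng(0)
cnn_proba = np.clip(y_te * 0.7 + rng.normal(0.2, 0.15, len(y_te)), 0, 1)

# Simple weighted late fusion of the three probability streams (weights assumed).
fused = (0.25 * tree.predict_proba(X_te)[:, 1]
         + 0.35 * forest.predict_proba(X_te)[:, 1]
         + 0.40 * cnn_proba)
print("fused accuracy:", ((fused > 0.5) == y_te).mean().round(3))
```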
Procedia PDF Downloads 72
24818 A Systematic Literature Review on Changing Customer Requirements for Sustainable Design over Time
Authors: Lara F. Horani
Abstract:
Design is one of the most important stages in the process of product development. Product design has experienced significant changes over the years, ranging from concentrating on cost and performance to combining economic, environmental, and social considerations in customer requirements. Its evolution follows rapidly changing technology, economic situations, climate change and environmental issues, as well as the social context. Within product design, sustainability is a concept that balances economic, social, and environmental aspects. This research aims to express the changes in customer requirements over time from the viewpoint of sustainable design by systematically reviewing a broad scope of sustainable design literature. There is a need for a model that considers the changes in customer requirements over time in order to build a successful relationship with customers. Today's literature does very little to even mention it, let alone present any progress on it. Systematic literature reviews are conducted primarily to summarize the existing literature around a subject, highlight commonalities to build consensus, illuminate differences, identify gaps that can be filled, provide a background to position future research, and build a framework that can help designers meet the challenges of sustainable design.
Keywords: sustainable design, customer requirements for sustainable design, systematic literature reviews, changing customer requirements
Procedia PDF Downloads 372
24817 Quality of Age Reporting from Tanzania 2012 Census Results: An Assessment Using Whipple's Index, Myer's Blended Index, and Age-Sex Accuracy Index
Authors: A. Sathiya Susuman, Hamisi F. Hamisi
Abstract:
Background: Many socio-economic and demographic data are age-sex attributed. However, a variety of irregularities and misstatements are noted with respect to age-related data, and fewer with respect to sex data because of the clear biological differences between the genders. Noting the misstatement/misreporting of age data despite its significant importance in demographic and epidemiological studies, this study aims to assess the quality of the 2012 Tanzania Population and Housing Census results. Methods: Data for the analysis were downloaded from the Tanzania National Bureau of Statistics. Age heaping and digit preference were measured using summary indices, viz., Whipple's index, Myers' blended index, and the age-sex accuracy index. Results: The recorded Whipple's index for both sexes was 154.43; males had the lower index of about 152.65, while females had the higher index of about 156.07. For Myers' blended index, the preferences were for digits '0' and '5', while avoidances were of digits '1' and '3' for both sexes. Finally, the age-sex index stood at 59.8, where the sex ratio score was 5.82 and the age ratio scores were 20.89 and 21.4 for males and females, respectively. Conclusion: The evaluation of the 2012 PHC data using these demographic techniques shows the data to be inaccurate as a result of systematic heaping and digit preference/avoidance. Thus, innovative methods in data collection, along with measuring and minimizing errors using statistical techniques, should be used to ensure the accuracy of age data.
Keywords: age heaping, digit preference/avoidance, summary indices, Whipple's index, Myer's index, age-sex accuracy index
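Whipple's index has a standard closed form: restrict reported ages to 23-62, count those ending in 0 or 5, and scale so that 100 means no heaping and 500 means every reported age ends in 0 or 5. The sketch below computes it on synthetic age reports with injected heaping; the 13.5% misreporting rate is an assumption chosen to land near Tanzania's reported 154. Myers' blended index can be computed analogously from terminal-digit counts.

```python
import numpy as np

def whipples_index(ages):
    """Whipple's index: 100 = no heaping on terminal digits 0/5,
    500 = every reported age in 23-62 ends in 0 or 5."""
    ages = np.asarray(ages)
    in_range = ages[(ages >= 23) & (ages <= 62)]
    heaped = in_range[in_range % 5 == 0]      # ages 25, 30, ..., 60
    return 100 * 5 * len(heaped) / len(in_range)

# Synthetic single-year age reports with injected heaping; ~13.5% of
# respondents rounding to the nearest 5 yields an index near 154.
rng = np.random.default_rng(0)
ages = rng.integers(23, 63, 10_000)
heap = rng.random(10_000) < 0.135             # who rounds to the nearest 5
ages = np.where(heap, (ages / 5).round().astype(int) * 5, ages)
print("Whipple's index:", round(whipples_index(ages), 2))
```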
Procedia PDF Downloads 474
24816 Flexural Properties of RC Beams Strengthened with A Composite Reinforcement Layer: FRP Grid and ECC
Authors: Yu-Zhou Zheng, Wen-Wei Wang
Abstract:
In this paper, a new strengthening technique for reinforced concrete (RC) beams is proposed by combining a Basalt Fibre Reinforced Polymer (BFRP) grid and Engineered Cementitious Composites (ECC) as a composite reinforcement layer (CRL). Five RC beams externally bonded with the CRL at the soffit and one control RC beam were tested to investigate their flexural behaviour. The thickness of the BFRP grids (i.e., 1 mm, 3 mm, and 5 mm) and the sizes of the CRL were selected as the test parameters, while the thickness of the CRL was fixed at approximately 30 mm. The test results showed no obvious debonding of the CRL in the strengthened beams; the final failure modes were concrete crushing or rupture of the BFRP grids, indicating that the proposed technique is effective in suppressing debonding of externally bonded materials and fully utilizing the material strengths. Compared with the non-strengthened beam, the cracking load of the strengthened beams increased by 58%-97%, the yield load by 15%-35%, and the ultimate load by 4%-33%. An analytical model is also presented to predict the full-range load-deflection responses of the strengthened beams and is validated through comparisons with the test results.
Keywords: basalt fiber-reinforced polymer (BFRP) grid, ECC, RC beams, strengthening
Procedia PDF Downloads 346
24815 Mimicking of Various ECM Tangible Cues for the Manipulation of Hepatocellular Behaviours
Authors: S. A. Abdellatef, A. Taniguchi
Abstract:
Alterations in the physicochemical characteristics of biomaterials are renowned for their impact on cellular behaviors. Surface chemistry and substratum topography are separately considered mutable characteristics with a deep impact on overall cell behavior. In our recent work, we examined the manipulation of physical cues on hepatic cellular behaviors and proved that the geometrical or dimensional characteristics of nano features are essential for optimum hepatocellular functions. Here, the collective impact of both physical and chemical cues on hepatocellular behaviors was investigated: RGD peptide was immobilized on a TiO2 nanopattern that imitates the hierarchically extended collagen nanofibrillar structures. The hepatocyte morphological and functional changes induced by simultaneously combining the diversified cues were investigated. TiO2 substrates that integrate nanotopography with the adhesive peptide motif (RGD) showed an increase in hepatocellular functionality to the maximum extent, and a significant enhancement in the expression of liver-specific markers was observed on RGD-coated surfaces compared to uncoated substrates, regardless of topography. Consequently, an in-depth understanding of the relationship between various kinds of cues and hepatocyte behaviors would be a paving step in the application of tissue engineering and bioreactor technology.
Keywords: biomaterial, TiO2, HepG2, RGD
Procedia PDF Downloads 392
24814 Model for Introducing Products to New Customers through Decision Tree Using Algorithm C4.5 (J-48)
Authors: Komol Phaisarn, Anuphan Suttimarn, Vitchanan Keawtong, Kittisak Thongyoun, Chaiyos Jamsawang
Abstract:
This article analyzes insurance information containing data on customers' decisions when purchasing a life insurance pay package. The data were analyzed in order to present new customers with the Life Insurance Perfect Pay package that meets their needs as closely as possible. The basic data on the insurance pay package were collected for data mining, thus reducing the scattering of information. The data were then classified to obtain a decision model, or decision tree, using algorithm C4.5 (J-48). In the classification, WEKA tools were used to build the model, and testing datasets were used to assess the accuracy of the decision tree. The validation of this model showed an accurate prediction rate of 68.43%, while 31.25% were errors. The same set of data was then tested with other models, i.e., Naive Bayes and ZeroR; the results showed that the J-48 method predicted more accurately. The researcher therefore applied the decision tree in writing the program used to introduce the product to new customers and to support customers' decision making in purchasing the insurance package that meets their needs as closely as possible.
Keywords: decision tree, data mining, customers, life insurance pay package
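As a hedged illustration of the classification step, the sketch below trains an information-gain decision tree on toy stand-in records; the attributes, values, and labels are invented, and scikit-learn is not WEKA, so its gain-ratio and pruning details differ from the J-48 runs reported in the paper.

```python
from sklearn.model_selection import cross_val_score
from sklearn.preprocessing import OrdinalEncoder
from sklearn.tree import DecisionTreeClassifier, export_text

# Toy stand-in records; the real attributes and values are not published.
records = [  # [age_band, income, dependents] -> buys the pay package?
    ["young", "low", "no"], ["young", "high", "yes"], ["mid", "high", "yes"],
    ["mid", "low", "no"], ["senior", "high", "yes"], ["senior", "low", "no"],
    ["young", "high", "no"], ["mid", "high", "no"], ["senior", "high", "no"],
    ["young", "low", "yes"],
]
buys = ["no", "yes", "yes", "no", "yes", "no", "yes", "yes", "yes", "no"]

X = OrdinalEncoder().fit_transform(records)

# criterion="entropy" builds an information-gain tree in the spirit of
# C4.5/J-48; the split criterion and pruning are approximations.
tree = DecisionTreeClassifier(criterion="entropy", max_depth=3, random_state=0)
print("CV accuracy:", cross_val_score(tree, X, buys, cv=3).mean().round(3))
print(export_text(tree.fit(X, buys), feature_names=["age", "income", "deps"]))
```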
Procedia PDF Downloads 425
24813 Experimental and Numerical Investigation on Deformation Behaviour of Single Crystal Copper
Authors: Suman Paik, P. V. Durgaprasad, Bijan K. Dutta
Abstract:
A study combining experimental and numerical investigations of the deformation behaviour of single crystals of copper is presented in this paper. Cylindrical samples were cut in specific orientations from a high-purity copper single crystal and subjected to uniaxial compression loading at a quasi-static strain rate. The stress-strain curves along two different crystallographic orientations were then extracted. In order to study and compare the deformation responses, a single crystal plasticity model incorporating non-Schmid effects was developed, assuming that cross-slip plays an important role in the orientation dependence of the material. Using the crystal plasticity finite element method, the model was applied to investigate the orientation dependence of the stress-strain behaviour of the two crystallographic orientations. Finally, details of the slip activity of the deformed crystals were investigated by linking the orientation of slip lines with the theoretical traces of possible crystallographic planes. The experimentally determined active slip modes matched those determined by the simulations.
Keywords: crystal plasticity, modelling, non-Schmid effects, finite elements, finite strain
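For orientation dependence, the classical baseline is Schmid's law: slip on a system activates when the resolved shear stress, the applied stress scaled by the Schmid factor cos(phi)cos(lambda), reaches the critical resolved shear stress; non-Schmid models add further stress components on top of this. The sketch below computes the maximum Schmid factor over the 12 FCC slip systems for two loading axes; it illustrates the baseline law only, not the paper's non-Schmid formulation.

```python
import numpy as np

def schmid_factor(load, plane_normal, slip_dir):
    """cos(phi) * cos(lambda) for a uniaxial loading direction."""
    l = load / np.linalg.norm(load)
    n = plane_normal / np.linalg.norm(plane_normal)
    s = slip_dir / np.linalg.norm(slip_dir)
    return abs(l @ n) * abs(l @ s)

# The 12 {111}<110> slip systems of FCC copper.
slip_systems = {
    (1, 1, 1):  [(0, 1, -1), (1, 0, -1), (1, -1, 0)],
    (-1, 1, 1): [(0, 1, -1), (1, 0, 1), (1, 1, 0)],
    (1, -1, 1): [(0, 1, 1), (1, 0, -1), (1, 1, 0)],
    (1, 1, -1): [(0, 1, 1), (1, 0, 1), (1, -1, 0)],
}

for load in [(0, 0, 1), (1, 2, 3)]:  # two sample loading axes
    m = max(schmid_factor(np.array(load, float), np.array(n, float),
                          np.array(s, float))
            for n in slip_systems for s in slip_systems[n])
    # Schmid's law: slip starts when m * applied_stress reaches the CRSS.
    print(f"loading axis {load}: max Schmid factor = {m:.3f}")
```

For [001] loading the sketch recovers the textbook maximum Schmid factor of 0.408, a quick sanity check on the slip-system table.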
Procedia PDF Downloads 212
24812 The Women-In-Mining Discourse: A Study Combining Corpus Linguistics and Discourse Analysis
Authors: Ylva Fältholm, Cathrine Norberg
Abstract:
One of the major threats identified to successful future mining is that women do not find the industry attractive. Many attempts have been made, for example in Sweden and Australia, to create organizational structures and mining communities attractive to both genders. Despite such initiatives, many mining areas are developing into gender-segregated fly-in/fly-out communities dominated by men, with both social and economic consequences. One of the challenges facing many mining companies is thus to break traditional gender patterns and structures. To do this, increased knowledge about gender in the context of mining is needed. Since language both constitutes and reproduces knowledge, increased knowledge can be gained through an exploration and description of the mining discourse from a gender perspective. The aim of this study is to explore what conceptual ideas are activated in connection to the physical/geographical mining area and to work within the mining industry. We combine critical discourse analysis, involving close reading of selected texts such as policy documents, interview materials, applications, and research and innovation agendas, with analyses of linguistic patterns found in large language corpora covering millions of words of contemporary language production. The quantitative corpus data serve as a point of departure for the qualitative analysis of the texts, that is, they suggest what patterns to explore further. The study shows that despite technological and organizational development, one of the most persistent discourses about mining is the conception of dangerous and unfriendly areas infused with traditional notions of masculinity ideals and hard manual work. Although some of the texts analyzed highlight gender issues and describe gender-equalizing initiatives, such as wage-mapping systems, female networks, and recruitment efforts for women executives, and thereby render the discourse less straightforward, it is shown that these texts are not unambiguous examples of a counter-discourse. They rather illustrate that discourses are not stable but include opposing discourses, in dialogue with each other. For example, many texts highlight why and how women are important to mining, at the same time as they suggest that gender and diversity are all about women: why mining is a problem for them, how they should be, and what they should do to fit in. Drawing on a constitutive view of discourse, knowledge about such conflicting perceptions of women is a prerequisite for succeeding in attracting women to the mining industry and thereby contributing to the development of future mining.
Keywords: discourse, corpus linguistics, gender, mining
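As a hedged illustration of the corpus-linguistics side, the sketch below ranks bigram collocations with NLTK; collocates of words like "mining" or "work" hint at the conceptual ideas a discourse activates. The miniature word list stands in for the millions-of-words corpora the study actually used, and the measure choice is an assumption.

```python
from nltk.collocations import BigramAssocMeasures, BigramCollocationFinder

# Tiny stand-in corpus; the study drew on large corpora of contemporary
# language plus policy documents, interviews, and agendas.
tokens = ("mining is dangerous work in remote areas men do hard manual work "
          "dangerous conditions underground mining areas attract men "
          "women in mining face barriers mining communities fly in fly out "
          "hard manual work defines underground mining dangerous work").split()

finder = BigramCollocationFinder.from_words(tokens)
measures = BigramAssocMeasures()
# Rank word pairs that co-occur more often than chance predicts.
for pair in finder.nbest(measures.likelihood_ratio, 5):
    print(" ".join(pair))
```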
Procedia PDF Downloads 263
24811 Internal Product Management: The Key to Achieving Digital Maturity and Business Agility for Manufacturing IT Organizations
Authors: Frederick Johnson
Abstract:
Product management has a long and well-established history within the consumer goods industry, despite being one of the most obscure aspects of brand management. Many global manufacturing organizations are now opting for external cloud-based Manufacturing Execution Systems (MES) to replace costly and outdated monolithic MES solutions. Other global manufacturing leaders are restructuring their organizations to support human-centered values, agile methodologies, and fluid operating principles. Still, industry-leading organizations struggle to apply an appropriate framework for managing evolving external MES solutions as internal "digital products." Product management complements these current trends in technology and philosophical thinking in the market. This paper discusses the central problems associated with adopting product management processes by analyzing their traditional theories and characteristics. Considering these ideas, the article then constructs a translated internal digital product management framework by combining new and existing approaches and principles. The report concludes by demonstrating the framework's capabilities and potential effectiveness in achieving digital maturity and business agility within a manufacturing environment.
Keywords: internal product management, digital transformation, manufacturing information technology, manufacturing execution systems
Procedia PDF Downloads 133