Search results for: corpus-driven approach
12343 Continuous Functions Modeling with Artificial Neural Network: An Improvement Technique to Feed the Input-Output Mapping
Authors: A. Belayadi, A. Mougari, L. Ait-Gougam, F. Mekideche-Chafa
Abstract:
The artificial neural network is one of the techniques that have been used to advantage in dealing with modeling problems. In this study, computing with an artificial neural network (CANN) is proposed. The model is applied to modulate the information processing of a one-dimensional task. We aim to integrate a new method based on a new coding approach for generating the input-output mapping, which rests on increasing the number of neuron units in the last layer. Accordingly, to show the efficiency of the approach under study, a comparison is made between the proposed method of generating the input-output set and the conventional method. The results illustrate that increasing the neuron units in the last layer makes it possible to find the optimal network parameters that fit the mapping data. Moreover, it decreases the training time during the computation process, which avoids the need for computers with high memory usage.
Keywords: neural network computing, continuous functions generating the input-output mapping, decreasing the training time, machines with big memories
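As an illustrative sketch only (the abstract publishes no code, and the exact coding scheme is not specified), a minimal Keras experiment comparing a conventional single-unit output layer against a widened last layer for fitting a one-dimensional continuous function might look like this:

```python
# Hypothetical sketch: fit y = f(x) with an MLP and compare a 1-unit output
# layer against a widened one, echoing the idea of increasing the neuron
# units in the last layer. The multi-output coding below is an assumption.
import numpy as np
from tensorflow import keras

def build_mlp(n_out):
    model = keras.Sequential([
        keras.layers.Dense(32, activation="tanh", input_shape=(1,)),
        keras.layers.Dense(n_out),  # last layer: 1 (conventional) or several units
    ])
    model.compile(optimizer="adam", loss="mse")
    return model

x = np.linspace(-1.0, 1.0, 200).reshape(-1, 1)
y = np.sin(np.pi * x)                       # example continuous target

for n_out in (1, 5):
    y_coded = np.repeat(y, n_out, axis=1)   # naive multi-output coding of the mapping
    hist = build_mlp(n_out).fit(x, y_coded, epochs=200, verbose=0)
    print(n_out, "output unit(s) -> final MSE:", hist.history["loss"][-1])
```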
Procedia PDF Downloads 284
12342 Spatial-Temporal Awareness Approach for Extensive Re-Identification
Authors: Tyng-Rong Roan, Fuji Foo, Wenwey Hseush
Abstract:
Recent developments in AI and edge computing play a critical role in capturing meaningful events such as the detection of an unattended bag. One of the core problems is re-identification across multiple CCTVs. Immediately following the detection of a meaningful event comes tracking and tracing the objects related to the event. In an extensive environment, the challenge becomes severe when the number of CCTVs increases substantially, imposing difficulties in achieving high accuracy while maintaining real-time performance. The algorithm that re-identifies cross-boundary objects for extensive tracking is referred to as Extensive Re-Identification, which emphasizes the issues related to the complexity behind a great number of CCTVs. The Spatial-Temporal Awareness approach challenges the conventional thinking and concept of operations, which is labor-intensive and time-consuming. The ability to perform Extensive Re-Identification through a multi-sensory network provides next-level insights, creating value beyond traditional risk management.
Keywords: long-short-term memory, re-identification, security critical application, spatial-temporal awareness
Procedia PDF Downloads 114
12341 The Effect of Artificial Intelligence on Human Rights Resources and Development
Authors: Tharwat Girgis Farag Girgis
Abstract:
The link between development and human rights has long been the subject of scholarly debate. As a result, a number of principles have been adopted, from the right to development to the human rights-based approach to development, to understand the dynamics between the two concepts. Despite the initiatives taken, the exact relationship between development and human rights remains unclear. However, the rapprochement between the two concepts and the need for development efforts regarding human rights have increased in recent years. On the other hand, the emergence of sustainable development as an accepted method in development goals and policies makes this consensus even more unstable. The place of sustainable development in the legal debate on human rights and its role in promoting sustainable development programs require further research. Therefore, this article attempts to map the relationship between development and human rights, with particular emphasis on the place given to sustainable development principles in international human rights law. It goes on to investigate whether international human rights law recognizes a right to sustainable development. The article gives a positive answer to this question, and the jurisprudence and interpretive guidelines of human rights institutions serve to confirm this hypothesis.
Keywords: sustainable development, human rights, the right to development, the human rights-based approach to development, environmental rights, economic development, social sustainability, human rights protection, human rights violations, workers’ rights, justice, security
Procedia PDF Downloads 57
12340 An Approach to the Assembly Line Balancing Problem with Uncertain Operation Time
Authors: Zhongmin Wang, Lin Wei, Hengshan Zhang, Tianhua Chen, Yimin Zhou
Abstract:
Assembly line balancing problems are significant in mass production systems. In order to deal with the uncertainties that practically exist but are barely mentioned in the literature, this paper develops a mathematical model with an optimisation algorithm to solve the assembly line balancing problem with uncertain operation time. The developed model is able to work with a variable number of workstations under the uncertain environment, aiming to obtain the minimal number of workstations and minimal idle time for each workstation. In particular, the proposed approach first introduces the concept of protection time, which works closely with the uncertain operation time. Four dominance rules and the mechanism for determining upper and lower bounds are subsequently put forward, which serve as the basis for the proposed branch and bound algorithm. Experimental results show that the proposed work, verified on a benchmark data set, is able to handle the uncertainties efficiently.
Keywords: assembly lines, SALBP-UOT, uncertain operation time, branch and bound algorithm
Procedia PDF Downloads 173
12339 A Comprehensive Metamodel of an Urbanized Information System: Experimental Case
Authors: Leila Trabelsi
Abstract:
The urbanization of Information Systems (IS) is an effective approach to mastering the complexity of the organization. It strengthens the coherence of the IS and aligns it with the business strategy. Moreover, this approach has significant advantages such as reducing Information Technology (IT) costs, enhancing the IS position in a competitive environment and ensuring the scalability of the IS through the integration of technological innovations. Therefore, urbanization is considered a strategic business decision, and its adoption becomes a necessity in order to improve IS practice. However, there is a lack of experimental cases studying the meta-modelling of an Urbanized Information System (UIS). This paper addresses a new urbanization content meta-model that permits modelling and testing while taking organizational aspects into consideration. The methodological framework is structured according to two main abstraction levels, a conceptual level and an operational level, and for each of these levels, different models are proposed and presented. The proposed model has been empirically tested on a company. The findings present an experimental study of the urbanization meta-model and point out the significant relationships between dimensions and their evolution.
Keywords: urbanization, information systems, enterprise architecture, meta-model
Procedia PDF Downloads 441
12338 Accurate Mass Segmentation Using U-Net Deep Learning Architecture for Improved Cancer Detection
Authors: Ali Hamza
Abstract:
Accurate segmentation of breast ultrasound images is of paramount importance in enhancing the diagnostic capabilities of breast cancer detection. This study presents an approach utilizing the U-Net architecture for segmenting breast ultrasound images, aimed at improving the accuracy and reliability of mass identification within the breast tissue. The proposed method encompasses a multi-stage process. Initially, preprocessing techniques are employed to refine image quality and diminish noise interference. Subsequently, the U-Net architecture, a deep learning convolutional neural network (CNN), is employed for pixel-wise segmentation of regions of interest corresponding to potential breast masses. The U-Net's distinctive architecture, characterized by a contracting and expansive pathway, enables accurate boundary delineation and detailed feature extraction. To evaluate the effectiveness of the proposed approach, an extensive dataset of breast ultrasound images is employed, encompassing diverse cases. Quantitative performance metrics such as the Dice coefficient, Jaccard index, sensitivity, specificity, and Hausdorff distance are employed to comprehensively assess the segmentation accuracy. Comparative analyses against traditional segmentation methods showcase the superiority of the U-Net architecture in capturing intricate details and accurately segmenting breast masses. The outcomes of this study emphasize the potential of the U-Net-based segmentation approach in bolstering breast ultrasound image analysis. The method's ability to reliably pinpoint mass boundaries holds promise for aiding radiologists in precise diagnosis and treatment planning. However, further validation and integration within clinical workflows are necessary to ascertain its practical clinical utility and facilitate seamless adoption by healthcare professionals. In conclusion, leveraging the U-Net architecture for breast ultrasound image segmentation showcases a robust framework that can significantly enhance diagnostic accuracy and advance the field of breast cancer detection. This approach represents a pivotal step towards empowering medical professionals with a more potent tool for early and accurate breast cancer diagnosis.
Keywords: image segmentation, U-Net, deep learning, breast cancer detection, diagnostic accuracy, mass identification, convolutional neural network
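For reference, the first two overlap metrics named above are straightforward to compute from binary masks; a generic NumPy sketch (not the authors' code, with toy masks standing in for real segmentations):

```python
import numpy as np

def dice(pred, gt):
    # Dice = 2|A ∩ B| / (|A| + |B|) on binary masks
    inter = np.logical_and(pred, gt).sum()
    return 2.0 * inter / (pred.sum() + gt.sum())

def jaccard(pred, gt):
    # Jaccard = |A ∩ B| / |A ∪ B|
    inter = np.logical_and(pred, gt).sum()
    return inter / np.logical_or(pred, gt).sum()

pred = np.zeros((64, 64), dtype=bool); pred[10:40, 10:40] = True  # toy prediction
gt = np.zeros((64, 64), dtype=bool); gt[15:45, 15:45] = True      # toy ground truth
print(dice(pred, gt), jaccard(pred, gt))
```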
Procedia PDF Downloads 86
12337 A System Framework for Dynamic Service Deployment in Container-Based Computing Platform
Authors: Shuen-Tai Wang, Yu-Ching Lin, Hsi-Ya Chang
Abstract:
Cloud computing and virtualization technology have brought an innovative way for people to develop and use software nowadays. However, conventional virtualization comes at the expense of performance loss for applications. Container-based virtualization could be an option, as it potentially reduces overhead and minimizes the performance decline of the service platform. In this paper, we introduce a system framework and present an implementation of a resource broker for dynamic cloud service deployment on a container-based platform to facilitate efficient execution and improve utilization. We target the load-aware service deployment approach for a task ranking scenario. The proposed effort can collaborate with a resource management system to adaptively deploy services according to different requests. In particular, our approach relies on composing the service immediately onto an appropriate container according to the user’s requirement in order to reduce waiting time. Our evaluation shows how efficient the service deployment is and how its applicability can be expanded to support a variety of cloud services.
Keywords: cloud computing, container-based virtualization, resource broker, service deployment
Procedia PDF Downloads 176
12336 Estimating Groundwater Seepage Rates: Case Study at Zegveld, Netherlands
Authors: Wondmyibza Tsegaye Bayou, Johannes C. Nonner, Joost Heijkers
Abstract:
This study aimed to identify and estimate dynamic groundwater seepage rates using four comparative methods: the Darcian approach, the water balance approach, the tracer method, and modeling. The theoretical background to these methods is put together in this study. The methodology was applied to a case study area at Zegveld following the advice of the Water Board Stichtse Rijnlanden. Data were collected from various offices and from a field campaign in the winter of 2008/09. In the complex confining layer of the study area, the phreatic groundwater table lies at a shallow depth compared to the piezometric water level. Data were available for the model years 1989 to 2000 and for winter 2008/09. The higher groundwater table produces predominantly downward seepage in the study area. Results indicated that net recharge to the groundwater table (precipitation excess) and the ditch system are the principal sources of seepage across the complex confining layer. Especially in the summer season, the contribution from the ditches is significant; water is supplied from the River Meije through a pumping system to meet the ditches' water demand. The groundwater seepage rate was distributed unevenly throughout the study area, averaging 0.60 mm/day at the nature reserve for the model years 1989 to 2000 and 0.70 mm/day for winter 2008/09. Due to data restrictions, the seepage rates were mainly determined with the Darcian method, while the water balance approach and the tracer method were applied to compute the flow exchange within the ditch system. The site had various validated sources of groundwater level and vertical flow resistance data. The phreatic groundwater level map, compared with TNO-DINO groundwater level data, overestimated the groundwater level depth by 28 cm. The hydraulic resistance values obtained from the 3D geological map, compared with the TNO-DINO data, agreed with the model values before calibration. On the other hand, the calibrated model significantly underestimated the downward seepage in the area compared with the field-based computations following the Darcian approach.
Keywords: groundwater seepage, phreatic water table, piezometric water level, nature reserve, Zegveld, The Netherlands
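As a worked illustration of the Darcian approach (the numbers below are hypothetical, chosen only to reproduce the order of magnitude reported): the vertical seepage rate follows from the head difference across the confining layer divided by its hydraulic resistance:

```python
# Darcian sketch: q = (h_phreatic - h_piezometric) / c, with c the vertical
# hydraulic resistance of the confining layer in days. Positive q = downward.
def seepage_mm_per_day(h_phreatic_m, h_piezometric_m, resistance_days):
    q_m_per_day = (h_phreatic_m - h_piezometric_m) / resistance_days
    return q_m_per_day * 1000.0  # convert m/day to mm/day

# Hypothetical heads (m relative to datum) and resistance c = 1000 days
print(seepage_mm_per_day(-1.2, -1.8, 1000.0))  # -> 0.6 mm/day, downward
```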
Procedia PDF Downloads 89
12335 Applying Epistemology to Artificial Intelligence in the Social Arena: Exploring Fundamental Considerations
Authors: Gianni Jacucci
Abstract:
Epistemology traditionally finds its place within human research philosophies and methodologies. Artificial intelligence methods pose challenges, particularly given the unresolved relationship between AI and pivotal concepts in social arenas such as hermeneutics and accountability. We begin by examining the essential criteria governing scientific rigor in the human sciences. We revisit the three foundational philosophies underpinning qualitative research methods: empiricism, hermeneutics, and phenomenology. We elucidate the distinct attributes, merits, and vulnerabilities inherent in the methodologies they inspire. The integration of AI, e.g., deep learning algorithms, sparks an interest in evaluating these criteria against the diverse forms of AI architectures. For instance, Interpreted AI could be viewed as a hermeneutic approach, relying on a priori interpretations, while straight AI may be perceived as a descriptive phenomenological approach, processing original and uncontaminated data. This paper serves as groundwork for such explorations, offering preliminary reflections to lay the foundation and outline the initial landscape.
Keywords: artificial intelligence, deep learning, epistemology, qualitative research, methodology, hermeneutics, accountability
Procedia PDF Downloads 43
12334 Cloud Points to Create an Innovative and Custom Ankle Foot Orthosis in CAD Environment
Authors: Y. Benabid, K. Benfriha, V. Rieuf, J. F. Omhover
Abstract:
This paper describes an approach to creating custom concepts for innovative products; the approach describes the relations between innovation tools and the Computer-Aided Design (CAD) environment (using creativity sessions and design tools). A model for the design process is proposed and explored in order to describe the process used to create and improve an innovative product, based in this study on a range of data (cloud points). A comparison between the traditional method and the innovative method helps to generate and put forward a new model of the design process for creating a custom Ankle Foot Orthosis (AFO) in a CAD environment in order to improve motion control. The custom concept requires substantial development in different environments, and the relations between these environments are described. The results can help surgeons in the upstream treatment phases. The CAD models can be applied and accepted by professionals in design and manufacturing systems. This development is based on the anatomy of the population of North Africa.
Keywords: ankle foot orthosis, CAD, reverse engineering, sketch
Procedia PDF Downloads 458
12333 Human Rights and Counter-Terrorism in Nigeria: A Systematic Review
Authors: Tarela J. Ike
Abstract:
Over the years, the horrific acts of Boko Haram have led to the adoption of counter-terrorism measures that mostly take the form of repressive military measures. These measures have wrought flagrant violations of human rights worthy of concern; hence the need to examine the efficacy of the counter-terrorism measures adopted by the Nigerian government in combatting terrorism. This article addresses the issue through a systematic literature review examining the impact of Nigeria's counter-terrorism measures from 2009 to 2016. The review covers 42 articles, of which 14 met the peer-review requirement. It finds that most of Nigeria’s counter-terrorism policies are geared toward a repressive state military approach that violates human rights. Thus, the study concludes that to effectively address the terrorist uprising, Nigeria should adopt a non-aggressive counter-terrorism approach that incorporates religious clerics and a community active engagement strategy, as opposed to military retaliation, which violates human rights and has so far proved ineffective.
Keywords: Boko Haram, counter-terrorism, human rights, military retaliation
Procedia PDF Downloads 416
12332 CFD Study of Free Surface Flows Resulting from a Dam-Breaking
Authors: Sonia Ben Hamza, Sabra Habli, Nejla Mahjoub Saïd, Hervé Bournot, Georges Le Palec
Abstract:
Free surface flows caused by dam breaks in channels or rivers are a subject of great interest in engineering practice; however, few studies have been reported. In this paper, a numerical investigation of the unsteady free surface flows resulting from a dam-breaking in a rectangular channel is presented. Numerical computations were carried out using ANSYS Fluent, which is based on the finite volume approach. The air/water interface was modeled with the volume of fluid (VOF) method. Verification for a typical dam-break problem is analyzed by comparing the present results with others, and very good agreement is obtained. The present approach is then used to predict the characteristics of the free surface flow due to the dam breaking in a channel. The characteristics of complex unsteady free surface flow in these examples are clearly explained. The numerical results show that the flow became more disturbed after impacting the vertical wall; then a recirculation zone, as well as turbulence phenomena, was created, and at this instant a cavity of air was entrained in the flow. The results agree well with the experimental data found in the literature.
Keywords: CFD, dam-break, free surface, turbulent flows, VOF
Procedia PDF Downloads 310
12331 Roadmaps as a Tool of Innovation Management: System View
Authors: Matich Lyubov
Abstract:
Today roadmaps are becoming commonly used tools for detecting and designing a desired future for companies, states and the international community. The growing popularity of this method puts on the agenda tasks such as identifying basic roadmapping principles, creating concepts and determining the characteristics of roadmap use depending on the objectives, as well as the restrictions and opportunities specific to the study area. However, the systems approach, i.e. the elements recognized as major for high-quality roadmapping, remains one of the main fields for improving the methodology and practice of roadmap development, as limited research has been devoted to the detailed analysis of roadmaps from the view of the systems approach. Therefore, this article is an attempt to examine roadmaps from the view of systems analysis and to compare areas where, as a rule, roadmaps and systems analysis are considered the most effective tools. To compare the structure and composition of roadmaps and system models, the identification of common points between the construction stages of roadmaps and system modeling, and the determination of future directions for roadmap research from a systems perspective, are of special importance.
Keywords: technology roadmap, roadmapping, systems analysis, system modeling, innovation management
Procedia PDF Downloads 312
12330 Service Interactions Coordination Using a Declarative Approach: Focuses on Deontic Rule from Semantics of Business Vocabulary and Rules Models
Authors: Nurulhuda A. Manaf, Nor Najihah Zainal Abidin, Nur Amalina Jamaludin
Abstract:
Coordinating service interactions is a vital part of developing distributed applications that are built up as networks of autonomous participants (e.g., software components, web services, online resources) and involve a collaboration between a diverse number of participant services on different providers. The complexity of coordinating service interactions reflects how important the techniques and approaches are for designing and coordinating the interaction between participant services, so as to ensure that the overall goal of the collaboration is achieved. The objective of this research is to develop the capability of steering a complex service interaction towards a desired outcome. Therefore, an efficient technique for modelling, generating, and verifying the coordination of service interactions is developed. The developed model describes service interactions using the service choreographies approach, focusing on a declarative approach and advocating an Object Management Group (OMG) standard, the Semantics of Business Vocabulary and Rules (SBVR). This model, namely the SBVR model for service choreographies, focuses on declarative deontic rules expressing both obligation and prohibition, which can be especially useful in coordinating service interactions. The generated SBVR model is then formulated and transformed into an Alloy model, using the Alloy Analyzer for verification. The transformation of SBVR into Alloy makes it possible to automatically generate the corresponding coordination of service interactions (service choreography), hence producing an immediate instance of execution that satisfies the constraints of the specification and verifying whether a specific request can be realised in the generated choreography.
Keywords: service choreography, service coordination, behavioural modelling, complex interactions, declarative specification, verification, model transformation, semantics of business vocabulary and rules, SBVR
Procedia PDF Downloads 156
12329 Component Based Testing Using Clustering and Support Vector Machine
Authors: Iqbaldeep Kaur, Amarjeet Kaur
Abstract:
Software reusability is an important part of software development, so component-based software development in the context of software testing has gained a lot of practical importance in the field of software engineering, from the perspective of both academic researchers and the software development industry. Finding test cases for efficient reuse is one of the important problems addressed by researchers. Clustering reduces the search space and enables the reuse of test cases by grouping similar entities according to requirements, ensuring reduced time complexity since it shortens the search time for retrieving the test cases. In this research paper we propose an approach to the reusability of test cases based on unsupervised learning, using k-means and the Support Vector Machine. We have designed an algorithm for clustering requirement and test case documents according to their tf-idf vector space, and the output is a set of highly cohesive pattern groups.
Keywords: software testing, reusability, clustering, k-mean, SVM
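A minimal sketch of the clustering step described (documents and parameter choices are hypothetical, using scikit-learn rather than the authors' implementation):

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans

# Hypothetical requirement/test-case documents (illustrative only)
docs = [
    "login authentication password reset",
    "user login session timeout",
    "report export pdf formatting",
    "export report csv columns",
]
vec = TfidfVectorizer()
X = vec.fit_transform(docs)                      # tf-idf vector space
km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)

new_req = "password login failure"               # incoming requirement
cluster = km.predict(vec.transform([new_req]))[0]
reusable = [d for d, c in zip(docs, km.labels_) if c == cluster]
print(reusable)                                  # candidate test cases for reuse
```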
Procedia PDF Downloads 432
12328 Surface Thermodynamics Approach to Mycobacterium tuberculosis (M-TB) – Human Sputum Interactions
Authors: J. L. Chukwuneke, C. H. Achebe, S. N. Omenyi
Abstract:
This research work presents the surface thermodynamics approach to M-TB/HIV-human sputum interactions. This involved the use of the Hamaker coefficient concept as a surface energetics tool in determining the interaction processes, with the surface interfacial energies explained using the van der Waals concept of particle interactions. The Lifshitz derivation for van der Waals forces was applied as an alternative to the contact angle approach, which has been widely used in other biological systems. The methodology involved taking sputum samples from twenty infected persons and from twenty uninfected persons for absorbance measurement using a digital ultraviolet-visible spectrophotometer. The variables required for the computations with the Lifshitz formula were derived from the absorbance data. The Matlab software tools were used in the mathematical analysis of the data produced from the experiments (absorbance values). The Hamaker constants and the combined Hamaker coefficients were obtained using the values of the dielectric constant together with the Lifshitz equation. The absolute combined Hamaker coefficients A132abs and A131abs on both infected and uninfected sputum samples gave the values A132abs = 0.21631 × 10⁻²¹ J for M-TB infected sputum and A132abs = 0.18825 × 10⁻²¹ J for M-TB/HIV infected sputum. The significance of this result is the positive value of the absolute combined Hamaker coefficient, which suggests the existence of net positive van der Waals forces, demonstrating an attraction between the bacteria and the macrophage. This implies that infection can occur. It was also shown that in the presence of HIV, the interaction energy is reduced by 13%, confirming the adverse effects observed in HIV patients suffering from tuberculosis.
Keywords: absorbance, dielectric constant, Hamaker coefficient, Lifshitz formula, macrophage, Mycobacterium tuberculosis, van der Waals forces
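For context, combined Hamaker coefficients of the kind named above are conventionally related to the individual Hamaker constants by the standard combining approximations (shown here in their textbook form; the study itself derives the constants from absorbance-based dielectric data via the Lifshitz equation), where subscripts 1, 2 and 3 denote the bacterium, the macrophage and the sputum medium:

```latex
A_{131} \approx \left(\sqrt{A_{11}} - \sqrt{A_{33}}\right)^{2}, \qquad
A_{132} \approx \left(\sqrt{A_{11}} - \sqrt{A_{33}}\right)\left(\sqrt{A_{22}} - \sqrt{A_{33}}\right)
```

On these relations, A₁₃₂ is positive (net attraction) whenever both particles are more polarizable than the intervening medium, which is consistent with the attraction reported above.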
Procedia PDF Downloads 280
12327 Building Transparent Supply Chains through Digital Tracing
Authors: Penina Orenstein
Abstract:
In today’s world, particularly with COVID-19 a constant worldwide threat, organizations need greater visibility over their supply chains more than ever before, in order to find areas for improvement and greater efficiency, reduce the chances of disruption and stay competitive. The concept of supply chain mapping is one where every process and route is mapped in detail between each vendor and supplier. The simplest method of mapping involves sourcing publicly available data, including news and financial information concerning relationships between suppliers. An additional layer of information would be disclosed by large, direct suppliers about their production and logistics sites. While this method has the advantage of not requiring any input from suppliers, it also doesn’t allow for much transparency beyond the first supplier tier and may generate irrelevant data (noise) that must be filtered out to find the actionable data. The primary goal of this research is to build data maps of supply chains by focusing on a layered approach. Using these maps, the secondary goal is to address the question of whether the supply chain can be re-engineered to make improvements, for example, to lower the carbon footprint. Using a drill-down approach, the end result is a comprehensive map detailing the linkages between tier-one, tier-two, and tier-three suppliers, superimposed on a geographical map. The driving force behind this idea is to be able to trace individual parts to the exact site where they’re manufactured. In this way, companies can ensure sustainability practices from the production of raw materials through the finished goods. The approach allows companies to identify and anticipate vulnerabilities in their supply chain. It unlocks predictive analytics capabilities and enables them to act proactively. The research is particularly compelling because it unites network science theory with empirical data and presents the results in a visual, intuitive manner.
Keywords: data mining, supply chain, empirical research, data mapping
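A toy sketch of the layered mapping idea (supplier names and sites are invented for illustration), using a directed graph to trace a finished part back through the tiers:

```python
import networkx as nx

G = nx.DiGraph()  # edges point from supplier to customer
G.add_edge("MineCo (site: Chile)", "Tier2-Alloys")   # tier 3 -> tier 2
G.add_edge("Tier2-Alloys", "Tier1-Gears")            # tier 2 -> tier 1
G.add_edge("Tier1-Gears", "FinalAssembly")           # tier 1 -> OEM

# Drill down: every upstream site a finished product depends on
upstream = nx.ancestors(G, "FinalAssembly")
print(upstream)  # {'Tier1-Gears', 'Tier2-Alloys', 'MineCo (site: Chile)'}
```

Attaching coordinates to each node would then allow the tiers to be superimposed on a geographical map, as described above.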
Procedia PDF Downloads 178
12326 A Positive Neuroscience Perspective for Child Development and Special Education
Authors: Amedeo D'Angiulli, Kylie Schibli
Abstract:
Traditionally, children’s brain development research has emphasized the limitative aspects of disability and impairment, electing as an explanatory model the classical clinical notions of brain lesion or functional deficit. In contrast, Positive Educational Neuroscience (PEN) is a new approach that emphasizes strengths and human flourishing related to the brain by exploring how learning practices have the potential to enhance neurocognitive flexibility through neuroplastic overcompensation. This mini-review provides an overview of PEN and shows how it links to the concept of neurocognitive flexibility. We provide examples of how the present concept of neurocognitive flexibility can be applied to special education by exploring examples of neuroplasticity in the learning domain, including: (1) learning to draw in congenitally totally blind children, and (2) music training in children from disadvantaged neighborhoods. PEN encourages educators to focus on children’s strengths by recognizing the brain’s capacity for positive change and to incorporate activities that support children’s individual development.
Keywords: neurocognitive development, positive educational neuroscience, sociocultural approach, special education
Procedia PDF Downloads 244
12325 Emerging Therapeutic Approach with Dandelion Phytochemicals in Breast Cancer Treatment
Authors: Angel Champion, Sadia Kanwal, Rafat Siddiqui
Abstract:
Harnessing phytochemicals from plant sources presents a novel opportunity to prevent or treat malignant diseases, including breast cancer. Chemotherapy lacks precision in targeting cancerous cells while sparing normal cells, but a phytopharmaceutical approach may offer a solution. Dandelion, a common weed, is rich in phytochemicals and provides a safer, more cost-effective alternative with lower toxicity than traditional pharmaceuticals for conditions such as breast cancer. In this study, an in-vitro experiment will be conducted using the ethanol extract of dandelion on triple-negative MDA-231 breast cancer cell lines. The polyphenolic analysis revealed that the dandelion extracts, particularly from the root and leaf (both cut and sifted), had potent antioxidant properties, with the strongest antioxidant activity exhibited by the powdered leaf extract. The extract shows promise for inhibiting cell proliferation and inducing apoptosis in breast cancer cells, highlighting its potential for targeted therapeutic interventions. Standardizing methods for dandelion use is crucial for future clinical applications in cancer treatment. Combining plant-derived compounds with cancer nanotechnology holds the potential for effective strategies in battling malignant diseases. Utilizing liposomes as carriers for phytoconstituent anti-cancer agents offers improved solubility, bioavailability and immunoregulatory effects, advancing anticancer immune function and reducing toxicity. This integrated approach of natural products and nanotechnology has significant potential to revolutionize healthcare globally, especially in underserved communities where herbal medicine is prevalent.
Keywords: apoptosis, antioxidant activity, cancer nanotechnology, phytopharmaceutical
Procedia PDF Downloads 57
12324 Sensitivity Analysis and Solitary Wave Solutions to the (2+1)-Dimensional Boussinesq Equation in Dispersive Media
Authors: Naila Nasreen, Dianchen Lu
Abstract:
This paper explores the dynamical behavior of the (2+1)-dimensional Boussinesq equation, which is a nonlinear water wave equation used to model wave packets in dispersive media with weak nonlinearity. The equation depicts how long waves made in shallow water propagate under the influence of gravity. The (2+1)-dimensional Boussinesq equation combines the two-way propagation of the classical Boussinesq equation with a dependence on a second spatial variable, as occurs in the two-dimensional Kadomtsev-Petviashvili equation. The equation provides a description of the head-on collision of oblique waves, and it possesses some interesting properties. The governing model is treated with the assistance of the Riccati equation mapping method, a relatively simple integration tool. The solutions have been extracted in different forms: solitary wave solutions as well as hyperbolic and periodic solutions. Moreover, a sensitivity analysis is demonstrated for the wave profiles of the designed dynamical structural system, where the soliton wave velocity and wave number parameters regulate the water wave singularity. In addition to being helpful for elucidating nonlinear partial differential equations, the method in use recovers previously extracted solutions and extracts fresh exact solutions. Assuming the right values for the parameters, various graphs of different shapes are sketched to provide information about the visual format of the earned results. This paper’s findings support the efficacy of the approach taken in enhancing nonlinear dynamical behavior. We believe this research will be of interest to a wide variety of engineers who work with engineering models. The findings show the effectiveness, simplicity and generalizability of the chosen computational approach, even when applied to complicated systems in a variety of fields, especially in ocean engineering.
Keywords: (2+1)-dimensional Boussinesq equation, solitary wave solutions, Riccati equation mapping approach, nonlinear phenomena
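One commonly studied normalized form of the governing model (the paper's exact scaling of coefficients may differ) is

```latex
u_{tt} - u_{xx} - u_{yy} - \left(u^{2}\right)_{xx} - u_{xxxx} = 0,
```

which reduces to the classical (1+1)-dimensional Boussinesq equation when the dependence on the second spatial variable y is dropped.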
Procedia PDF Downloads 103
12323 Degraded Document Analysis and Extraction of Original Text Document: An Approach without Optical Character Recognition
Authors: L. Hamsaveni, Navya Prakash, Suresha
Abstract:
Document image analysis recognizes text and graphics in documents acquired as images. An approach without Optical Character Recognition (OCR) for degraded document image analysis has been adopted in this paper. The technique involves document imaging methods such as image fusing and Speeded Up Robust Features (SURF) detection to identify and extract the degraded regions from a set of document images, so as to obtain an original document with complete information. In case the captured degraded document image is skewed, it has to be straightened (deskewed) before further processing. A special image storage format known as YCbCr is used as a tool in converting the grayscale image to RGB format. The presented algorithm is tested on various types of degraded documents, such as printed documents, handwritten documents, old script documents and handwritten image sketches in documents. The purpose of this research is to obtain an original document for a given set of degraded documents from the same source.
Keywords: grayscale image format, image fusing, RGB image format, SURF detection, YCbCr image format
Procedia PDF Downloads 377
12322 Automatic Classification of Periodic Heart Sounds Using Convolutional Neural Network
Authors: Jia Xin Low, Keng Wah Choo
Abstract:
This paper presents an automatic normal and abnormal heart sound classification model developed based on a deep learning algorithm. The MITHSDB heart sound datasets obtained from the 2016 PhysioNet/Computing in Cardiology Challenge database were used in this research, with the assumption that the electrocardiograms (ECG) were recorded simultaneously with the heart sounds (phonocardiogram, PCG). The PCG time series are segmented per heartbeat, and each sub-segment is converted to form a square intensity matrix and classified using convolutional neural network (CNN) models. This approach removes the need to provide classification features for the supervised machine learning algorithm; instead, the features are determined automatically through training, from the time series provided. The result proves that the prediction model is able to provide reasonable and comparable classification accuracy despite its simple implementation. This approach can be used for real-time classification of heart sounds in the Internet of Medical Things (IoMT), e.g. remote monitoring applications of the PCG signal.
Keywords: convolutional neural network, discrete wavelet transform, deep learning, heart sound classification
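A minimal Keras sketch of the pipeline described (matrix side, layer sizes and the resampling scheme are illustrative assumptions, not the authors' configuration): each per-beat PCG segment is reshaped into a square intensity matrix and fed to a small CNN:

```python
import numpy as np
from tensorflow import keras

def beat_to_matrix(segment, side=64):
    # Resample one per-beat PCG segment to side*side samples, reshape to a square
    idx = np.linspace(0, len(segment) - 1, side * side).astype(int)
    return segment[idx].reshape(side, side, 1)

model = keras.Sequential([
    keras.layers.Conv2D(16, 3, activation="relu", input_shape=(64, 64, 1)),
    keras.layers.MaxPooling2D(),
    keras.layers.Conv2D(32, 3, activation="relu"),
    keras.layers.MaxPooling2D(),
    keras.layers.Flatten(),
    keras.layers.Dense(2, activation="softmax"),  # normal vs abnormal
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

beat = np.random.randn(4000)           # stand-in for one segmented heartbeat
x = beat_to_matrix(beat)[None, ...]    # batch of one square intensity matrix
print(model.predict(x, verbose=0))
```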
Procedia PDF Downloads 351
12321 An Analysis on Clustering Based Gene Selection and Classification for Gene Expression Data
Authors: K. Sathishkumar, V. Thiagarasu
Abstract:
Due to recent advances in DNA microarray technology, it is now feasible to obtain gene expression profiles of tissue samples at relatively low cost. Many scientists around the world use the advantage of this gene profiling to characterize complex biological circumstances and diseases. Microarray techniques used in genome-wide gene expression and genome mutation analysis help scientists and physicians to understand pathophysiological mechanisms, to make diagnoses and prognoses, and to choose treatment plans. DNA microarray technology has now made it possible to simultaneously monitor the expression levels of thousands of genes during important biological processes and across collections of related samples. Elucidating the patterns hidden in gene expression data offers a tremendous opportunity for an enhanced understanding of functional genomics. However, the large number of genes and the complexity of biological networks greatly increase the challenges of comprehending and interpreting the resulting mass of data, which often consists of millions of measurements. A first step toward addressing this challenge is the use of clustering techniques, which is essential in the data mining process to reveal natural structures and identify interesting patterns in the underlying data. This work presents an analysis of several clustering algorithms proposed to deal with gene expression data effectively. The existing algorithms, such as the Support Vector Machine (SVM), the k-means algorithm and evolutionary algorithms, are analyzed thoroughly to identify their advantages and limitations, and a performance evaluation of the existing algorithms is carried out to determine the best approach. In order to improve the classification performance of the best approach in terms of accuracy, convergence behavior and processing time, a hybrid clustering-based optimization approach is proposed.
Keywords: microarray technology, gene expression data, clustering, gene selection
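As a hedged illustration of clustering-based gene selection feeding a classifier (synthetic data and a simple representative-gene rule, not the exact algorithms evaluated in the paper):

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.normal(size=(40, 500))          # 40 samples x 500 genes (synthetic)
y = rng.integers(0, 2, size=40)         # two tissue classes (synthetic)

# Cluster the genes, then keep one representative gene per cluster
km = KMeans(n_clusters=20, n_init=10, random_state=0).fit(X.T)
reps = [np.where(km.labels_ == c)[0][0] for c in range(20)]

clf = SVC().fit(X[:, reps], y)          # classify samples on the selected genes
print(clf.score(X[:, reps], y))
```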
Procedia PDF Downloads 325
12320 A General Iterative Nonlinear Programming Method to Synthesize Heat Exchanger Network
Authors: Rupu Yang, Cong Toan Tran, Assaad Zoughaib
Abstract:
This work provides an iterative nonlinear programming method to synthesize a heat exchanger network by manipulating the trade-offs between the heat load of process heat exchangers (HEs) and utilities. We consider two cases for the synthesis problem: the first without fixed costs for HEs, and the second with fixed costs. For the no-fixed-cost problem, the nonlinear programming (NLP) model with all the potential HEs is optimized to obtain the global optimum. For the case with fixed costs, the NLP model is iterated by adding/removing HEs. The method was applied in five case studies and illustrated its effectiveness quite well. Among these, the approach reaches the lowest total annualized cost (TAC, 2,904,026 $/year) compared with the best record for the well-known aromatics plant problem. It also locates a slightly better design than the records in the literature for a 10-stream case without fixed costs, with only 1/9 of the computational time. Moreover, compared to the traditional mixed-integer nonlinear programming approach, the iterative NLP method opens the possibility of considering constraints (such as controllability or dynamic performance) whose calculation requires knowing the structure of the network.
Keywords: heat exchanger network, synthesis, NLP, optimization
Procedia PDF Downloads 167
12319 Developing a Leukemia Diagnostic System Based on Hybrid Deep Learning Architectures in Actual Clinical Environments
Authors: Skyler Kim
Abstract:
An early diagnosis of leukemia has always been a challenge to doctors and hematologists. On a worldwide basis, it was reported that there were approximately 350,000 new cases in 2012, and diagnosing leukemia was time-consuming and inefficient because of an endemic shortage of flow cytometry equipment in current clinical practice. As the number of medical diagnosis tools increased and a large volume of high-quality data was produced, there was an urgent need for more advanced data analysis methods. One of these methods was the AI approach. This approach has become a major trend in recent years, and several research groups have been working on developing these diagnostic models. However, designing and implementing a leukemia diagnostic system in real clinical environments based on a deep learning approach with larger sets remains complex. Leukemia is a major hematological malignancy that results in mortality and morbidity throughout different ages. We decided to select acute lymphocytic leukemia to develop our diagnostic system, since it is the most common type of leukemia, accounting for 74% of all children diagnosed with leukemia. The results from this development work can be applied to all other types of leukemia. To develop our model, the Kaggle dataset was used, which consists of 15135 total images, 8491 of which are images of abnormal cells and 5398 of which are normal. In this paper, we design and implement a leukemia diagnostic system in a real clinical environment based on deep learning approaches with larger sets. The proposed diagnostic system has the function of detecting and classifying leukemia. Different from other AI approaches, we explore hybrid architectures to improve on current performance. First, we developed two independent convolutional neural network models: VGG19 and ResNet50. Then, using both VGG19 and ResNet50, we developed a hybrid deep learning architecture employing transfer learning techniques to extract features from each input image. In our approach, features fused from specific abstraction layers can be deemed auxiliary features and lead to a further improvement in classification accuracy. Features extracted from the lower levels are combined into higher-dimension feature maps to help improve the discriminative capability of intermediate features and also overcome the problem of network gradients vanishing or exploding. By comparing VGG19 and ResNet50 with the proposed hybrid model, we concluded that the hybrid model had a significant advantage in accuracy. The detailed results of each model’s performance and their pros and cons will be presented at the conference.
Keywords: acute lymphoblastic leukemia, hybrid model, leukemia diagnostic system, machine learning
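A compact Keras sketch of the feature-fusion idea (fusion point, pooling and classification head are assumptions; the paper's exact fusion layers are not reproduced here):

```python
from tensorflow import keras
from tensorflow.keras import layers

# Note: each backbone normally has its own preprocess_input; omitted for brevity.
inp = keras.Input(shape=(224, 224, 3))
vgg = keras.applications.VGG19(include_top=False, weights="imagenet")(inp)
res = keras.applications.ResNet50(include_top=False, weights="imagenet")(inp)

# Pool each backbone's feature maps and fuse them into one feature vector
fused = layers.Concatenate()([layers.GlobalAveragePooling2D()(vgg),
                              layers.GlobalAveragePooling2D()(res)])
out = layers.Dense(2, activation="softmax")(fused)  # normal vs abnormal cell

model = keras.Model(inp, out)
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
```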
Procedia PDF Downloads 188
12318 Sentiment Analysis: An Enhancement of Ontological-Based Features Extraction Techniques and Word Equations
Authors: Mohd Ridzwan Yaakub, Muhammad Iqbal Abu Latiffi
Abstract:
Online business has become popular recently due to the massive amount of information and media available on the Internet. This has resulted in a huge number of reviews in which consumers share their opinions, criticisms, and satisfaction regarding the products they have purchased on websites or on social media such as Facebook and Twitter. Analyzing customer behavior has therefore become very important for organizations seeking new market trends and insights. The reviews from websites or social media come as structured and unstructured data that need a sentiment analysis approach for analyzing customer reviews. In this article, the techniques used in sentiment analysis are defined, along with the ontology and a description of its possible usage in sentiment analysis. This leads to empirical research related to the mobile phones used in the research and the ontology used in the experiment. The researchers also explore the role of data preprocessing and feature selection methodology. As the result shows, an ontology-based approach to sentiment analysis can help achieve high accuracy in the classification task.
Keywords: feature selection, ontology, opinion, preprocessing data, sentiment analysis
Procedia PDF Downloads 201
12317 Global Pandemic of Chronic Diseases: Public Health Challenges to Reduce the Development
Authors: Benjamin Poku
Abstract:
Purpose: The purpose of this research is to conduct a systematic review and synthesis of existing knowledge addressing the growing incidence and prevalence of chronic diseases across the world and their impact on public health in relation to communicable diseases. Principal results: A careful compilation and summary of 15-20 peer-reviewed publications from reputable databases such as PubMed, MEDLINE, CINAHL, and other peer-reviewed journals indicates that the global pandemic of chronic diseases (such as diabetes, high blood pressure, etc.) has become a proportionally greater public health burden than communicable diseases. Significant conclusions: Given the complexity of the situation, efforts and strategies to mitigate the negative effects of the global pandemic of chronic diseases within the global community must include not only the urgent and binding commitment of all stakeholders but also a multi-sectoral, long-term approach that strengthens public health education to serve an increasing world population of over 8 billion people, as well as an aging population, in meeting the complex challenges of chronic diseases.
Keywords: pandemic, chronic disease, public health, health challenges
Procedia PDF Downloads 529
12316 Forest Fire Burnt Area Assessment in a Part of West Himalayan Region Using Differenced Normalized Burnt Ratio and Neural Network Approach
Authors: Sunil Chandra, Himanshu Rawat, Vikas Gusain, Triparna Barman
Abstract:
Forest fires are a recurrent phenomenon in the Himalayan region owing to the presence of vulnerable forest types, topographical gradients, climatic conditions, and anthropogenic pressure. The present study focuses on the identification of forest fire-affected areas in a small part of the West Himalayan region using the differenced normalized burnt ratio method and spectral unmixing methods. The study area has rugged terrain with the presence of sub-tropical pine forest, montane temperate forest, and sub-alpine forest and scrub. The major cause of fires in this region is anthropogenic: human-induced fires are set to obtain fresh leaves, to scare wild animals away from agricultural crops, for grazing within reserved forests, and for cooking and other purposes. Fires from these causes affect a large area on the ground, necessitating precise estimation for further management and policy making. In the present study, two approaches have been used for the burnt area analysis. The first approach uses a differenced normalized burnt ratio (dNBR) index computed from burnt ratio values generated using the Short-Wave Infrared (SWIR) and Near Infrared (NIR) bands of Sentinel-2 imagery. The results of the dNBR have been compared with the outputs of the spectral unmixing methods. It has been found that the dNBR produces good results in fire-affected areas with a homogeneous forest stratum and slopes of less than 5 degrees. However, in rugged terrain where the landscape is strongly influenced by topographical variations, vegetation types, and tree density, the results may be largely influenced by the effects of topography, complexity in tree composition, fuel load composition, and soil moisture. Hence, burnt area assessment under such variations may not be effectively carried out using the dNBR approach commonly followed for assessment over large areas. The second approach attempted in the present study therefore utilizes a spectral unmixing method in which each individual pixel is tested before an information class is assigned to it. The method uses a neural network approach utilizing Sentinel-2 bands; the training and testing data are generated from the Sentinel-2 data and the national field inventory, which are further used for generating outputs using machine learning tools. The analysis of the results indicates that the fire-affected regions and their severity can be better estimated using spectral unmixing methods, which have the capability to resolve the noise in the data and can classify each pixel into the precise burnt/unburnt class.
Keywords: categorical data, log linear modeling, neural network, shifting cultivation
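The dNBR computation used in the first approach is compact; a NumPy sketch (mapping Sentinel-2 band B8 to NIR and B12 to SWIR is a common convention and an assumption here, since the abstract names only the band types):

```python
import numpy as np

def nbr(nir, swir):
    # Normalized Burn Ratio: (NIR - SWIR) / (NIR + SWIR)
    return (nir - swir) / (nir + swir + 1e-10)

def dnbr(nir_pre, swir_pre, nir_post, swir_post):
    # Differenced NBR: pre-fire minus post-fire; larger values = more severe burn
    return nbr(nir_pre, swir_pre) - nbr(nir_post, swir_post)

# Toy 2x2 reflectance arrays standing in for Sentinel-2 B8 (NIR) and B12 (SWIR)
nir_pre = np.array([[0.4, 0.4], [0.4, 0.4]]); swir_pre = np.array([[0.1, 0.1], [0.1, 0.1]])
nir_post = np.array([[0.2, 0.4], [0.2, 0.4]]); swir_post = np.array([[0.3, 0.1], [0.3, 0.1]])
print(dnbr(nir_pre, swir_pre, nir_post, swir_post))  # burnt pixels show large dNBR
```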
Procedia PDF Downloads 57
12315 A Low-Cost Experimental Approach for Teaching Energy Quantization: Determining the Planck Constant with Arduino and Led
Authors: Gastão Soares Ximenes de Oliveira, Richar Nicolás Durán, Romeo Micah Szmoski, Eloiza Aparecida Avila de Matos, Elano Gustavo Rein
Abstract:
This article presents an experimental method to determine Planck's constant by calculating the cut-off potential V₀ of LEDs with different wavelengths. The experiment is designed using Arduino as the central tool in order to make the experimental activity more engaging and attractive for students through the use of digital technologies. From the characteristic curve of each LED, graphical analysis was used to obtain the cut-off potential, and knowing the corresponding wavelength, it was possible to calculate Planck's constant. The constant was also obtained from a linear fit of the cut-off potential against the frequency of each LED. Given the relevance of Planck's constant in physics, it is believed that this experiment can offer teachers the opportunity to approach concepts from modern physics, such as the quantization of energy, in a more accessible and applied way in the classroom. This will not only enrich students' understanding of the fundamental nature of matter but also encourage deeper engagement with the principles of quantum physics.
Keywords: physics teaching, educational technology, modern physics, Planck constant, Arduino
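The underlying relation is eV₀ ≈ hf = hc/λ, so a linear fit of V₀ against f yields h/e as its slope; a sketch with illustrative (not measured) values:

```python
import numpy as np

e = 1.602e-19   # elementary charge [C]
c = 3.0e8       # speed of light [m/s]

# Hypothetical LED data: wavelength [m] and measured cut-off potential [V]
lam = np.array([470, 525, 590, 630]) * 1e-9
v0 = np.array([2.64, 2.36, 2.10, 1.97])

f = c / lam                               # frequency of each LED
slope, intercept = np.polyfit(f, v0, 1)   # V0 = (h/e) * f + const
print("h ≈", slope * e, "J·s")            # expect ~6.6e-34 J·s
```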
Procedia PDF Downloads 79
12314 Registration of Multi-Temporal Unmanned Aerial Vehicle Images for Facility Monitoring
Authors: Dongyeob Han, Jungwon Huh, Quang Huy Tran, Choonghyun Kang
Abstract:
Unmanned Aerial Vehicles (UAVs) have been used for surveillance, monitoring, inspection, and mapping. In this paper, we present a systematic approach for the automatic registration of UAV images for monitoring facilities such as buildings, greenhouses, and civil structures. A two-step process is applied: 1) an image matching technique based on SURF (Speeded Up Robust Features) and RANSAC (Random Sample Consensus); 2) bundle adjustment of the multi-temporal images. Image matching to find corresponding points is one of the most important steps for the precise registration of multi-temporal images. We used the SURF algorithm to find matching points quickly and effectively. The RANSAC algorithm was used both in the process of finding matching points between images and in the bundle adjustment process. Experimental results from UAV images showed that our approach has good accuracy for application to facility change detection.
Keywords: building, image matching, temperature, unmanned aerial vehicle
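A condensed OpenCV sketch of step 1 (requires opencv-contrib-python built with the non-free SURF module enabled; file names and thresholds are illustrative):

```python
import cv2
import numpy as np

img1 = cv2.imread("epoch1.jpg", cv2.IMREAD_GRAYSCALE)  # hypothetical inputs
img2 = cv2.imread("epoch2.jpg", cv2.IMREAD_GRAYSCALE)

surf = cv2.xfeatures2d.SURF_create(hessianThreshold=400)  # needs non-free build
k1, d1 = surf.detectAndCompute(img1, None)
k2, d2 = surf.detectAndCompute(img2, None)

# Ratio-test matching, then RANSAC to reject outlier correspondences
matches = cv2.BFMatcher().knnMatch(d1, d2, k=2)
good = [m for m, n in matches if m.distance < 0.7 * n.distance]
src = np.float32([k1[m.queryIdx].pt for m in good]).reshape(-1, 1, 2)
dst = np.float32([k2[m.trainIdx].pt for m in good]).reshape(-1, 1, 2)
H, mask = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
print(H)  # transform registering the later epoch onto the earlier one
```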
Procedia PDF Downloads 294