Search results for: 3D numerical approach
12990 Applying Epistemology to Artificial Intelligence in the Social Arena: Exploring Fundamental Considerations
Authors: Gianni Jacucci
Abstract:
Epistemology traditionally finds its place within human research philosophies and methodologies. Artificial intelligence methods pose challenges, particularly given the unresolved relationship between AI and pivotal concepts in social arenas such as hermeneutics and accountability. We begin by examining the essential criteria governing scientific rigor in the human sciences. We revisit the three foundational philosophies underpinning qualitative research methods: empiricism, hermeneutics, and phenomenology. We elucidate the distinct attributes, merits, and vulnerabilities inherent in the methodologies they inspire. The integration of AI, e.g., deep learning algorithms, sparks an interest in evaluating these criteria against the diverse forms of AI architectures. For instance, Interpreted AI could be viewed as a hermeneutic approach, relying on a priori interpretations, while straight AI may be perceived as a descriptive phenomenological approach, processing original and uncontaminated data. This paper serves as groundwork for such explorations, offering preliminary reflections to lay the foundation and outline the initial landscape.
Keywords: artificial intelligence, deep learning, epistemology, qualitative research, methodology, hermeneutics, accountability
Procedia PDF Downloads 401
12989 Fracture Mechanics Modeling of Shear-Cracked RC Beams Shear-Strengthened with FRP Sheets
Authors: Shahriar Shahbazpanahi, Alaleh Kamgar
Abstract:
So far, conventional experimental and theoretical analyses in fracture mechanics have been applied to study concrete flexural-cracked beams strengthened with fiber reinforced polymer (FRP) composite sheets. However, there is still little knowledge about the shear capacity of side-face FRP-strengthened shear-cracked beams. A numerical analysis is herein presented to model the fracture mechanics of a four-point RC beam, with two inclined initial notches at the supports, which is strengthened with side-face FRP sheets. In the present study, the shear crack is forced to initiate at the supports by means of these initial notches. The ABAQUS software is used to model crack propagation by conventional cohesive elements. It is observed that the FRP sheets play an important role in preventing the propagation of shear cracks.
Keywords: crack, FRP, shear, strengthening
Procedia PDF Downloads 550
12988 Cloud Points to Create an Innovative and Custom Ankle Foot Orthosis in CAD Environment
Authors: Y. Benabid, K. Benfriha, V. Rieuf, J. F. Omhover
Abstract:
This paper describes an approach to creating custom concepts for innovative products, describing the relations between innovation tools and the Computer Aided Design (CAD) environment (the use of creativity sessions and design tools). A model of the design process is proposed and explored in order to describe the power of the tools used to create and improve an innovative product, all based upon a range of data (cloud points) in this study. A comparison between the traditional method and the innovative method helps to generate and put forward a new model of the design process for creating a custom Ankle Foot Orthosis (AFO) in a CAD environment, in order to improve and control motion. The custom concept requires substantial development in different environments, and the relations between these environments are described. The results can help surgeons in the upstream treatment phases. CAD models can be applied and accepted by professionals in design and manufacturing systems. This development is based on the anatomy of the population of North Africa.
Keywords: ankle foot orthosis, CAD, reverse engineering, sketch
Procedia PDF Downloads 455
12987 Human Rights and Counter-Terrorism in Nigeria: A Systematic Review
Authors: Tarela J. Ike
Abstract:
Over the years, the violent acts of Boko Haram have led to the adoption of counter-terrorism measures, which mostly take the form of repressive military measures. These measures have brought about flagrant violations of human rights worthy of concern. Hence the need to examine the efficacy of the counter-terrorism measures adopted by the Nigerian government in combating terrorism. This article addresses this issue through a systematic literature review which examines the impact of Nigeria's counter-terrorism measures from 2009 to 2016. The review covered 42 articles, of which 14 met the peer-reviewed requirement. It finds that most of Nigeria's counter-terrorism policies are geared toward a repressive state military approach which violates human rights. Thus, the study concludes that, to effectively address the terrorist uprising, Nigeria should adopt a non-aggressive counter-terrorism approach which incorporates religious clerics and a strategy of active community engagement, as opposed to military retaliation, which violates human rights and has so far proved ineffective.
Keywords: Boko Haram, counter-terrorism, human rights, military retaliation
Procedia PDF Downloads 413
12986 Roadmaps as a Tool of Innovation Management: System View
Authors: Matich Lyubov
Abstract:
Today, roadmaps are becoming commonly used tools for detecting and designing a desired future for companies, states, and the international community. The growing popularity of this method puts on the agenda tasks such as identifying basic roadmapping principles, creating concepts, and determining the characteristics of roadmap use depending on the objectives, as well as the restrictions and opportunities specific to the study area. However, the system approach, i.e., the elements recognized as major for high-quality roadmapping, remains one of the main fields for improving the methodology and practice of roadmap development, as only limited research has been devoted to detailed analysis of roadmaps from the viewpoint of the system approach. This article is therefore an attempt to examine roadmaps from the viewpoint of systems analysis and to compare the areas where roadmaps and systems analysis are, as a rule, considered the most effective tools. Of special importance are the comparison of the structure and composition of roadmaps and system models, the identification of common points between the construction stages of roadmaps and system modeling, and the determination of future directions for researching roadmaps from a systems perspective.
Keywords: technology roadmap, roadmapping, systems analysis, system modeling, innovation management
Procedia PDF Downloads 311
12985 Leading Edge Vortex Development for a 65° Delta Wing with Varying Thickness and Maximum Thickness Locations
Authors: Jana Stucke, Sean Tuling, Chris Toomer
Abstract:
This study focuses on the numerical investigation of leading edge vortex (LEV) development over a 65° swept delta wing with varying thickness and maximum thickness location, and their impact on its overall performance. The tested configurations are defined by 6% and 12% thick biconvex aerofoils with the maximum thickness located at 30% and 50% of the root chord. The results are compared to a flat plate delta wing configuration of 3.4% thickness. The largest differences are observed for the aerofoils of 12% thickness, which are used to demonstrate the trends and aerodynamic characteristics from here on. It was found that the vortex structure changes with both maximum thickness location and overall thickness. This change leads to a reduction not only in lift but also in drag, especially when the maximum thickness is moved forward. The reduction in drag, however, outweighs the loss in lift, thus increasing the overall performance of the configuration.
Keywords: aerodynamics, CFD, delta wing, leading edge vortices
Procedia PDF Downloads 230
12984 Service Interactions Coordination Using a Declarative Approach: Focuses on Deontic Rule from Semantics of Business Vocabulary and Rules Models
Authors: Nurulhuda A. Manaf, Nor Najihah Zainal Abidin, Nur Amalina Jamaludin
Abstract:
Coordinating service interactions is a vital part of developing distributed applications that are built up as networks of autonomous participants, e.g., software components, web services, and online resources, and involve collaboration between a diverse number of participant services on different providers. The complexity of coordinating service interactions reflects how important the right techniques and approaches are for designing and coordinating the interaction between participant services, in order to ensure that the overall goal of a collaboration is achieved. The objective of this research is to develop the capability of steering a complex service interaction towards a desired outcome. Therefore, an efficient technique for modelling, generating, and verifying the coordination of service interactions is developed. The developed model describes service interactions using the service choreographies approach, focusing on a declarative approach and advocating an Object Management Group (OMG) standard, the Semantics of Business Vocabulary and Rules (SBVR). This model, namely the SBVR model for service choreographies, focuses on declarative deontic rules expressing both obligation and prohibition, which can be particularly useful in coordinating service interactions. The generated SBVR model is then formulated and transformed into an Alloy model, using the Alloy Analyzer to verify it. The transformation of SBVR into Alloy makes it possible to automatically generate the corresponding coordination of service interactions (service choreography), hence producing an immediate instance of execution that satisfies the constraints of the specification, and to verify whether a specific request can be realised in the generated choreography.
Keywords: service choreography, service coordination, behavioural modelling, complex interactions, declarative specification, verification, model transformation, semantics of business vocabulary and rules, SBVR
Procedia PDF Downloads 155
12983 Static Response of Homogeneous Clay Stratum to Imposed Structural Loads
Authors: Aaron Aboshio
Abstract:
A numerical study of the static response of a homogeneous clay stratum, considering a wide range of cohesion values and subject to foundation loads, is presented. The linear elastic-perfectly plastic constitutive relation with the von Mises yield criterion was utilised to develop a numerically cost-effective finite element model for the soil, while imposing a rigid body constraint on the foundation footing. From the analyses carried out, estimates of the bearing capacity factor Nc, as well as the ultimate load-carrying capacities of these soils, the effect of cohesion on foundation settlements, stress fields, and failure propagation were obtained. These are consistent with other findings in the literature and hence can be a useful guide in the design of safe foundations in clay soils for buildings and other structures.
Keywords: bearing capacity factors, finite element method, safe bearing pressure, structure-soil interaction
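As a sanity check for numerical estimates of Nc like those above, the classical closed-form benchmark for undrained clay (φ = 0) is Prandtl's plasticity solution, Nc = 2 + π ≈ 5.14, with q_ult = c · Nc. The sketch below is illustrative only: the cohesion values are hypothetical, and the formula neglects soil weight and surcharge.

```python
import math

def prandtl_nc() -> float:
    """Classical bearing capacity factor Nc for a strip footing on
    undrained (phi = 0) clay, from Prandtl's plasticity solution."""
    return 2.0 + math.pi

def ultimate_bearing_pressure(cohesion_kpa: float) -> float:
    """Ultimate bearing pressure q_ult = c * Nc for weightless,
    surcharge-free undrained clay (phi = 0), in kPa."""
    return cohesion_kpa * prandtl_nc()

# Hypothetical cohesion values (kPa) spanning soft to stiff clay.
for c in (25.0, 50.0, 100.0):
    print(f"c = {c:6.1f} kPa -> q_ult = {ultimate_bearing_pressure(c):7.1f} kPa")
```

A finite element estimate of Nc for a rough rigid footing would typically be compared against this 5.14 benchmark.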
Procedia PDF Downloads 302
12982 Component Based Testing Using Clustering and Support Vector Machine
Authors: Iqbaldeep Kaur, Amarjeet Kaur
Abstract:
Software reusability is an important part of software development, so component-based software development in the context of software testing has gained a lot of practical importance in the field of software engineering, from both academic researchers and the software development industry. Finding test cases for efficient reuse is one of the important problems addressed by researchers. Clustering reduces the search space and enables the reuse of test cases by grouping similar entities according to requirements, reducing time complexity because it reduces the search time for retrieving test cases. In this research paper, we propose an approach for the reusability of test cases using unsupervised learning, employing k-means clustering and the Support Vector Machine (SVM). We have designed an algorithm for clustering requirement and test case documents according to their tf-idf vector space, and the output is a set of highly cohesive pattern groups.
Keywords: software testing, reusability, clustering, k-means, SVM
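As a rough illustration of the tf-idf grouping step described above (a sketch, not the authors' implementation), the following pure-Python code builds tf-idf vectors for tokenised requirement documents and matches a new test-case description to its nearest requirement by cosine similarity. The document contents are invented for the example.

```python
import math
from collections import Counter

def tfidf(docs):
    """tf-idf weight vectors (term -> weight) for tokenised documents.
    Uses the smoothed convention idf = log(n / df) + 1."""
    n = len(docs)
    df = Counter(t for doc in docs for t in set(doc))
    vectors = []
    for doc in docs:
        tf = Counter(doc)
        vectors.append({t: (c / len(doc)) * (math.log(n / df[t]) + 1.0)
                        for t, c in tf.items()})
    return vectors

def cosine(u, v):
    """Cosine similarity between two sparse vectors stored as dicts."""
    dot = sum(w * v.get(t, 0.0) for t, w in u.items())
    nu = math.sqrt(sum(w * w for w in u.values()))
    nv = math.sqrt(sum(w * w for w in v.values()))
    return dot / (nu * nv) if nu and nv else 0.0

def best_match(query, corpus):
    """Index of the corpus document most similar to `query` in tf-idf space."""
    vectors = tfidf(corpus + [query])
    query_vec = vectors[-1]
    sims = [cosine(query_vec, dv) for dv in vectors[:-1]]
    return max(range(len(sims)), key=sims.__getitem__)
```

For example, a query tokenised as ["user", "login", "screen"] would match a requirement document about login rather than one about report export.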
Procedia PDF Downloads 430
12981 Surface Thermodynamics Approach to Mycobacterium tuberculosis (M-TB) – Human Sputum Interactions
Authors: J. L. Chukwuneke, C. H. Achebe, S. N. Omenyi
Abstract:
This research work presents the surface thermodynamics approach to M-TB/HIV-human sputum interactions. This involved the use of the Hamaker coefficient concept as a surface energetics tool in determining the interaction processes, with the surface interfacial energies explained using the van der Waals concept of particle interactions. The Lifshitz derivation for van der Waals forces was applied as an alternative to the contact angle approach, which has been widely used in other biological systems. The methodology involved taking sputum samples from twenty infected persons and from twenty uninfected persons for absorbance measurement using a digital ultraviolet-visible spectrophotometer. The variables required for the computations with the Lifshitz formula were derived from the absorbance data. MATLAB software tools were used in the mathematical analysis of the data produced from the experiments (absorbance values). The Hamaker constants and the combined Hamaker coefficients were obtained using the values of the dielectric constant together with the Lifshitz equation. The absolute combined Hamaker coefficients A132abs and A131abs on both infected and uninfected sputum samples gave the values A132abs = 0.21631 × 10^-21 J for M-TB infected sputum and Ã132abs = 0.18825 × 10^-21 J for M-TB/HIV infected sputum. The significance of this result is the positive value of the absolute combined Hamaker coefficient, which suggests the existence of net positive van der Waals forces, demonstrating an attraction between the bacteria and the macrophage. This, however, implies that infection can occur. It was also shown that in the presence of HIV, the interaction energy is reduced by 13%, confirming the adverse effects observed in HIV patients suffering from tuberculosis.
Keywords: absorbance, dielectric constant, Hamaker coefficient, Lifshitz formula, macrophage, Mycobacterium tuberculosis, van der Waals forces
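For readers unfamiliar with combined Hamaker coefficients, the standard combining relations (an approximation, not the full Lifshitz computation the authors performed) express A132, for bodies 1 and 2 interacting across medium 3, in terms of the individual coefficients. The numeric inputs below are illustrative, not the paper's measured values.

```python
import math

def combined_hamaker(a11, a22, a33):
    """Approximate combined Hamaker coefficient A132 for bodies 1 and 2
    interacting across medium 3, via the standard combining relation
    A132 ~ (sqrt(A11) - sqrt(A33)) * (sqrt(A22) - sqrt(A33)).
    A positive result indicates net attractive van der Waals forces."""
    return (math.sqrt(a11) - math.sqrt(a33)) * (math.sqrt(a22) - math.sqrt(a33))

def symmetric_hamaker(a11, a33):
    """A131 for two identical bodies 1 across medium 3; always >= 0."""
    return (math.sqrt(a11) - math.sqrt(a33)) ** 2
```

The sign behaviour matches the abstract's reasoning: a positive combined coefficient corresponds to attraction between the two bodies (here, bacterium and macrophage) across the medium.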
Procedia PDF Downloads 276
12980 Building Transparent Supply Chains through Digital Tracing
Authors: Penina Orenstein
Abstract:
In today's world, particularly with COVID-19 a constant worldwide threat, organizations need greater visibility over their supply chains more than ever before, in order to find areas for improvement and greater efficiency, reduce the chances of disruption, and stay competitive. The concept of supply chain mapping is one where every process and route is mapped in detail between each vendor and supplier. The simplest method of mapping involves sourcing publicly available data, including news and financial information concerning relationships between suppliers. An additional layer of information would be disclosed by large, direct suppliers about their production and logistics sites. While this method has the advantage of not requiring any input from suppliers, it also doesn't allow for much transparency beyond the first supplier tier and may generate irrelevant data (noise) that must be filtered out to find the actionable data. The primary goal of this research is to build data maps of supply chains by focusing on a layered approach. Using these maps, the secondary goal is to address the question of whether the supply chain can be re-engineered to make improvements, for example, to lower the carbon footprint. Using a drill-down approach, the end result is a comprehensive map detailing the linkages between tier-one, tier-two, and tier-three suppliers, superimposed on a geographical map. The driving force behind this idea is to be able to trace individual parts to the exact site where they are manufactured. In this way, companies can ensure sustainability practices from the production of raw materials through to the finished goods. The approach allows companies to identify and anticipate vulnerabilities in their supply chain. It unlocks predictive analytics capabilities and enables them to act proactively. The research is particularly compelling because it unites network science theory with empirical data and presents the results in a visual, intuitive manner.
Keywords: data mining, supply chain, empirical research, data mapping
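The tiered drill-down described above can be sketched as a breadth-first traversal over a supplier graph. The company names and relationships below are invented for illustration.

```python
def supplier_tiers(graph, focal, max_tier=3):
    """Breadth-first 'drill-down' over a supplier graph.
    graph: dict mapping a company to the companies it buys from.
    Returns {tier_number: set_of_suppliers} up to max_tier,
    assigning each supplier to the first (lowest) tier it appears in."""
    tiers, seen, frontier = {}, {focal}, [focal]
    for tier in range(1, max_tier + 1):
        next_tier = set()
        for company in frontier:
            next_tier.update(s for s in graph.get(company, ()) if s not in seen)
        if not next_tier:
            break
        tiers[tier] = next_tier
        seen |= next_tier
        frontier = list(next_tier)
    return tiers
```

Given a focal manufacturer, the result groups direct suppliers as tier one, their suppliers as tier two, and so on, which is exactly the layered structure that would be superimposed on a geographical map.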
Procedia PDF Downloads 175
12979 Improvement of Bridge Weigh-In-Motion Technique Considering the Driving Conditions of Vehicles
Authors: Changgil Lee, Jooyoung Park, Seunghee Park
Abstract:
In this study, a bridge weigh-in-motion (BWIM) system was simulated under various driving conditions in order to improve its performance. Two driving conditions were considered. One was the number of axles of the vehicles: since vehicles have different numbers of axles depending on their type, the vehicles were modeled accordingly. The other was vehicle speed, because the speed of vehicles crossing the bridge is not constant. To achieve the goal, the dynamic characteristics of the bridge, such as its modal parameters, were considered in the numerical simulation by analyzing precision models. Also, the driving vehicles were modeled as mass-spring-damping systems reflecting the axle information.
Keywords: bridge weigh-in-motion (BWIM) system, driving conditions, precision analysis model, number of axles, vehicle speed
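A minimal sketch of the per-axle vehicle model mentioned above, assuming a single-degree-of-freedom mass-spring-damper with hypothetical parameter values; the actual BWIM simulation would couple such models with the bridge's modal response.

```python
def simulate_axle(mass, stiffness, damping, x0=0.01, v0=0.0,
                  dt=1e-4, steps=20000):
    """Semi-implicit Euler simulation of one axle modelled as a
    mass-spring-damper system: m*x'' + c*x' + k*x = 0.
    Returns the displacement history (metres)."""
    x, v = x0, v0
    history = []
    for _ in range(steps):
        a = (-damping * v - stiffness * x) / mass  # acceleration
        v += a * dt                                # update velocity first
        x += v * dt                                # then displacement
        history.append(x)
    return history

# Hypothetical axle: 500 kg, 200 kN/m suspension, light damping.
xs = simulate_axle(500.0, 2.0e5, 2.0e3)
```

With these (invented) parameters the damping ratio is about 0.1, so the free vibration decays over the simulated two seconds, as expected for a suspended axle.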
Procedia PDF Downloads 469
12978 Technology Computer Aided Design Simulation of Space Charge Limited Conduction in Polycrystalline Thin Films
Authors: Kunj Parikh, S. Bhattacharya, V. Natarajan
Abstract:
TCAD numerical simulation is one of the most tried and tested powerful tools for designing devices in semiconductor foundries worldwide. It has also been used to explain conduction in organic thin films, where the processing temperature is often sufficient to make homogeneous samples (often imperfect, but homogeneously imperfect). In this report, we present the results of TCAD simulation in multi-grain thin films. The work addresses the inhomogeneity in one dimension but can easily be extended to two and three dimensions. The effect of grain boundaries has mainly been approximated as barriers located at the junction between two adjacent grains. The effects of the grain boundary barrier height, the bulk traps, and the measurement temperature have been investigated.
Keywords: polycrystalline thin films, space charge limited conduction, Technology Computer-Aided Design (TCAD) simulation, traps
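As background (not part of the authors' simulation), trap-free space charge limited conduction in a homogeneous film follows the Mott-Gurney law; grain boundary barriers and bulk traps of the kind studied here cause deviations from it. The parameter values below are hypothetical.

```python
EPS0 = 8.854e-12  # vacuum permittivity, F/m

def mott_gurney_current_density(eps_r, mobility, voltage, thickness):
    """Mott-Gurney law for trap-free space-charge-limited conduction:
    J = (9/8) * eps0 * eps_r * mu * V^2 / L^3, returned in A/m^2.
    mobility in m^2/(V s), voltage in V, thickness in m."""
    return 9.0 / 8.0 * EPS0 * eps_r * mobility * voltage**2 / thickness**3
```

The quadratic dependence of J on V (at fixed thickness) is the usual experimental signature used to identify the SCLC regime before trap and grain-boundary effects are layered on top.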
Procedia PDF Downloads 215
12977 Matric Suction Effects on Behavior of Unsaturated Soil Slope
Authors: Mohsen Mousivand, Hesam Aminpour
Abstract:
Soil slopes are usually located above the groundwater level and are therefore largely unsaturated. The unsaturated soil of a slope may expand or collapse as a result of wetting by rain or other factors, and this type of soil behavior can cause serious problems, including human and financial damage. The main factor causing this difference in behavior between the saturated and unsaturated states of a soil is matric suction, which is created at the interface of soil and water in the soil pores. Theoretical studies so far show that matric suction has an important effect on the mechanical behavior of soil, although the impact of this factor on slope stability has not been studied. This paper presents a numerical study of the effect of matric suction on slope stability. The results of the study indicate that the safety factor and the stability of the soil slope increase with increasing matric suction, and that taking matric suction into account leads to more accurate results and safety factors.
Keywords: slope, unsaturated soil, matric suction, stability
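The strengthening role of matric suction can be illustrated with Fredlund's extended Mohr-Coulomb criterion, a standard model for unsaturated shear strength. The abstract does not state which constitutive model was used, so this is background only, with invented parameter values.

```python
import math

def unsat_shear_strength(c_eff, sigma_n, u_a, u_w, phi_eff_deg, phi_b_deg):
    """Fredlund's extended Mohr-Coulomb criterion for unsaturated soil:
    tau = c' + (sigma - u_a) tan(phi') + (u_a - u_w) tan(phi_b),
    where (u_a - u_w) is the matric suction (kPa). phi_b controls how
    strongly suction contributes to shear strength."""
    return (c_eff
            + (sigma_n - u_a) * math.tan(math.radians(phi_eff_deg))
            + (u_a - u_w) * math.tan(math.radians(phi_b_deg)))
```

Because the suction term (u_a - u_w) tan(phi_b) is additive, any increase in matric suction raises the available shear strength, which is consistent with the abstract's finding that the safety factor increases with suction.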
Procedia PDF Downloads 333
12976 A Positive Neuroscience Perspective for Child Development and Special Education
Authors: Amedeo D'Angiulli, Kylie Schibli
Abstract:
Traditionally, children's brain development research has emphasized the limitative aspects of disability and impairment, electing as an explanatory model the classical clinical notions of brain lesion or functional deficit. In contrast, Positive Educational Neuroscience (PEN) is a new approach that emphasizes strengths and human flourishing related to the brain by exploring how learning practices have the potential to enhance neurocognitive flexibility through neuroplastic overcompensation. This mini-review provides an overview of PEN and shows how it links to the concept of neurocognitive flexibility. We provide examples of how this concept of neurocognitive flexibility can be applied to special education by exploring examples of neuroplasticity in the learning domain, including (1) learning to draw in congenitally totally blind children, and (2) music training in children from disadvantaged neighborhoods. PEN encourages educators to focus on children's strengths by recognizing the brain's capacity for positive change and to incorporate activities that support children's individual development.
Keywords: neurocognitive development, positive educational neuroscience, sociocultural approach, special education
Procedia PDF Downloads 241
12975 Emerging Therapeutic Approach with Dandelion Phytochemicals in Breast Cancer Treatment
Authors: Angel Champion, Sadia Kanwal, Rafat Siddiqui
Abstract:
Harnessing phytochemicals from plant sources presents a novel opportunity to prevent or treat malignant diseases, including breast cancer. Chemotherapy lacks precision in targeting cancerous cells while sparing normal cells, but a phytopharmaceutical approach may offer a solution. Dandelion, a common weed, is rich in phytochemicals and provides a safer, more cost-effective alternative with lower toxicity than traditional pharmaceuticals for conditions such as breast cancer. In this study, an in vitro experiment will be conducted using ethanol extracts of dandelion on triple-negative MDA-231 breast cancer cell lines. The polyphenolic analysis revealed that the dandelion root and leaf extracts (both cut and sifted) had the most potent antioxidant properties, with the powdered leaf extract exhibiting the strongest antioxidant activity. The extract shows promising prospective effects in inhibiting cell proliferation and inducing apoptosis in breast cancer cells, highlighting its potential for targeted therapeutic interventions. Standardizing methods for dandelion use is crucial for future clinical applications in cancer treatment. Combining plant-derived compounds with cancer nanotechnology holds potential for effective strategies in battling malignant diseases. Utilizing liposomes as carriers for phytoconstituent anti-cancer agents offers improved solubility, bioavailability, and immunoregulatory effects, advancing anticancer immune function and reducing toxicity. This integrated approach of natural products and nanotechnology has significant potential to revolutionize healthcare globally, especially in underserved communities where herbal medicine is prevalent.
Keywords: apoptosis, antioxidant activity, cancer nanotechnology, phytopharmaceutical
Procedia PDF Downloads 54
12974 Sensitivity Analysis and Solitary Wave Solutions to the (2+1)-Dimensional Boussinesq Equation in Dispersive Media
Authors: Naila Nasreen, Dianchen Lu
Abstract:
This paper explores the dynamical behavior of the (2+1)-dimensional Boussinesq equation, a nonlinear water wave equation used to model wave packets in dispersive media with weak nonlinearity. The equation describes how long waves generated in shallow water propagate under the influence of gravity. The (2+1)-dimensional Boussinesq equation combines the two-way propagation of the classical Boussinesq equation with dependence on a second spatial variable, as occurs in the two-dimensional Kadomtsev-Petviashvili equation. It provides a description of the head-on collision of oblique waves, and it possesses some interesting properties. The governing model is treated with the assistance of the Riccati equation mapping method, a reliable integration tool. Solutions have been extracted in different forms: solitary wave solutions as well as hyperbolic and periodic solutions. Moreover, a sensitivity analysis is demonstrated for the wave profiles of the designed dynamical structural system, where the soliton wave velocity and wave number parameters regulate the water wave singularity. In addition to being helpful for elucidating nonlinear partial differential equations, the method in use reproduces previously extracted solutions and extracts fresh exact solutions. Assuming suitable values for the parameters, various graphs of different shapes are sketched to provide information about the visual form of the results. This paper's findings support the efficacy of the approach taken in elucidating nonlinear dynamical behavior. We believe this research will be of interest to a wide variety of engineers who work with engineering models. The findings show the effectiveness, simplicity, and generalizability of the chosen computational approach, even when applied to complicated systems in a variety of fields, especially in ocean engineering.
Keywords: (2+1)-dimensional Boussinesq equation, solitary wave solutions, Riccati equation mapping approach, nonlinear phenomena
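For reference, a commonly studied form of the (2+1)-dimensional Boussinesq equation is the following; coefficients vary across the literature, so this is one representative normalisation rather than necessarily the exact form used in the paper:

```latex
u_{tt} - u_{xx} - u_{yy} - \alpha \left( u^{2} \right)_{xx} - \beta\, u_{xxxx} = 0,
```

where u(x, y, t) is the wave amplitude and α, β are constants; dropping the u_{yy} term recovers the classical (1+1)-dimensional Boussinesq equation with its two-way wave propagation.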
Procedia PDF Downloads 101
12973 Degraded Document Analysis and Extraction of Original Text Document: An Approach without Optical Character Recognition
Authors: L. Hamsaveni, Navya Prakash, Suresha
Abstract:
Document image analysis recognizes text and graphics in documents acquired as images. An approach without Optical Character Recognition (OCR) for degraded document image analysis has been adopted in this paper. The technique involves document imaging methods such as image fusing and Speeded Up Robust Features (SURF) detection to identify and extract the degraded regions from a set of document images, in order to obtain an original document with complete information. If the captured degraded document image is skewed, it has to be straightened (deskewed) before further processing. The YCbCr image storage format is used as a tool in the conversion between grayscale and RGB image formats. The presented algorithm is tested on various types of degraded documents, such as printed documents, handwritten documents, old script documents, and handwritten image sketches in documents. The purpose of this research is to obtain an original document for a given set of degraded documents from the same source.
Keywords: grayscale image format, image fusing, RGB image format, SURF detection, YCbCr image format
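The YCbCr conversion step can be sketched with the standard ITU-R BT.601 full-range formulas (the convention JPEG uses); the abstract does not specify which YCbCr variant was used, so this choice is an assumption.

```python
def rgb_to_ycbcr(r, g, b):
    """Convert 8-bit RGB values to YCbCr using the ITU-R BT.601
    full-range (JPEG) formulas. Y carries luminance; Cb and Cr carry
    blue- and red-difference chroma, offset by 128."""
    y  =  0.299    * r + 0.587    * g + 0.114    * b
    cb = -0.168736 * r - 0.331264 * g + 0.5      * b + 128.0
    cr =  0.5      * r - 0.418688 * g - 0.081312 * b + 128.0
    return y, cb, cr
```

A useful property for document work: any gray pixel (r = g = b) maps to Cb = Cr = 128, so the Y channel alone recovers the grayscale content.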
Procedia PDF Downloads 377
12972 Analytical Formulae for Parameters Involved in Side Slopes of Embankments Stability
Authors: Abdulrahman Abdulrahman, Abir Abdulrahman
Abstract:
The stability of the slopes of earthen embankments is usually examined by the Swedish slip circle method or the method of slices. The factor of safety against sliding in the Fellenius procedure depends upon the angle ψ subtended at the center by the arc of sliding and the radius r of the slip circle. The values of both parameters ψ and r are not precisely determined because they are measured from the drawing. In this paper, analytical formulae are derived for finding the exact values of both ψ and r. The paper also presents the different conditions of intersection of the slip circle with the body of an earthen dam and the coordinates of the intersection points. Numerical examples are chosen to demonstrate the proposed solution.
Keywords: earthen dams stability, earthen embankments stability, Fellenius method, hydraulic structures, side slopes stability, slices method, Swedish slip circle
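To see where ψ and r enter the analysis, recall the Swedish slip circle moment equilibrium for purely cohesive soil (φ = 0), a standard result consistent with the Fellenius procedure discussed above:

```latex
L_{\mathrm{arc}} = r\,\psi, \qquad
F = \frac{M_{\mathrm{resisting}}}{M_{\mathrm{driving}}}
  = \frac{c\, r^{2}\, \psi}{W\, d},
```

where c is the cohesion, W the weight of the sliding mass, and d the horizontal lever arm of W about the circle's center. Since both the resisting moment and the arc length depend directly on ψ and r, any error in scaling them off a drawing propagates straight into the factor of safety, which is the motivation for deriving them analytically.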
Procedia PDF Downloads 165
12971 On the Impracticality of Kierkegaard's Community of Authentic Individuals
Authors: Andrew Ka Pok Tam
Abstract:
Kierkegaard was long misinterpreted as an anti-social philosopher, until recent years, when there have been more discussions of his concept of community in the Journals and Papers, inspired by Karl Bayer. A community, which is based upon an individual's relations to others, is different from the crowd or the public, where the numerical majority makes decisions. As a result, authenticity is only possible in the community. But Kierkegaard did not explain how we can preserve the individual's authenticity by establishing a community, rather than a public, in reality. Kierkegaard was against the democratic reform in 1848 Denmark because he thought all elections mean that the majority wins, and the authenticity of the single individual would be suppressed. However, Kierkegaard himself did not suggest an alternative political system that might preserve the authenticity of the individual. This paper aims to evaluate the possibility of establishing a Kierkegaardian community in practice, so as to preserve every individual's authenticity. This paper argues that the practicality of a Kierkegaardian community is limited: in order to have effective communications and relations among individuals, a Kierkegaardian community must be small and inefficient, as every individual must remain authentic in every political decision for the whole community.
Keywords: authenticity, community, individual, Kierkegaard
Procedia PDF Downloads 361
12970 Automatic Classification of Periodic Heart Sounds Using Convolutional Neural Network
Authors: Jia Xin Low, Keng Wah Choo
Abstract:
This paper presents an automatic normal and abnormal heart sound classification model developed based on a deep learning algorithm. MITHSDB heart sound datasets obtained from the 2016 PhysioNet/Computing in Cardiology Challenge database were used in this research, with the assumption that the electrocardiograms (ECG) were recorded simultaneously with the heart sounds (phonocardiogram, PCG). The PCG time series are segmented per heartbeat, and each sub-segment is converted into a square intensity matrix and classified using convolutional neural network (CNN) models. This approach removes the need to provide classification features for the supervised machine learning algorithm; instead, the features are determined automatically through training, from the time series provided. The results show that the prediction model is able to provide reasonable and comparable classification accuracy despite its simple implementation. This approach can be used for real-time classification of heart sounds in the Internet of Medical Things (IoMT), e.g. in remote monitoring applications of the PCG signal.
Keywords: convolutional neural network, discrete wavelet transform, deep learning, heart sound classification
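The square intensity matrix step can be sketched as follows: pad each per-beat sample segment to the next perfect square and reshape it row-major. The exact segment length and padding policy used in the paper are not stated, so both are assumptions here.

```python
def to_square_matrix(segment):
    """Zero-pad a per-beat 1-D sample segment to the next perfect square
    and reshape it row-major into an n x n intensity matrix, of the kind
    a 2-D CNN can consume directly."""
    n = 1
    while n * n < len(segment):
        n += 1
    padded = list(segment) + [0] * (n * n - len(segment))
    return [padded[i * n:(i + 1) * n] for i in range(n)]
```

For example, a 10-sample segment becomes a 4 x 4 matrix with six trailing zeros, so every beat yields a fixed-shape 2-D input regardless of its exact duration.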
Procedia PDF Downloads 349
12969 Constant Order Predictor Corrector Method for the Solution of Modeled Problems of First Order IVPs of ODEs
Authors: A. A. James, A. O. Adesanya, M. R. Odekunle, D. G. Yakubu
Abstract:
This paper examines the development of a one-step, five-hybrid-point method for the solution of first order initial value problems. We adopted the method of collocation and interpolation of a power series approximate solution to generate a continuous linear multistep method. The continuous linear multistep method was evaluated at selected grid points to give the discrete linear multistep method. The method was implemented using a constant-order predictor of order seven over an overlapping interval. The basic properties of the derived corrector were investigated and found to be zero-stable, consistent, and convergent. The region of absolute stability was also investigated. The method was tested on some numerical experiments and found to compete favorably with existing methods.
Keywords: interpolation, approximate solution, collocation, differential system, half step, convergence, block method, efficiency
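The paper's hybrid block method is specialised, but the general predictor-corrector pattern it follows can be illustrated with the classical two-step Adams-Bashforth predictor and trapezoidal Adams-Moulton corrector, a simpler stand-in, not the authors' order-seven scheme.

```python
def pece(f, t0, y0, h, n):
    """Predict-evaluate-correct-evaluate (PECE) scheme for y' = f(t, y):
    2-step Adams-Bashforth predictor + trapezoidal (Adams-Moulton)
    corrector. The first step is bootstrapped with Heun's method so the
    predictor has the two history points it needs."""
    ts, ys = [t0], [y0]
    k1 = f(t0, y0)
    k2 = f(t0 + h, y0 + h * k1)
    ys.append(y0 + h * (k1 + k2) / 2.0)   # Heun bootstrap step
    ts.append(t0 + h)
    for i in range(1, n):
        t, y = ts[i], ys[i]
        fp, fq = f(t, y), f(ts[i - 1], ys[i - 1])
        y_pred = y + h * (3.0 * fp - fq) / 2.0           # AB2 predictor
        y_corr = y + h * (fp + f(t + h, y_pred)) / 2.0   # AM2 corrector
        ts.append(t + h)
        ys.append(y_corr)
    return ts, ys
```

Run on the test problem y' = -y, y(0) = 1, with h = 0.01 over 100 steps, the scheme tracks the exact solution e^(-t) to within about 10^-5 at t = 1, which is the behaviour expected of a second-order PECE pair.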
Procedia PDF Downloads 337
12968 An Analysis on Clustering Based Gene Selection and Classification for Gene Expression Data
Authors: K. Sathishkumar, V. Thiagarasu
Abstract:
Due to recent advances in DNA microarray technology, it is now feasible to obtain gene expression profiles of tissue samples at relatively low cost. Many scientists around the world take advantage of this gene profiling to characterize complex biological circumstances and diseases. Microarray techniques used in genome-wide gene expression and genome mutation analysis help scientists and physicians in understanding pathophysiological mechanisms, in diagnosis and prognosis, and in choosing treatment plans. DNA microarray technology has made it possible to simultaneously monitor the expression levels of thousands of genes during important biological processes and across collections of related samples. Elucidating the patterns hidden in gene expression data offers a tremendous opportunity for an enhanced understanding of functional genomics. However, the large number of genes and the complexity of biological networks greatly increase the challenges of comprehending and interpreting the resulting mass of data, which often consists of millions of measurements. A first step toward addressing this challenge is the use of clustering techniques, which are essential in the data mining process to reveal natural structures and identify interesting patterns in the underlying data. This work presents an analysis of several algorithms proposed to deal with gene expression data effectively. Existing algorithms such as the Support Vector Machine (SVM), the k-means algorithm, and evolutionary algorithms are analyzed thoroughly to identify their advantages and limitations, and their performance is evaluated to determine the best approach.
In order to improve the classification performance of the best approach in terms of accuracy, convergence behavior, and processing time, a hybrid clustering-based optimization approach is proposed. Keywords: microarray technology, gene expression data, clustering, gene selection
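The k-means step discussed above can be sketched in a few lines of plain numpy. This is a minimal Lloyd's-algorithm sketch with a deliberately simple deterministic initialization (evenly spaced rows), not the authors' hybrid optimization; the synthetic "expression matrix" is purely illustrative.

```python
import numpy as np

def kmeans(X, k, n_iter=100):
    """Plain k-means (Lloyd's algorithm) over rows of X.
    Illustrative sketch: centroids start at evenly spaced rows of X."""
    centroids = X[np.linspace(0, len(X) - 1, k).astype(int)].copy()
    for _ in range(n_iter):
        # Assign each profile to its nearest centroid.
        dists = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        # Move each centroid to the mean of its assigned profiles.
        new = np.array([X[labels == j].mean(axis=0) if np.any(labels == j)
                        else centroids[j] for j in range(k)])
        if np.allclose(new, centroids):
            break
        centroids = new
    return labels, centroids

# Synthetic data: two well-separated groups of 20 "genes",
# each measured across 5 hypothetical samples.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0.0, 0.2, (20, 5)), rng.normal(3.0, 0.2, (20, 5))])
labels, _ = kmeans(X, k=2)
print(sorted(set(labels.tolist())))  # each group collapses to one cluster
```

Real gene-selection pipelines add normalization and a cluster-quality criterion on top of this core loop.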
Procedia PDF Downloads 323
12967 A General Iterative Nonlinear Programming Method to Synthesize Heat Exchanger Network
Authors: Rupu Yang, Cong Toan Tran, Assaad Zoughaib
Abstract:
This work provides an iterative nonlinear programming method to synthesize a heat exchanger network by manipulating the trade-offs between the heat loads of process heat exchangers (HEs) and utilities. We consider two cases for the synthesis problem: one without a fixed cost for HEs and one with a fixed cost. For the no-fixed-cost problem, the nonlinear programming (NLP) model with all potential HEs is optimized to obtain the global optimum. For the case with fixed cost, the NLP model is iterated by adding/removing HEs. The method was applied in five case studies and demonstrated its effectiveness: it reaches the lowest TAC (2,904,026 $/year) compared with the best record for the well-known aromatic plants problem, and it locates a slightly better design than those reported in the literature for a 10-stream case without fixed cost, in only one-ninth of the computational time. Moreover, compared to the traditional mixed-integer nonlinear programming approach, the iterative NLP method opens the possibility of considering constraints (such as controllability or dynamic performance) whose evaluation requires knowing the structure of the network. Keywords: heat exchanger network, synthesis, NLP, optimization
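The heat-load/utility trade-off manipulated by the NLP model rests on the standard exchanger sizing relation Q = U·A·LMTD. A minimal sketch of that relation for one counter-current match, with hypothetical duty, U value, and stream temperatures (none taken from the paper's case studies), is:

```python
import math

def lmtd_counter_current(t_hot_in, t_hot_out, t_cold_in, t_cold_out):
    """Log-mean temperature difference for a counter-current exchanger."""
    dt1 = t_hot_in - t_cold_out
    dt2 = t_hot_out - t_cold_in
    if abs(dt1 - dt2) < 1e-12:   # equal terminal differences: LMTD = dT
        return dt1
    return (dt1 - dt2) / math.log(dt1 / dt2)

def required_area(q, u, lmtd):
    """Heat-transfer area from Q = U * A * LMTD."""
    return q / (u * lmtd)

# Hypothetical match: 100 kW duty, U = 0.5 kW/(m^2 K),
# hot stream 150 -> 90 C against cold stream 30 -> 80 C.
lmtd = lmtd_counter_current(150, 90, 30, 80)
area = required_area(100, 0.5, lmtd)
print(round(lmtd, 1), round(area, 2))  # 64.9 3.08
```

Shifting heat load between a process match and a utility changes both Q and the terminal temperature differences, which is exactly the area/energy trade-off the NLP iterates over.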
Procedia PDF Downloads 164
12966 Changes in Geospatial Structure of Households in the Czech Republic: Findings from Population and Housing Census
Authors: Jaroslav Kraus
Abstract:
Spatial information about demographic processes is a standard part of statistical outputs in the Czech Republic. That was also the case for the Population and Housing Census held in 2011, which is the starting point for a follow-up study devoted to two basic types of households: single-person households and households of one complete family. Together they make up more than 80 percent of all households, but their share and spatial structure have been changing over the long term. The increase in single-person households results from the long-term decrease in fertility and the increase in divorce, but also from the possibility of living separately. There are regions in the Czech Republic with traditional demographic behavior, and regions, such as the capital Prague, with a changing pattern. The population census is based, according to international standards, on the concept of the currently living population. Three types of geospatial approaches will be used for the analysis: (i) measures of geographic distribution; (ii) cluster mapping to identify the locations of statistically significant hot spots, cold spots, spatial outliers, and similar features; and (iii) pattern analysis as a starting point for more in-depth analyses (geospatial regression) in the future. For data of this type, the numbers of households by type are treated as distinct objects, and all events in a meaningfully delimited study region (e.g., municipalities) are included in the analysis. Commonly produced measures of central tendency and spread will include identification of the center of the point set (at NUTS3 level), the median center, standard distance, weighted standard distance, and standard deviational ellipses.
Identifying that clustering exists in the census household datasets does not by itself provide a detailed picture of the nature and pattern of that clustering, but simple hot-spot (and cold-spot) identification techniques can then be applied to such datasets. Once the spatial structure of households is determined, any particular measure of autocorrelation can be constructed by defining a way of measuring the difference between location attribute values. The most widely used measure is Moran's I, which will be applied to municipal units where a numerical ratio is calculated. Local statistics arise naturally out of any of the methods for measuring spatial autocorrelation and allow localized variants of almost any standard summary statistic. Local Moran's I will give an indication of household data homogeneity and diversity at the municipal level. Keywords: census, geo-demography, households, the Czech Republic
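Global Moran's I, the autocorrelation measure named above, can be computed directly from a spatial weight matrix. A minimal sketch on a toy rook-contiguity example (illustrative values, not census data) is:

```python
import numpy as np

def morans_i(x, W):
    """Global Moran's I: I = (n / sum(W)) * sum_ij W_ij z_i z_j / sum_i z_i^2,
    where z are deviations of x from its mean."""
    n = len(x)
    z = x - x.mean()
    num = n * np.sum(W * np.outer(z, z))
    den = W.sum() * np.sum(z ** 2)
    return num / den

# Four spatial units on a line; rook contiguity (neighbors share an edge).
W = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)

clustered = np.array([10.0, 9.0, 1.0, 2.0])  # similar values adjacent
print(morans_i(clustered, W) > 0)  # True: positive spatial autocorrelation
```

Positive I indicates clustering of similar values, negative I a checkerboard pattern, and values near a small negative expectation (-1/(n-1)) indicate spatial randomness; local Moran's I decomposes the same sum per unit.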
Procedia PDF Downloads 97
12965 Developing a Leukemia Diagnostic System Based on Hybrid Deep Learning Architectures in Actual Clinical Environments
Authors: Skyler Kim
Abstract:
An early diagnosis of leukemia has always been a challenge for doctors and hematologists. Worldwide, approximately 350,000 new cases were reported in 2012, and diagnosing leukemia has been time-consuming and inefficient because of an endemic shortage of flow cytometry equipment in current clinical practice. As the number of medical diagnosis tools increased and a large volume of high-quality data was produced, there was an urgent need for more advanced data analysis methods. One of these is the AI approach, which has become a major trend in recent years; several research groups have been working on such diagnostic models. However, designing and implementing a leukemia diagnostic system in real clinical environments based on a deep learning approach with larger datasets remains complex. Leukemia is a major hematological malignancy that results in mortality and morbidity across different ages. We selected acute lymphocytic leukemia to develop our diagnostic system, since it is the most common type of leukemia, accounting for 74% of all childhood leukemia diagnoses; the results of this development work can be applied to all other types of leukemia. To develop our model, the Kaggle dataset was used, which consists of 15,135 total images, of which 8,491 show abnormal cells and 5,398 show normal cells. In this paper, we design and implement a leukemia diagnostic system in a real clinical environment based on deep learning approaches with larger datasets. The proposed diagnostic system detects and classifies leukemia. Unlike other AI approaches, we explore hybrid architectures to improve on current performance. First, we developed two independent convolutional neural network models: VGG19 and ResNet50.
Then, using both VGG19 and ResNet50, we developed a hybrid deep learning architecture employing transfer learning techniques to extract features from each input image. In our approach, features fused from specific abstraction layers can be regarded as auxiliary features and lead to further improvement of the classification accuracy. Features extracted from the lower levels are combined into higher-dimension feature maps to improve the discriminative capability of intermediate features and to mitigate vanishing or exploding network gradients. By comparing VGG19, ResNet50, and the proposed hybrid model, we concluded that the hybrid model has a significant advantage in accuracy. The detailed results of each model's performance, with their pros and cons, will be presented at the conference. Keywords: acute lymphoblastic leukemia, hybrid model, leukemia diagnostic system, machine learning
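The feature-fusion idea can be sketched with stand-in extractors. Here `backbone_a` and `backbone_b` are hypothetical placeholders for the pretrained VGG19/ResNet50 feature maps (real backbones return much richer features); only the concatenation pattern mirrors the hybrid architecture.

```python
import numpy as np

def backbone_a(image):
    """Stand-in for one feature extractor (hypothetical): pooled column means."""
    return image.mean(axis=0)

def backbone_b(image):
    """Stand-in for a second feature extractor (hypothetical): pooled row means."""
    return image.mean(axis=1)

def fused_features(image):
    """Concatenate features from both backbones into one vector,
    which a downstream classifier head would consume."""
    return np.concatenate([backbone_a(image), backbone_b(image)])

rng = np.random.default_rng(0)
image = rng.random((32, 32))  # placeholder for a preprocessed cell image
feats = fused_features(image)
print(feats.shape)  # (64,)
```

In the real system the two 32-dimensional stand-ins would be thousand-dimensional embeddings, but the fusion step, concatenating per-backbone features before classification, has the same shape.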
Procedia PDF Downloads 187
12964 Sentiment Analysis: An Enhancement of Ontological-Based Features Extraction Techniques and Word Equations
Authors: Mohd Ridzwan Yaakub, Muhammad Iqbal Abu Latiffi
Abstract:
Online business has become popular recently due to the massive amount of information and media available on the Internet. This has resulted in a huge number of reviews in which consumers share their opinions, criticisms, and satisfaction regarding the products they have purchased, on websites or on social media such as Facebook and Twitter. Analyzing customer behavior has therefore become very important for organizations seeking new market trends and insights. Reviews from websites and social media come as structured and unstructured data and require a sentiment analysis approach. In this article, the techniques used in sentiment analysis will be defined, together with the ontology and its possible usage in sentiment analysis. This leads to empirical research related to mobile phone reviews and the ontology used in the experiment. The researchers also explore the role of data preprocessing and feature selection methodology. As a result, an ontology-based approach to sentiment analysis can help achieve high accuracy in the classification task. Keywords: feature selection, ontology, opinion, preprocessing data, sentiment analysis
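The preprocessing step mentioned above can be sketched as follows; the stopword list and regex tokenizer are illustrative assumptions, not the authors' pipeline.

```python
import re

# Tiny illustrative stopword list (real pipelines use larger curated lists).
STOPWORDS = {"the", "is", "a", "an", "and", "this", "it", "of", "to"}

def preprocess(review):
    """Lowercase, tokenize, and remove stopwords from a raw review string,
    leaving candidate feature terms for sentiment classification."""
    tokens = re.findall(r"[a-z']+", review.lower())
    return [t for t in tokens if t not in STOPWORDS]

print(preprocess("The camera of this phone is AMAZING and the battery lasts!"))
# ['camera', 'phone', 'amazing', 'battery', 'lasts']
```

The surviving tokens (`camera`, `battery`, ...) are exactly the product-aspect terms an ontology-based feature extractor would map onto its concept hierarchy.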
Procedia PDF Downloads 200
12963 Numerical Investigation of Aerodynamic Analysis on Passenger Vehicle
Authors: Cafer Görkem Pınar, İlker Coşar, Serkan Uzun, Atahan Çelebi, Mehmet Ali Ersoy, Ali Pınarbaşı
Abstract:
In this study, an aerodynamic analysis of a 1:1 scale model of the Renault Clio MK4 SW was performed numerically in the commercial computational fluid dynamics (CFD) package ANSYS CFX 2021 R1 under steady, subsonic, 3-D conditions. The results were verified to be independent of the number of mesh elements, and the k-epsilon turbulence model was applied during the analysis. Results were interpreted as streamlines, pressure gradient, and turbulent kinetic energy contours around the vehicle at speeds of 50 km/h and 100 km/h. In addition, the validity of the analysis was assessed by comparing the drag coefficient of the vehicle with values in the literature. Finally, the pressure gradient contours around the taillight of the Renault Clio MK4 SW were examined, and the behavior of the total force at 50 km/h and 100 km/h was interpreted. Keywords: CFD, k-epsilon, aerodynamics, drag coefficient, taillight
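The comparison between 50 km/h and 100 km/h is governed by the drag equation F = ½ρC_dAv². A sketch with hypothetical C_d and frontal-area values for a compact wagon (not the paper's computed figures) shows the quadratic scaling:

```python
def drag_force(cd, frontal_area_m2, speed_kmh, rho=1.225):
    """Aerodynamic drag force in newtons: F = 0.5 * rho * Cd * A * v^2.
    rho defaults to sea-level air density; speed is converted km/h -> m/s."""
    v = speed_kmh / 3.6
    return 0.5 * rho * cd * frontal_area_m2 * v ** 2

# Hypothetical values: Cd ~ 0.33, frontal area ~ 2.2 m^2.
f50 = drag_force(0.33, 2.2, 50)
f100 = drag_force(0.33, 2.2, 100)
print(round(f100 / f50, 2))  # 4.0 -- doubling speed quadruples the drag
```

This is why the total-force behavior differs so sharply between the two analyzed speeds even with an unchanged drag coefficient.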
Procedia PDF Downloads 143
12962 Global Pandemic of Chronic Diseases: Public Health Challenges to Reduce the Development
Authors: Benjamin Poku
Abstract:
Purpose: The purpose of the research is to conduct a systematic review and synthesis of existing knowledge addressing the growing incidence and prevalence of chronic diseases across the world and their impact on public health relative to communicable diseases. Principal results: A careful compilation and summary of 15-20 peer-reviewed publications from reputable databases such as PubMed, MEDLINE, CINAHL, and other peer-reviewed journals indicates that chronic diseases (such as diabetes and high blood pressure) have become, in proportion, a greater public health burden than communicable diseases. Significant conclusions: Given the complexity of the situation, efforts to mitigate the negative effects of this global pandemic of chronic diseases must include the urgent and binding commitment of all stakeholders. They must also take a multi-sectoral, long-term approach that expands public health education to serve a world population of over 8 billion people, including its aging segment, and thereby meet the complex challenges of chronic diseases. Keywords: pandemic, chronic disease, public health, health challenges
Procedia PDF Downloads 527
12961 Forest Fire Burnt Area Assessment in a Part of West Himalayan Region Using Differenced Normalized Burnt Ratio and Neural Network Approach
Authors: Sunil Chandra, Himanshu Rawat, Vikas Gusain, Triparna Barman
Abstract:
Forest fires are a recurrent phenomenon in the Himalayan region owing to the presence of vulnerable forest types, topographical gradients, climatic conditions, and anthropogenic pressure. The present study focuses on the identification of forest fire-affected areas in a small part of the West Himalayan region using the differenced normalized burnt ratio method and spectral unmixing methods. The study area has rugged terrain with sub-tropical pine forest, montane temperate forest, and sub-alpine forest and scrub. The major cause of fires in this region is anthropogenic: fires are set to obtain fresh leaves, to scare wild animals away from agricultural crops, during grazing within reserved forests, and for cooking and other purposes. Fires from these causes affect a large area on the ground, so precise estimation of the burnt area is needed for management and policy making. In the present study, two approaches were used for the burnt area analysis. The first uses the differenced normalized burnt ratio (dNBR) index, computed from burn ratio values generated using the Short-Wave Infrared (SWIR) and Near Infrared (NIR) bands of a Sentinel-2 image. The results of the dNBR were compared with the outputs of the spectral unmixing methods. It was found that dNBR produces good results in fire-affected areas with a homogeneous forest stratum and slopes below 5 degrees. However, in rugged terrain where the landscape is strongly shaped by topographical variation, vegetation type, and tree density, the results may be heavily influenced by topography, complexity of tree composition, fuel load composition, and soil moisture.
Hence, where such factors vary, burnt area assessment may not be carried out effectively with the dNBR approach commonly used over large areas. The second approach attempted in the present study therefore uses a spectral unmixing method in which each individual pixel is tested before an information class is assigned to it. The method uses a neural network approach on the Sentinel-2 bands. Training and testing data were generated from the Sentinel-2 data and the national field inventory, and these were used to generate outputs with machine learning tools. The analysis of the results indicates that fire-affected regions and their severity can be better estimated using spectral unmixing methods, which can resolve noise in the data and classify each pixel into the precise burnt/unburnt class. Keywords: categorical data, log linear modeling, neural network, shifting cultivation
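The dNBR computation used in the first approach can be sketched on toy reflectance values. The 0.44 cutoff shown here is a commonly cited moderate-high burn-severity threshold for dNBR, not a value taken from this study, and the reflectances are synthetic.

```python
import numpy as np

def nbr(nir, swir):
    """Normalized Burn Ratio from NIR and SWIR reflectance arrays."""
    return (nir - swir) / (nir + swir)

def dnbr(nir_pre, swir_pre, nir_post, swir_post):
    """Differenced NBR (pre-fire minus post-fire); larger positive
    values indicate a stronger loss of healthy vegetation."""
    return nbr(nir_pre, swir_pre) - nbr(nir_post, swir_post)

# Toy 2x2 scene: the right column burns, dropping NIR and raising SWIR.
nir_pre   = np.array([[0.5, 0.5], [0.5, 0.5]])
swir_pre  = np.array([[0.2, 0.2], [0.2, 0.2]])
nir_post  = np.array([[0.5, 0.1], [0.5, 0.1]])
swir_post = np.array([[0.2, 0.4], [0.2, 0.4]])

d = dnbr(nir_pre, swir_pre, nir_post, swir_post)
print(d[:, 1] > 0.44)  # burnt column exceeds the severity threshold
```

The per-pixel nature of this index is exactly why rugged terrain degrades it: slope, shadow, and mixed fuel alter the band values independently of burning, which the spectral-unmixing pixel test is designed to absorb.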
Procedia PDF Downloads 56