1745 Simulation of 1D Dielectric Barrier Discharge in Argon Mixtures
Authors: Lucas Wilman Crispim, Patrícia Hallack, Maikel Ballester
Abstract:
This work aims at modeling electric discharges in gas mixtures. The mathematical model mimics the ignition process in a commercial spark plug when a high voltage is applied to the plug terminals. A longitudinal, one-dimensional Cartesian domain is chosen as the simulation region. Energy and mass transfer are considered in a macroscopic fluid representation, while energy transfer in molecular collisions and chemical reactions is treated at the microscopic level. The macroscopic model is represented by a set of uncoupled partial differential equations. Microscopic effects are studied within a discrete model for electronic and molecular collisions in the frame of ZDPlasKin, a plasma modeling numerical tool. The BOLSIG+ solver is employed to solve the electron Boltzmann equation. An operator splitting technique is used to separate the microscopic and macroscopic models. The simulated gas is a mixture of neutral, excited, and ionized atomic argon. The spatial and temporal evolution of these species and of the temperature is presented and discussed.
Keywords: CFD, electronic discharge, ignition, spark plug
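The operator splitting idea in the abstract can be sketched with a toy reaction-diffusion problem: a "microscopic" reaction substep handled exactly, alternated with a "macroscopic" diffusion substep handled by finite differences. All coefficients and the Gaussian initial profile below are illustrative, not the paper's argon chemistry.

```python
import numpy as np

def split_step(u, dt, dx, D, k):
    # Microscopic substep: exact exponential decay of the species.
    u = u * np.exp(-k * dt)
    # Macroscopic substep: explicit finite-difference diffusion.
    lap = np.zeros_like(u)
    lap[1:-1] = (u[2:] - 2 * u[1:-1] + u[:-2]) / dx**2
    return u + dt * D * lap

nx, dx, dt = 101, 0.01, 1e-5
D, k = 0.01, 50.0                      # illustrative diffusion / reaction rates
x = np.linspace(0.0, 1.0, nx)
u = np.exp(-((x - 0.5) / 0.05) ** 2)   # initial species profile
mass0 = u.sum()
for _ in range(1000):                  # integrate to t = 0.01
    u = split_step(u, dt, dx, D, k)
```

Because the reaction substep is exact and the interior diffusion stencil conserves mass, the total mass decays very nearly as exp(-kt), which makes the splitting easy to sanity-check.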
1744 The Profit Trend of Cosmetics Products Using Bootstrap Edgeworth Approximation
Authors: Edlira Donefski, Lorenc Ekonomi, Tina Donefski
Abstract:
Edgeworth approximation is one of the most important statistical methods, with a considerable contribution to reducing the standard deviations of the independent variables' coefficients in a quantile regression model. This model estimates the conditional median or other quantiles. In this paper, we apply approximating statistical methods to an economic problem. We have created and generated a quantile regression model to see how the profit gained is connected with the realized sales of cosmetic products, using real data taken from a local business. The linear regression of the generated profit on the realized sales was not free of autocorrelation and heteroscedasticity, which is why we used this model instead of linear regression. Our aim is to analyze in more detail the relation between the variables under study, the profit and the finalized sales, and how to minimize the standard errors of the independent variable involved in this study, the level of realized sales. The statistical methods we apply are the Edgeworth approximation for independent and identically distributed (IID) cases, the bootstrap version of the model, and the Edgeworth approximation for the bootstrap quantile regression model. The graphics and results presented here identify the best approximating model for our study.
Keywords: bootstrap, Edgeworth approximation, IID, quantile
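As a minimal illustration of the bootstrap idea (not the paper's Edgeworth machinery), the sketch below bootstraps the standard error of a crude median-based slope on synthetic profit/sales data; the data and the through-origin median slope are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)
sales = rng.uniform(10.0, 100.0, 200)
profit = 0.3 * sales + rng.normal(0.0, 5.0, 200)   # synthetic data, slope 0.3

def median_slope(x, y):
    # Crude stand-in for one quantile-regression coefficient:
    # the median of the pointwise through-origin slopes.
    return float(np.median(y / x))

B = 500
n = len(sales)
boot = np.empty(B)
for b in range(B):
    s = rng.integers(0, n, n)          # resample indices with replacement
    boot[b] = median_slope(sales[s], profit[s])
se = boot.std(ddof=1)                  # bootstrap standard error of the slope
```

The bootstrap distribution `boot` centers near the true slope, and its spread `se` is the quantity the paper's methods aim to minimize.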
1743 Analysis of an Error Estimate for the Asymptotic Solution of the Heat Conduction Problem in a Dilated Pipe
Authors: E. Marušić-Paloka, I. Pažanin, M. Prša
Abstract:
The subject of this study is the stationary heat conduction problem through a pipe filled with incompressible viscous fluid. In previous work, we proved existence and uniqueness theorems for the corresponding boundary-value problem, taking into account the effects of the pipe's dilatation due to the temperature of the fluid inside. The main difficulty comes from the fact that the flow domain changes depending on the solution of the heat equation, leading to a non-standard coupled governing problem. The goal of this work is to estimate the solution, since the exact solution of the studied problem cannot be determined. We use an asymptotic expansion in powers of a small parameter, namely the heat expansion coefficient of the pipe's material. Furthermore, an error estimate is provided for this asymptotic approximation of the solution in the inner region of the pipe. Close to the boundary the problem becomes more complex, so different approaches are used, mainly perturbation theory and separation of variables. In view of that, an error estimate for the whole approximation will be provided, together with software simulations of the resulting situation.
Keywords: asymptotic analysis, dilated pipe, error estimate, heat conduction
1742 Efficient GIS Based Public Health System for Disease Prevention
Authors: K. M. G. T. R. Waidyarathna, S. M. Vidanagamachchi
Abstract:
The public health system that exists in Sri Lanka has a satisfactorily complete information flow compared to the systems of other developing countries. The availability of a good health information system contributed immensely to achieving health indices in line with developed countries like the US and UK. However, the health information flow is at the moment completely paper based. In Sri Lanka, fields like banking, accounting, and engineering have incorporated information and communication technology to the same extent as in any other country. The field of medicine has lagged behind those fields throughout the world, mainly due to its complexity and to issues like privacy, confidentiality, and a lack of people with knowledge of both information technology (IT) and medicine. Sri Lanka's situation is much worse, and the gap is rapidly increasing given the huge IT initiatives by private-public partnerships in other countries. The major goal of the framework is to help minimize the spread of diseases. To achieve that, a web-based framework with web mapping should be implemented for this application domain. The aim of this GIS-based public health system is a secure, flexible, easy-to-maintain environment for creating and maintaining public health records that is easy for the relevant parties to interact with.
Keywords: DHIS2, GIS, public health, Sri Lanka
1741 Proposing an Architecture for Drug Response Prediction by Integrating Multiomics Data and Utilizing Graph Transformers
Authors: Nishank Raisinghani
Abstract:
Efficiently predicting drug response remains a challenge in the realm of drug discovery. To address this issue, we propose four model architectures that combine graphical representation with varying positions of multiheaded self-attention mechanisms. By leveraging two types of multi-omics data, transcriptomics and genomics, we create a comprehensive representation of target cells and enable drug response prediction in precision medicine. A majority of our architectures utilize two transformer models, one with a graph attention mechanism and the other with a multiheaded self-attention mechanism, to generate latent representations of the drug and omics data, respectively. Our model architectures apply an attention mechanism to both the drug and multi-omics data, with the goal of procuring more comprehensive latent representations. The latent representations are then concatenated and input into a fully connected network to predict the IC50 score, a measure of cell drug response. We experiment with all four of these architectures and extract results from all of them. Our study contributes to the future of drug discovery and precision medicine by seeking to optimize the time and accuracy of drug response prediction.
Keywords: drug discovery, transformers, graph neural networks, multiomics
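The concatenation step described above can be sketched in a few lines: two hypothetical latent matrices (their sizes are assumptions, not the paper's) are concatenated and passed through a small fully connected head with randomly initialized weights to produce one IC50 score per drug-cell pair.

```python
import numpy as np

rng = np.random.default_rng(1)
drug_latent = rng.normal(size=(4, 16))    # 4 drug latents, 16-dim (assumed sizes)
omics_latent = rng.normal(size=(4, 32))   # matching cell-line omics latents

# Concatenate the two latent representations per pair.
z = np.concatenate([drug_latent, omics_latent], axis=1)   # shape (4, 48)

# Fully connected head: one ReLU hidden layer, then one IC50 score per pair.
W1 = rng.normal(size=(48, 8)) * 0.1
b1 = np.zeros(8)
W2 = rng.normal(size=(8, 1)) * 0.1
b2 = np.zeros(1)
h = np.maximum(z @ W1 + b1, 0.0)          # ReLU
ic50 = (h @ W2 + b2).ravel()              # one score per drug-cell pair
```

In the paper's architectures the latents would come from trained transformer encoders; here random matrices only demonstrate the shapes flowing through the head.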
1740 Propagation of the Effects of Certain Types of Military Psychological Operations in a Networked Population
Authors: Colette Faucher
Abstract:
In modern asymmetric conflicts, the armed forces generally have to intervene in countries where internal peace is in danger. They must make the local population an ally in order to be able to deploy the necessary military actions with its support. For this purpose, psychological operations (PSYOPs) are used to shape people's behaviors and emotions by modifying their attitudes through acting on their perceptions. PSYOPs aim at elaborating and spreading a message that must be read, listened to, and/or looked at, then understood by the info-targets, in order to elicit the desired behavior from them. A message can generate in the info-targets reasoned thoughts, spontaneous emotions, or reflex behaviors, an effect that partly depends on the means of conveyance used to spread the message. In this paper, we focus on psychological operations that generate emotions. We present a method, based on intergroup emotion theory, that determines, from the characteristics of the conveyed message and of the people in the population directly reached by the means of conveyance (direct info-targets), the emotion likely to be triggered in them, and we simulate the propagation of the effects of such a message to indirect info-targets connected to them through the social networks that structure the population.
Keywords: military psychological operations, social identity, social network, emotion propagation
1739 Free Vibration Analysis of Conical Helicoidal Rods Having Elliptical Cross Sections Positioned in Different Orientations
Authors: Merve Ermis, Akif Kutlu, Nihal Eratlı, Mehmet H. Omurtag
Abstract:
In this study, the free vibration analysis of conical helicoidal rods with two different elliptically oriented cross sections is investigated, and the results are compared with those for a circular cross section, keeping the net area equal in all cases. The problems are solved using the mixed finite element formulation. Element matrices based on Timoshenko beam theory are employed. The finite element matrices are derived by directly inserting the analytical expressions (arc length, curvature, and torsion) defining the helix geometry into the formulation. The helicoidal rod domain is discretized by a two-noded curvilinear element. Each node of the element has 12 DOFs, namely three translations, three rotations, two shear forces, one axial force, two bending moments, and one torque. A parametric study is performed to investigate the influence of the elliptical cross-sectional geometry and its orientation on the natural frequencies of the conical helicoidal rod.
Keywords: conical helix, elliptical cross section, finite element, free vibration
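Keeping the net area equal while varying the cross section can be checked directly from the standard ellipse section formulas; the semi-axes below are illustrative values, not the paper's.

```python
import math

def ellipse_props(a, b):
    # a, b: semi-axes. Returns area and the two principal second moments
    # of area (bending about the major and minor axes respectively).
    area = math.pi * a * b
    I_major = math.pi * a * b**3 / 4.0   # bending about the axis along 2a
    I_minor = math.pi * a**3 * b / 4.0   # bending about the axis along 2b
    return area, I_major, I_minor

a, b = 0.02, 0.01                        # hypothetical 2:1 ellipse, metres
r = math.sqrt(a * b)                     # equal-area circle radius
area_e, I1, I2 = ellipse_props(a, b)
area_c = math.pi * r**2
I_c = math.pi * r**4 / 4.0               # circle second moment
```

Although all three sections have identical area, the two ellipse orientations bracket the circle's second moment, which is why orientation alone shifts the natural frequencies.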
1738 Adaptive Nonlinear Control of a Variable Speed Horizontal Axis Wind Turbine: Controller for Optimal Power Capture
Authors: Rana M. Mostafa, Nouby M. Ghazaly, Ahmed S. Ali
Abstract:
This article introduces a solution for increasing the wind energy extracted by turbines, to meet growing electric power demand. This objective has given rise to a new discipline: wind turbine control. This field depends on developments in power electronics to provide new control strategies for turbines, and those strategies should deal with all turbine operating modes. Here, two control strategies are developed for a variable-speed horizontal-axis wind turbine, covering the rated and above-rated wind speed regions. These strategies support wind energy validation and decrease manufacturing overhead cost. A nonlinear adaptive method was used to design speed controllers for the 'Aeolos 50 kW' wind turbine, connected to a permanent magnet generator via a gearbox, modeled in MATLAB/Simulink. The controllers apply the maximum power point tracking concept to guarantee goal achievement. Procedures were carried out to test the efficiency of both controllers. The results show that the developed controllers are acceptable, as can easily be seen from the simulation results.
Keywords: adaptive method, pitch controller, wind energy, nonlinear control
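For the below-rated region, maximum power point tracking reduces to holding the rotor at its optimal tip-speed ratio; the sketch below uses typical illustrative turbine parameters, not Aeolos-specific values.

```python
import math

rho, R = 1.225, 10.0                  # air density (kg/m^3), rotor radius (m), assumed
lam_opt, cp_max = 8.1, 0.48           # optimal tip-speed ratio and peak Cp, typical values

def mppt_reference(v_wind):
    # Rotor-speed reference that keeps the turbine at its optimal
    # tip-speed ratio, and the corresponding maximum captured power.
    omega_ref = lam_opt * v_wind / R                       # rad/s
    p_max = 0.5 * rho * math.pi * R**2 * cp_max * v_wind**3  # watts
    return omega_ref, p_max

omega, p = mppt_reference(8.0)        # 8 m/s wind
```

The controllers in the paper drive the measured rotor speed toward `omega_ref`; because captured power scales with the cube of wind speed, tracking errors are most costly in gusty conditions.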
1737 Battery State of Charge Management Algorithm for Photovoltaic Ramp Rate Control
Authors: Nam Kyu Kim, Hee Jun Cha, Jae Jin Seo, Dong Jun Won
Abstract:
The output power of a photovoltaic (PV) generator depends on the incident solar irradiance. If clouds pass or the climate conditions are bad, the PV output fluctuates frequently. When a PV generator is connected to the grid, these fluctuations adversely affect power quality. Thus, ramp rate control with a battery energy storage system (BESS) is needed to reduce PV output fluctuations. At the same time, managing the state of charge (SOC) is the most important part of operating the BESS effectively and sizing its optimal capacity. In addition, managing the SOC helps avoid violating the SOC operating range of the BESS when performing renewable integration (RI) continuously. As PV and BESS deployments increase, SOC management will become even more important in the future. This paper presents an SOC management algorithm that helps operate the BESS effectively, focusing on a method that manages the SOC while reducing PV output fluctuations. A simulation model is developed in the PSCAD/EMTDC software. The simulation results show that the SOC is maintained within the operating range by adjusting the output distribution according to the SOC of the BESS.
Keywords: battery energy storage system, ramp rate control, renewable integration, SOC management
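A minimal sketch of a combined ramp-rate limiting and SOC management loop; the control policy (biasing the grid output when SOC nears a limit) and all parameters are assumptions for illustration, not the paper's algorithm.

```python
import numpy as np

def ramp_control(pv, soc0, cap, ramp, soc_min=0.2, soc_max=0.8, dt=1.0):
    # Limits PV output ramps with a battery and nudges the output to
    # steer the SOC back inside its operating band.
    out, soc = [float(pv[0])], soc0
    for p in pv[1:]:
        # Ramp-rate limit relative to the previous grid output.
        target = float(np.clip(p, out[-1] - ramp, out[-1] + ramp))
        # SOC management: near a limit, bias output so the battery
        # charges (SOC low) or discharges (SOC high).
        if soc < soc_min:
            target = min(target, float(p))    # export less, charge battery
        elif soc > soc_max:
            target = max(target, float(p))    # export more, discharge battery
        soc += (p - target) * dt / cap        # battery absorbs the difference
        soc = float(np.clip(soc, 0.0, 1.0))
        out.append(target)
    return np.array(out), soc

pv = np.array([1.0, 2.0, 1.0, 2.0])           # fluctuating PV output (p.u.)
out, soc = ramp_control(pv, soc0=0.5, cap=100.0, ramp=0.2)
```

With the SOC inside its band, the grid output ramps never exceed the limit while the battery silently buffers the fluctuation energy.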
1736 Temporal and Spatial Adaptation Strategies in Aerodynamic Simulation of Bluff Bodies Using Vortex Particle Methods
Authors: Dario Milani, Guido Morgenthal
Abstract:
Fluid dynamic computation of wind-induced forces on bluff bodies, e.g., light flexible civil structures or airplane wings at high incidence approaching the ground, is one of the major criteria governing their design. For such structures a significant dynamic response may result, requiring the usage of small-scale devices such as guide vanes in bridge design to control these effects. The focus of this paper is on the numerical simulation of the bluff body problem involving multiscale phenomena induced by small-scale devices. One of the solution methods for CFD simulation that is relatively successful in this class of applications is the Vortex Particle Method (VPM). The method is based on a grid-free Lagrangian formulation of the Navier-Stokes equations, where the velocity field is modeled by particles representing local vorticity. These vortices are convected by the free stream velocity as well as diffused. This representation yields the main advantages of low numerical diffusion; compact discretization, as the vorticity is strongly localized; implicit accounting for the free-space boundary conditions typical for this class of FSI problems; and a natural representation of the vortex creation process inherent in bluff body flows. When the particle resolution reaches the Kolmogorov dissipation length, the method becomes a Direct Numerical Simulation (DNS). However, it is crucial to note that any solution method aims at balancing the computational cost against the achievable accuracy. In the classical VPM, if the fluid domain is discretized by Np particles, the computational cost is O(Np²). For the coupled FSI problem of interest, for example large structures such as long-span bridges, the aerodynamic behavior may be influenced or even dominated by small structural details such as barriers, handrails, or fairings.
For such geometrically complex and dimensionally large structures, resolving the complete domain with the conventional VPM particle discretization might become prohibitively expensive even for moderate numbers of particles. It is possible to reduce this cost either by reducing the number of particles or by controlling their local distribution. It is also possible to increase the accuracy of the solution, without substantially increasing the global computational cost, by computing a correction of the particle-particle interaction in some regions of interest. In this paper different strategies are presented to extend the conventional VPM so as to reduce the computational cost while resolving the required details of the flow. The methods include temporal substepping, to increase the accuracy of the particle convection in certain regions, as well as dynamically re-discretizing the particle map to control the global and local numbers of particles. Finally, these methods are applied to a test case, and the improvements in efficiency and accuracy of the proposed extensions are presented, along with their relevant applications.
Keywords: adaptation, fluid dynamics, remeshing, substepping, vortex particle method
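The O(Np²) particle-particle interaction mentioned above can be sketched as a direct 2D Biot-Savart summation over point vortices; the smoothing parameter `delta` is an assumption added to regularize the singular kernel.

```python
import numpy as np

def induced_velocity(pos, gamma, delta=1e-3):
    # Direct O(N^2) particle-particle interaction: velocity induced at
    # each particle by every vortex (2D Biot-Savart), smoothed by delta.
    n = len(gamma)
    vel = np.zeros_like(pos)
    for i in range(n):
        dx = pos[i, 0] - pos[:, 0]
        dy = pos[i, 1] - pos[:, 1]
        r2 = dx**2 + dy**2 + delta**2       # self-term numerator is zero
        vel[i, 0] = np.sum(-gamma * dy / (2 * np.pi * r2))
        vel[i, 1] = np.sum( gamma * dx / (2 * np.pi * r2))
    return vel

# Co-rotating pair of unit-strength vortices.
pos = np.array([[-0.5, 0.0], [0.5, 0.0]])
gamma = np.array([1.0, 1.0])
vel = induced_velocity(pos, gamma)
```

Each particle requires a pass over all others, hence the quadratic cost that the paper's substepping and re-discretization strategies aim to contain.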
1735 Analyzing the Factors that Cause Parallel Performance Degradation in Parallel Graph-Based Computations Using Graph500
Authors: Mustafa Elfituri, Jonathan Cook
Abstract:
Recently, graph-based computations have become more important in large-scale scientific computing, as they provide a methodology for modeling many types of relations between independent objects. They are actively used in fields as varied as biology, social networks, cybersecurity, and computer networks. At the same time, graph problems have properties, such as irregularity and poor locality, that make their performance differ from that of regular applications. Therefore, parallelizing graph algorithms is a hard and challenging task. Initial evidence is that standard computer architectures do not perform very well on graph algorithms, and little is known about exactly what causes this. The Graph500 benchmark is a representative application for parallel graph-based computations, which have highly irregular data access and are driven more by traversing connected data than by computation. In this paper, we present results from analyzing the performance of various example implementations of Graph500, including a shared memory (OpenMP) version, a distributed (MPI) version, and a hybrid version. We measured and analyzed the factors that affect its performance in order to identify possible changes that would improve it. Results are discussed in relation to which factors contribute to performance degradation.
Keywords: graph computation, Graph500 benchmark, parallel architectures, parallel programming, workload characterization
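The irregular-access character of Graph500-style kernels is visible even in a serial level-synchronous BFS sketch, where the neighbor lookups jump unpredictably through the adjacency structure (the tiny graph below is illustrative):

```python
from collections import deque

def bfs_levels(adj, root):
    # Level-synchronous BFS, the core kernel of Graph500. The accesses
    # into adj follow the graph topology, not memory order, which is
    # the irregularity that hurts cache locality and parallel scaling.
    level = {root: 0}
    frontier = deque([root])
    while frontier:
        u = frontier.popleft()
        for v in adj[u]:
            if v not in level:
                level[v] = level[u] + 1
                frontier.append(v)
    return level

adj = {0: [1, 2], 1: [0, 3], 2: [0, 3], 3: [1, 2, 4], 4: [3]}
levels = bfs_levels(adj, 0)
```

Parallel versions (OpenMP, MPI, hybrid) partition the frontier, but the unpredictable neighbor accesses remain, which is what the paper's measurements probe.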
1734 LORA: A Learning Outcome Modelling Approach for Higher Education
Authors: Aqeel Zeid, Hasna Anees, Mohamed Adheeb, Mohamed Rifan, Kalpani Manathunga
Abstract:
To achieve constructive alignment in a higher education program, a clear set of learning outcomes must be defined. Traditional learning outcome definition techniques such as Bloom's taxonomy are not written to be utilized by the student. This can be disadvantageous in student-centric learning settings, where students are expected to formulate their own learning strategies. To solve the problem, we propose the learning outcome relation and aggregation (LORA) model. To achieve alignment, we developed learning outcome, assessment, and resource authoring tools which help teachers tag learning outcomes during creation. A pilot study was conducted with an expert panel of experienced professionals in the education domain to evaluate whether the LORA model and tools present an improvement over traditional methods. The panel unanimously agreed that the model and tools are beneficial and effective. Moreover, they helped the panel model learning outcomes in a more student-centric and descriptive way.
Keywords: learning design, constructive alignment, Bloom's taxonomy, learning outcome modelling
1733 Software Engineering Inspired Cost Estimation for Process Modelling
Authors: Felix Baumann, Aleksandar Milutinovic, Dieter Roller
Abstract:
Up to this point, business process management projects in general, and business process modelling projects in particular, could not rely on a practical and scientifically validated method to estimate cost and effort. In particular, the model development phase is not covered by any cost estimation method or model. Later phases of business process modelling, starting with implementation, are covered by initial solutions discussed in the literature. This article proposes to fill this gap by deriving a cost estimation method from available methods in a similar domain, namely software development or software engineering. Software development is, as we show, closely similar to process modelling. After the proposition of this method, different ideas for further analysis and validation of the method are proposed. We derive the method from COCOMO II and Function Points, which are established methods of effort estimation in the domain of software development. For this we lay out similarities between the software development process and the process of process modelling, which is a phase of the business process management life-cycle.
Keywords: COCOMO II, business process modeling, cost estimation method, BPM COCOMO
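The COCOMO II side of the proposed derivation can be sketched as follows; the constants A=2.94 and B=0.91 are the published COCOMO II.2000 calibration values as best recalled here, and the scale-factor and effort-multiplier inputs are illustrative.

```python
def cocomo_ii_effort(ksloc, scale_factors, effort_multipliers,
                     A=2.94, B=0.91):
    # Post-architecture COCOMO II: effort in person-months.
    # The exponent E grows with the summed scale factors, so larger
    # projects with poor precedentedness scale super-linearly.
    E = B + 0.01 * sum(scale_factors)
    pm = A * ksloc ** E
    for em in effort_multipliers:        # cost-driver multipliers
        pm *= em
    return pm

# Hypothetical 10 KSLOC project: five scale factors near nominal (~3.72),
# all effort multipliers nominal (1.0).
pm = cocomo_ii_effort(10.0, [3.72] * 5, [1.0])
```

Adapting this to process modelling, as the article proposes, would mean replacing KSLOC with a model-size measure (e.g., element counts) and recalibrating A, B, and the drivers.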
1732 To Know the Way to the Unknown: A Semi-Experimental Study on the Implication of Skills and Knowledge for Creative Processes in Higher Education
Authors: Mikkel Snorre Wilms Boysen
Abstract:
From a theoretical perspective, expertise is generally considered a precondition for creativity. The assumption is that an individual needs to master the common and accepted rules and techniques within a certain knowledge domain in order to create something new and valuable. However, real-life cases and a limited number of empirical studies demonstrate that this assumption may be overly simple. In this article, the question is explored through a number of semi-experimental case studies conducted within the fields of music, technology, and youth culture. The studies indicate that expertise plays an important part in creative processes in various ways. However, the case studies also indicate that expertise sometimes leads to an entrenched perspective, in the sense that knowledge and experience may work as a path into the well-known rather than into the unknown. These issues are explored with reference to different theoretical approaches to creativity and learning, including actor-network theory, the theory of blind variation and selective retention, and Csikszentmihalyi's systems model. Finally, some educational aspects and implications of this are discussed.
Keywords: creativity, expertise, education, technology
1731 Artificial Intelligence for Cloud Computing
Authors: Sandesh Achar
Abstract:
Artificial intelligence is being increasingly incorporated into many applications across various sectors such as health, education, security, and agriculture. Recently, there has been rapid development in cloud computing technology, resulting in AI being implemented in cloud computing to enhance and optimize the services rendered. The deployment of AI in cloud-based applications has brought about autonomous computing, whereby systems achieve stated results without human intervention. Despite the amount of research into autonomous computing, incorporating AI/ML into cloud computing to enhance its performance and resource allocation remains a fundamental challenge. This paper highlights different manifestations, roles, trends, and challenges related to AI-based cloud computing models, and reviews notable investigations and progress in the domain. Future directions are suggested for leveraging AI/ML in next-generation computing for emerging paradigms such as cloud environments. Adopting AI-based algorithms and techniques to increase operational efficiency, cost savings, and automation, to reduce energy consumption, and to solve complex cloud computing issues are the major findings outlined in this paper.
Keywords: artificial intelligence, cloud computing, deep learning, machine learning, internet of things
1730 Protection and Renewal Strategies of Historical Blocks from the Perspective of "Staged Authenticity"
Authors: Xu Yingqiang, Wang Zhongde
Abstract:
In the age of stock development, the contradiction between the protection and development of historical blocks in China has become increasingly prominent; how to reconcile the contradiction between tourists and local residents while inheriting urban culture is an important proposition. Based on this, this paper introduces the theory of "staged authenticity", traces its development and related research progress, constructs an analysis and research model of historical blocks based on this theory, and puts forward protection and renewal strategies for historical blocks from the perspective of staged authenticity, providing a theoretical basis for coordinating the tourism-residence contradiction and protecting urban character in the protection and renewal of historical blocks. The research holds that we should pay attention to the important value of "curtain" space, rationally arrange the "curtain" and divide "foreground" from "background"; extract "props" from real history and culture to restore the authenticity of "stage" scenes; cleverly arrange the tour streamline so that all scenes are connected in series rhythmically; and have the "actors" perform interactively in the "foreground" space, so as to enhance the "audience's" sense of scene substitution.
Keywords: historic block, protection and renewal, staged authenticity, curtain
1729 High School Gain Analytics from National Assessment Program – Literacy and Numeracy and Australian Tertiary Admission Rank Linkage
Authors: Andrew Laming, John Hattie, Mark Wilson
Abstract:
Nine Queensland independent high schools provided de-identified, student-matched ATAR and NAPLAN data for all 1217 ATAR graduates since 2020 who also sat NAPLAN at the school. Graduating cohorts from the nine schools contained a mean of 100 ATAR graduates with previous NAPLAN data from their school. Excluded were vocational students (mean=27) and any ATAR graduates without NAPLAN data (mean=20). Relative to predictions based on the Index of Community Socio-Educational Advantage (ICSEA), all schools had larger-than-predicted proportions of students graduating with ATARs. An additional 173 students (14%) did not release their ATARs to their school, requiring this data to be inferred by schools. Gain was established by first converting each student's strongest NAPLAN domain to a statewide percentile, then subtracting this result from the final ATAR. The resulting 'percentile shift' was corrected for plausible ATAR participation at each NAPLAN level. The strongest NAPLAN domain had the highest correlation with ATAR (R²=0.58).
RESULTS: School mean NAPLAN scores fitted ICSEA closely (R²=0.97). Schools achieved a mean cohort gain of two ATAR rankings, but only 66% of students gained. This ranged from 46% of top-NAPLAN-decile students gaining, rising to 75% achieving gains outside the top decile. The 54% of top-decile students whose ATAR fell short of prediction lost a mean of 4.0 percentiles (6.2 percentiles before correction for regression to the mean). 71% of students in smaller schools gained, compared to 63% in larger schools. NAPLAN variability in each of the 13 ICSEA-1100 cohorts was 17%, with both intra-school and inter-school variation of these values extremely low (0.3% to 1.8%). The mean ATAR change between years in each school was just 1.1 ATAR ranks. This suggests consecutive school cohorts and ICSEA-similar schools share very similar distributions and outcomes over time. Quantile analysis of the NAPLAN/ATAR relation revealed heteroscedasticity, but splines offered little additional benefit over simple linear regression. The NAPLAN/ATAR R² was 0.33.
DISCUSSION: Standardised data like NAPLAN and ATAR offer educators a simple, no-cost progression metric for analysing performance in conjunction with internal test results. Change is expressed in percentiles, or ATAR shift per student, which is intuitive to laypeople. The findings may also reduce ATAR/vocational stream mismatch, reveal the proportions of cohorts meeting or falling short of expectation, and demonstrate by how much. Finally, 'crashed' ATARs well below expectation are revealed, which schools can reasonably work to minimise. The percentile shift method is neither a value-added measure nor a growth percentile. In the absence of exit NAPLAN testing, this metric cannot discriminate academic gain from legitimate ATAR-maximising strategies. But by controlling for ICSEA, ATAR proportion variation, and student mobility, it uncovers progression-to-ATAR metrics which are not currently publicly available. However achieved, ATAR maximisation is a sought-after private good. So long as standardised nationwide data is available, this analysis offers useful analytics for educators and reasonable predictivity when counselling subsequent cohorts about their ATAR prospects.
Keywords: NAPLAN, ATAR, analytics, measurement, gain, performance, data, percentile, value-added, high school, numeracy, reading comprehension, variability, regression to the mean
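The percentile shift calculation itself is simple to sketch; the score distribution and student values below are hypothetical, and real use would apply the participation correction described in the abstract.

```python
import numpy as np

def percentile_shift(naplan_scores, statewide_scores, atar):
    # Convert each student's strongest NAPLAN score to a statewide
    # percentile, then subtract it from the final ATAR (itself a
    # percentile-like rank out of ~100).
    pct = np.array([np.mean(statewide_scores <= s) * 100.0
                    for s in naplan_scores])
    return atar - pct

statewide = np.arange(300, 701)             # hypothetical score distribution
students = np.array([500, 600])             # strongest-domain NAPLAN scores
atars = np.array([60.0, 85.0])
shift = percentile_shift(students, statewide, atars)
```

A positive shift means the student's ATAR exceeded their NAPLAN-implied statewide standing; aggregating shifts over a cohort gives the school-level gain the paper reports.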
1728 Segmentation of Arabic Handwritten Numeral Strings Based on Watershed Approach
Authors: Nidal F. Shilbayeh, Remah W. Al-Khatib, Sameer A. Nooh
Abstract:
Offline Arabic handwriting recognition is considered one of the most challenging topics. Arabic handwritten numeral strings are used to automate systems that deal with numbers, such as postal codes, bank account numbers, and numbers on car plates. Segmentation of connected numerals is the main bottleneck in a handwritten numeral recognition system; resolving it can in turn increase the speed and efficiency of the recognition system. In this paper, we propose algorithms for automatic segmentation and feature extraction of Arabic handwritten numeral strings based on the watershed approach. The algorithms have been designed and implemented to achieve the main goal of segmenting and extracting strings of numeral digits written by hand, especially in the courtesy amount of bank checks. The segmentation algorithm partitions the string into multiple regions that can be associated with the properties of one or more criteria. The numeral extraction algorithm extracts the digits of the numeral string into separate individual digits. Both the segmentation and feature extraction algorithms have been tested successfully and efficiently for all types of numerals.
Keywords: handwritten numerals, segmentation, courtesy amount, feature extraction, numeral recognition
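A full watershed implementation is beyond a sketch, but the simpler projection-profile split below conveys the segmentation step: cutting a binary numeral-string image at ink-free columns. This is a stand-in technique, not the paper's watershed algorithm, and it handles only digits separated by at least one blank column.

```python
import numpy as np

def split_numeral_string(img):
    # Splits a binary numeral-string image (1 = ink) into per-digit
    # sub-images at columns containing no ink.
    ink = img.sum(axis=0)                  # vertical projection profile
    cols = ink > 0
    digits, start = [], None
    for j, on in enumerate(cols):
        if on and start is None:
            start = j                      # digit begins
        elif not on and start is not None:
            digits.append(img[:, start:j]) # digit ends at blank column
            start = None
    if start is not None:
        digits.append(img[:, start:])      # digit reaching the right edge
    return digits

# Two synthetic "digits" separated by a blank column.
img = np.zeros((5, 7), dtype=int)
img[:, 1:3] = 1
img[:, 4:6] = 1
parts = split_numeral_string(img)
```

Touching or overlapping digits, the case the watershed approach targets, defeat this profile method, which is precisely why the paper needs the stronger technique.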
1727 Driver Behavior Analysis and Inter-Vehicular Collision Simulation Approach
Authors: Lu Zhao, Nadir Farhi, Zoi Christoforou, Nadia Haddadou
Abstract:
The safety testing of intelligent connected vehicles (ICVs) deployed on the road network is a critical challenge. Road traffic network simulation can be used to test the functionality of ICVs; it is not only time-saving and less energy-consuming but can also create scenarios with car collisions. However, the relationship between different human driver behaviors and the occurrence of car collisions has not been clearly understood; meanwhile, the procedure for generating car collisions in numerical traffic simulators is not fully integrated. In this paper, we propose an approach to identify specific driver profiles from real driving data and then replicate them in numerical traffic simulations with the purpose of generating inter-vehicular collisions. We propose three profiles: (i) 'aggressive', with a short time headway; (ii) 'inattentive', with a long reaction time; and (iii) 'normal', with intermediate values of reaction time and time headway. These three driver profiles are extracted from the NGSIM dataset and simulated using the intelligent driver model (IDM), extended with a reaction time. Finally, inter-vehicular collisions are generated by varying the percentages of the different profiles.
Keywords: vehicular collisions, human driving behavior, traffic modeling, car-following models, microscopic traffic simulation
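The IDM acceleration used for the profiles can be sketched directly; the parameter values are typical IDM defaults (assumptions, not the calibrated NGSIM values), and only the time headway T is varied between the 'aggressive' and 'normal' profiles. The reaction-time extension would delay the inputs to this formula rather than change it.

```python
import math

def idm_accel(v, v_lead, gap, v0=33.3, T=1.5, a_max=1.0, b=2.0, s0=2.0, delta=4):
    # Intelligent Driver Model acceleration. T (desired time headway) is
    # the knob distinguishing the 'aggressive' profile; v0 is the desired
    # speed, a_max/b the comfortable acceleration/deceleration, s0 the
    # minimum jam gap.
    dv = v - v_lead
    s_star = s0 + v * T + v * dv / (2.0 * math.sqrt(a_max * b))
    return a_max * (1.0 - (v / v0) ** delta - (max(s_star, 0.0) / gap) ** 2)

# Identical kinematic state, two time-headway profiles.
a_aggressive = idm_accel(v=20.0, v_lead=20.0, gap=40.0, T=0.8)
a_normal = idm_accel(v=20.0, v_lead=20.0, gap=40.0, T=1.5)
```

At the same gap and speed, the short-headway driver still accelerates strongly while the normal driver is nearly settled, which is how headway alone reshapes collision risk in the simulations.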
Procedia PDF Downloads 169
1726 Analysis of Cyber Activities of Potential Business Customers Using Neo4j Graph Databases
Authors: Suglo Tohari Luri
Abstract:
Data analysis is an important aspect of business performance. With the application of artificial intelligence within databases, selecting a suitable database engine for an application design is also crucial for business data analysis. The application of business intelligence (BI) software on top of graph databases such as Neo4j has proved highly effective for customer data analysis. Yet what remains of great concern is that not all business organizations have BI software to use for customer data analysis, and those that do often lack personnel with the requisite expertise to use it effectively with the Neo4j database. The purpose of this research is to demonstrate how Neo4j program code alone can be applied to the analysis of e-commerce website customer visits. Because the Neo4j database engine is optimized for handling and managing data relationships, with the capability of building high-performance and scalable systems over connected data nodes, business owners who advertise their products on websites backed by Neo4j can determine the number of visitors, and thus which products are visited at routine intervals, for the necessary decision making. It also helps identify the best customer segments for specific goods, so that their advertisement on the said websites can be emphasized. Keywords: data, engine, intelligence, customer, neo4j, database
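To make the visit-counting idea concrete, the sketch below models customer visits as (customer, product) relationship pairs, as they might be stored in Neo4j as (:Customer)-[:VISITED]->(:Product) edges, and tallies visits per product. The node labels, relationship type, property names, and sample data are all assumptions for illustration; the Cypher string shows the equivalent graph query but is not executed here.

```python
from collections import Counter

# Hypothetical visit log: (customer, product) pairs, standing in for
# (:Customer)-[:VISITED]->(:Product) relationships in a Neo4j graph.
visits = [
    ("alice", "lipstick"), ("bob", "lipstick"),
    ("alice", "mascara"), ("carol", "lipstick"),
]

# The equivalent Cypher aggregation (illustrative only, not run here):
CYPHER = """
MATCH (:Customer)-[:VISITED]->(p:Product)
RETURN p.name AS product, count(*) AS visits
ORDER BY visits DESC
"""

def visit_counts(edges):
    """Count how many visits each product node received."""
    return Counter(product for _, product in edges)

counts = visit_counts(visits)
```

The per-product counts are exactly what a business owner would inspect to decide which products to emphasize in advertising.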
Procedia PDF Downloads 193
1725 Electromagnetic Wave Propagation Equations in 2D by Finite Difference Method
Authors: N. Fusun Oyman Serteller
Abstract:
In this paper, techniques for solving time-dependent electromagnetic wave propagation equations with the Finite Difference Method (FDM) are proposed and compared with Finite Element Method (FEM) results in 2D, while some special simulation examples are discussed. Here, 2D dynamic wave equations for lossy media, even with a constant source, are discussed in order to establish symbolic manipulation of wave propagation problems. The main objective of this contribution is to present a comparative study of two suitable numerical methods and to show that both can be applied effectively and efficiently to all types of wave propagation problems, both linear and nonlinear, by using symbolic computation. The results show, however, that the FDM is more appropriate for solving the nonlinear cases symbolically. Furthermore, some specific complex-domain examples comparing the electromagnetic wave equations are considered. Calculations are performed in the Mathematica software, contributing useful routines to the program and leveraging symbolic evaluations of FEM and FDM. Keywords: finite difference method, finite element method, linear-nonlinear PDEs, symbolic computation, wave propagation equations
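A minimal numerical (rather than symbolic) sketch of the FDM idea for a lossy 2D wave equation is given below: one explicit leapfrog time step for u_tt + sigma*u_t = c^2 (u_xx + u_yy) on a grid with zero boundaries. Grid sizes, the damping coefficient sigma, and the time step are illustrative assumptions, not values from the paper.

```python
import numpy as np

def fdtd_step(u, u_prev, c=1.0, dx=1.0, dt=0.5, sigma=0.01):
    """One explicit FDM time step for the lossy 2D wave equation
    u_tt + sigma * u_t = c^2 (u_xx + u_yy), with fixed (zero) boundaries.
    dt is chosen so the Courant number c*dt/dx <= 1/sqrt(2) (2D stability).
    """
    # five-point Laplacian on interior nodes
    lap = np.zeros_like(u)
    lap[1:-1, 1:-1] = (u[2:, 1:-1] + u[:-2, 1:-1] + u[1:-1, 2:] + u[1:-1, :-2]
                       - 4 * u[1:-1, 1:-1]) / dx**2
    # central differences in time; the sigma term damps the wave (lossy medium)
    u_next = (2 * u - (1 - sigma * dt / 2) * u_prev
              + (c * dt) ** 2 * lap) / (1 + sigma * dt / 2)
    u_next[0, :] = u_next[-1, :] = u_next[:, 0] = u_next[:, -1] = 0.0
    return u_next
```

Stepping this update from an initial pulse propagates a gradually decaying circular wavefront; the same stencil logic is what a symbolic system like Mathematica would manipulate analytically.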
Procedia PDF Downloads 145
1724 Investigating Customer Engagement through the Prism of Congruity Theory
Authors: Jamid Ul Islam, Zillur Rahman
Abstract:
The impulse for customer engagement research in online brand communities (OBCs) is widely acknowledged in the literature. Applying congruity theory, this study proposes a model of customer engagement by examining how two congruities, self-brand image congruity and value congruity, influence customers' engagement in online brand communities. The consequent effect of customer engagement on brand loyalty is also studied. Data were collected through a questionnaire survey of 395 students of a higher educational institute in India who were active on Facebook and followed at least one brand community. The data were analyzed using structural equation modeling. The results revealed that both types of congruity, self-brand image congruity and value congruity, significantly affect customer engagement. A positive effect of customer engagement on brand loyalty was also confirmed. This study integrates and broadens extant explanations of different congruity effects on consumer behavior, an area that has received little attention. It is expected to suggest new ways to engage customers in online brand communities and to offer realistic insights for the domain of social media marketing. Keywords: congruity theory, customer engagement, Facebook, online brand communities
Procedia PDF Downloads 347
1723 Bosporus Evolution: Its Role in the Black Sea Forming
Authors: I. V. Kuzminov
Abstract:
This research addresses the evolution of the Bosporus and its key role in the forming of the Black Sea. To date, there is no distinct picture of the historical and geographical events of the last 10 thousand years on the territory from the Altai up to the Alps. The present article attempts to clarify these events and, moreover, to link the proposed version to the historical and climatic record of this period. The paper develops the basic idea stated in 'Hypothesis on the Black Sea origin', and the succession of events is presented here in dynamics. The article shows that fluctuation of the level of the World Ocean mirrors the basic events connected with the climate on the Earth on the one hand and with hydraulic processes on the other. It is assumed that during the formation of the passage there were several cycles of change in the level of the World Ocean. The beginning of a warming phase is characterized by a rise in the level of inland water bodies along the meltwater runoff path and an increase in the World Ocean level. The end of the warming phase is characterized by a continuing rise in the level of the World Ocean and the drying up of inland water bodies deprived of meltwater replenishment. Keywords: Bosporus, Ryan-Pitman hypothesis, fluctuations of the World Ocean level, the Paratethys Sea, catastrophic breakthrough
Procedia PDF Downloads 109
1722 Connecting Students and Faculty Research Efforts through the Research and Projects Portal
Authors: Havish Nalapareddy, Mark V. Albert, Ranak Bansal, Avi Udash, Lin Lin
Abstract:
Students engage in many course projects during their degree programs. However, impactful projects often need a time frame longer than a single semester. Ideally, projects are documented and structured to be readily accessible to future students who may choose to continue them, with features that emphasize the local community, university, or course structure. The Research and Projects Portal (RAPP) is a place where students can post both their completed and ongoing projects, together with all the resources and tools used. The portal allows students to see what other students have done in the past, in the same university environment, in their domain of interest. Computer science instructors or students selecting projects can use the portal to assign or choose an incomplete project. Additionally, it allows non-computer-science faculty and industry collaborators to document project ideas for students in courses to prototype directly, rather than soliciting the help of instructors to engage students. RAPP thus serves as a platform linking students across classes and faculty both in and out of computer science on joint projects, encouraging long-term project efforts across semesters or years. Keywords: education, technology, research, academic portal
Procedia PDF Downloads 136
1721 A Hybrid Digital Watermarking Scheme
Authors: Nazish Saleem Abbas, Muhammad Haris Jamil, Hamid Sharif
Abstract:
Digital watermarking is a technique that allows an individual to add and hide secret information, a copyright notice, or another verification message inside digital audio, video, or images. Today, with the advancement of technology, modern healthcare systems in many countries manage patients' diagnostic information digitally. When transmitted between hospitals over the internet, medical data become vulnerable to attacks and require security and confidentiality. Digital watermarking techniques are used to ensure the authenticity, security, and management of medical images and related information. This paper proposes a watermarking technique that embeds a watermark in medical images imperceptibly and securely. In this work, digital watermarking of medical images is carried out using the Least Significant Bit (LSB) method with the Discrete Cosine Transform (DCT). The proposed embedding and extraction of the watermark are performed in the frequency domain, applying the LSB method via an XOR operation. The quality of the watermarked medical image is measured by the peak signal-to-noise ratio (PSNR). It was observed that the watermarked medical image obtained by the XOR operation between DCT coefficients and LSBs survived a compression attack with a PSNR of up to 38.98. Keywords: watermarking, image processing, DCT, LSB, PSNR
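A simplified sketch of the LSB/XOR embedding idea is shown below, operating directly on integer sample values; the paper applies the same principle to DCT coefficients, and that transform step is omitted here. Function names, the key stream, and sample data are illustrative assumptions.

```python
import math

def embed_lsb_xor(pixels, watermark_bits, key_bits):
    """Embed watermark bits into the samples' LSBs, XOR-masked with a key
    stream (spatial-domain sketch; the paper works on DCT coefficients).
    pixels, watermark_bits, key_bits: equal-length sequences of ints.
    """
    return [(p & ~1) | (w ^ k) for p, w, k in zip(pixels, watermark_bits, key_bits)]

def extract_lsb_xor(pixels, key_bits):
    """Recover the watermark bits by undoing the XOR mask on each LSB."""
    return [(p & 1) ^ k for p, k in zip(pixels, key_bits)]

def psnr(orig, marked, peak=255):
    """Peak signal-to-noise ratio in dB between two equal-length signals."""
    mse = sum((a - b) ** 2 for a, b in zip(orig, marked)) / len(orig)
    return float("inf") if mse == 0 else 10 * math.log10(peak ** 2 / mse)
```

Because only LSBs change, each sample moves by at most 1, which keeps the PSNR high and the watermark imperceptible.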
Procedia PDF Downloads 46
1720 SPARK: An Open-Source Knowledge Discovery Platform That Leverages Non-Relational Databases and Massively Parallel Computational Power for Heterogeneous Genomic Datasets
Authors: Thilina Ranaweera, Enes Makalic, John L. Hopper, Adrian Bickerstaffe
Abstract:
Data are the primary asset of biomedical researchers and the engine for both discovery and research translation. As the volume and complexity of research datasets increase, especially with new technologies such as large single nucleotide polymorphism (SNP) chips, so too does the requirement for software to manage, process, and analyze the data. Researchers often need to execute complicated queries and conduct complex analyses of large-scale datasets. Existing tools for analyzing such data, and other types of high-dimensional data, unfortunately suffer from one or more major problems: they typically require a high level of computing expertise, are too simplistic (i.e., do not fit realistic models that allow for complex interactions), are limited by computing power, do not exploit large-scale parallel architectures (e.g., supercomputers or GPU clusters), or are limited in the types of analysis available, compounded by the fact that integrating new analysis methods is not straightforward. Solutions to these problems, such as those developed and implemented on parallel architectures, are currently available to only a relatively small portion of medical researchers with the necessary access and know-how. The past decade has seen a rapid expansion of data management systems for the medical domain, with much attention given to systems that manage the phenotype datasets generated by medical studies. The introduction of heterogeneous genomic data for the research subjects residing in these systems has highlighted the need for substantial improvements in software architecture. To address this problem, we have developed SPARK, an enabling and translational system for medical research that leverages existing high-performance computing resources and analysis techniques currently available or under development. It builds these into The Ark, an open-source web-based system designed to manage medical data.
SPARK provides a next-generation biomedical data management solution based upon a novel micro-service architecture and Big Data technologies, and serves to demonstrate the applicability of micro-service architectures to the development of high-performance computing applications. When applied to high-dimensional medical datasets such as genomic data, relational data management approaches with normalized data structures suffer from unfeasibly high execution times for basic operations such as inserts (i.e., importing a GWAS dataset) and for the queries typical of the genomics research domain. SPARK resolves these problems by incorporating the non-relational NoSQL databases that have emerged with Big Data. SPARK provides researchers across the world with user-friendly access to state-of-the-art data management and analysis tools while eliminating the need for high-level informatics and programming skills. The system will benefit health and medical research by eliminating the burden of large-scale data management, querying, cleaning, and analysis. SPARK represents a major advancement in genome research technologies, vastly reducing the burden of working with genomic datasets and enabling cutting-edge analysis approaches that have previously been out of reach for many medical researchers. Keywords: biomedical research, genomics, information systems, software
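The relational-vs-NoSQL contrast for a GWAS import can be sketched in miniature: a fully normalized layout needs one record per (subject, SNP) genotype call, while a document-style layout stores one document per subject. All schema and data below are hypothetical, chosen only to show how the record counts diverge as the dataset grows.

```python
# Illustrative comparison (schema and values are hypothetical): importing
# an n_subjects x n_snps matrix of genotype calls (coded 0/1/2).
n_subjects, n_snps = 3, 4
calls = [[(s * n_snps + v) % 3 for v in range(n_snps)] for s in range(n_subjects)]

# Normalized relational layout: one row per (subject, SNP) pair,
# so an import performs n_subjects * n_snps inserts.
rows = [(s, snp, calls[s][snp]) for s in range(n_subjects) for snp in range(n_snps)]

# Document-style (NoSQL) layout: one document per subject holding the
# whole genotype vector, so an import performs only n_subjects inserts.
docs = [{"subject": s, "genotypes": calls[s]} for s in range(n_subjects)]
```

With millions of SNPs per chip, the row count of the normalized layout explodes while the document count stays at the number of subjects, which is the scaling argument behind SPARK's choice of non-relational storage.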
Procedia PDF Downloads 269
1719 A Comprehensive Review of Electronic Health Records Implementation in Healthcare
Authors: Lateefat Amao, Misagh Faezipour
Abstract:
Implementing electronic health records (EHR) in healthcare is a pivotal transition aimed at digitizing and optimizing the management of patient health information. The expectations associated with this transition are high, as they are for other health information systems (HIS) and health technologies. This multifaceted process involves careful planning and execution to improve the quality and efficiency of patient care, especially as healthcare technology is a sensitive niche. Key considerations include a thorough needs assessment, judicious vendor selection, robust infrastructure development, and the training and adaptation of healthcare professionals. Comprehensive training programs, data migration from legacy systems, interoperability, and security and regulatory compliance are imperative for healthcare staff to navigate EHR systems adeptly. The purpose of this work is to offer a comprehensive review of the literature on EHR implementation. It explores the impact of this health technology on health practices, highlights challenges and barriers to its successful use, and offers practical strategies that can contribute to its success in healthcare. The paper provides a thorough review of studies on the adoption of EHRs, emphasizing the wide range of experiences and results connected with EHR use in the medical field, especially across different types of healthcare organizations. Keywords: healthcare, electronic health records, EHR implementation, patient care, interoperability
Procedia PDF Downloads 79
1718 Experimental and Analytical Study of Various Types of Shear Connector Used for Cold-Formed Steel-Ferrocement Composite Beam
Authors: Talal Alhajri, Mahmood M. Tahir, Khaled Alenezi, Mohamad Ragaee
Abstract:
This work presents experimental tests carried out to evaluate the behaviour of different types of shear connectors proposed for cold-formed steel (CFS) sections integrated with a ferrocement slab, for potential use in composite beams. Ten push-out test specimens of cold-formed steel lipped channel sections connected with ferrocement slabs were tested. Three types of shear connectors were studied: bolts, self-drilling screws, and bar angles. The connection behaviour is analysed in terms of its load-slip relationship and failure mode. Parametric studies were performed to investigate the effect on shear connector capacity of varying the number of layers of wire mesh used in the ferrocement slab and the type of shear connector used. A numerical analysis using the ANSYS program and a theoretical analysis (Eurocode 4) were carried out to verify the experimental results. The experimental, theoretical, and numerical values proved to be in good agreement with each other. Keywords: cold-formed steel, composite beam, ferrocement, finite element method, push-out test, shear connector
Procedia PDF Downloads 360
1717 Sentiment Analysis of Ensemble-Based Classifiers for E-Mail Data
Authors: Muthukumarasamy Govindarajan
Abstract:
Detection of unwanted, unsolicited mail, called spam, in email is an interesting area of research. It is necessary to evaluate the performance of any new spam classifier on standard datasets. Recently, ensemble-based classifiers have gained popularity in this domain. In this research work, an efficient email filtering approach based on ensemble methods is presented for developing an accurate and sensitive spam classifier. The proposed approach employs Naive Bayes (NB), Support Vector Machine (SVM), and Genetic Algorithm (GA) as base classifiers along with different ensemble methods. The experimental results show that the ensemble classifiers achieve higher accuracy than the individual classifiers, and that the hybrid model results are better than those of the combined models on the e-mail dataset. The proposed ensemble-based classifiers turn out to be good in terms of classification accuracy, which is an important criterion for building a robust spam classifier. Keywords: accuracy, arcing, bagging, genetic algorithm, Naive Bayes, sentiment mining, support vector machine
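The simplest way an ensemble combines base classifiers is a majority vote over their labels, which the sketch below illustrates in pure Python. The three base "models" are hypothetical keyword rules standing in for trained NB, SVM, and GA-based classifiers; they are not the paper's models.

```python
from collections import Counter

def majority_vote(classifiers, message):
    """Combine base classifiers' spam/ham labels by simple majority vote,
    the basic mechanism behind voting/bagging-style ensembles.
    Each classifier is a function: message -> 'spam' or 'ham'.
    """
    votes = Counter(clf(message) for clf in classifiers)
    return votes.most_common(1)[0][0]

# Toy stand-ins for trained NB / SVM / GA models (hypothetical rules):
nb  = lambda m: "spam" if "free" in m else "ham"
svm = lambda m: "spam" if "winner" in m else "ham"
ga  = lambda m: "spam" if "$$$" in m else "ham"

label = majority_vote([nb, svm, ga], "free winner announcement")
# two of the three base classifiers flag the message, so the ensemble
# labels it 'spam' even though one base classifier disagrees
```

The vote masks individual base-classifier errors, which is why ensemble accuracy typically exceeds that of any single base classifier.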
Procedia PDF Downloads 140
1716 R Data Science for Technology Management
Authors: Sunghae Jun
Abstract:
Technology management (TM) is an important issue for a company seeking to improve its competitiveness. Among the many activities of TM, technology analysis (TA) is an important factor, because most decisions on the management of technology are based on TA results. TA analyzes the development of a target technology using statistics or the Delphi method. TA based on Delphi depends on experts' domain knowledge; in comparison, TA by statistics and machine learning algorithms uses objective data, such as patents or papers, instead of expert knowledge. Many quantitative TA methods based on statistics and machine learning have been studied and applied to technology forecasting, technological innovation, and the management of technology. They apply diverse computing tools and analytical methods case by case, and it is not easy to select suitable software and statistical methods for a given TA task. So, in this paper, we propose a methodology for quantitative TA that uses the statistical computing software R and data science to construct a general framework for TA. Through a case study, we also show how our methodology is applied in a real-world setting. This research contributes to R&D planning and technology valuation in TM. Keywords: technology management, R system, R data science, statistics, machine learning
Procedia PDF Downloads 457