Search results for: computational techniques
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 8102

8042 Applying Personnel Resilience and Emotional Agitation in Occupational Health and Safety Education and Training

Authors: M. Jayandran

Abstract:

Continual professional development is an important concept for safety professionals who wish to strengthen their knowledge base and achieve the required qualifications or international memberships within a given time. However, the main problems observed among most safety aspirants are lack of focus, inferiority or superiority complexes, lack of interest and lethargy, family and off-the-job stress, health issues, drug and alcohol use, and absenteeism. An HSE trainer should therefore be an expert in soft skills and in stress and emotion handling techniques, so as to manage such aspirants during training. To do so, a trainer has to equip himself with a few soft skills, like personnel resilience, mnemonic techniques, mind healing, and subconscious suggestion techniques, integrated with the emotional intelligence quotient of the aspirants. By adopting these techniques, a trainer can deliver the course successfully and influence different types of audiences to achieve success in training.

Keywords: personnel resilience, mnemonic techniques, mind healing, subconscious suggestion techniques

Procedia PDF Downloads 279
8041 The Effect of Program Type on Mutation Testing: Comparative Study

Authors: B. Falah, N. E. Abakouy

Abstract:

Due to its high computational cost, mutation testing has been neglected by researchers. Recently, many cost and mutant reduction techniques have been developed, improved, and experimented with, but few of them have related the possibility of reducing the cost of mutation testing to the program type of the application under test. This paper is a comparative study of four operator selection techniques (mutant sampling, class-level operators, method-level operators, and selection of all operators) based on the program code type of each application under test. It aims at finding an alternative approach to reveal the effect of code type on the mutation score. The results of our experiment show that the program code type can affect the mutation score and that programs using polymorphism are best suited to be tested with mutation testing.
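
The abstract includes no code; purely as an illustration of the mutant sampling technique it compares, the following Python sketch shows random mutant sampling and the standard mutation score (killed mutants over non-equivalent mutants). All names and numbers are hypothetical, not taken from the paper.

```python
import random

def mutation_score(killed, total, equivalent):
    """Mutation score: killed mutants divided by the non-equivalent mutants."""
    return killed / (total - equivalent)

def sample_mutants(mutants, rate=0.1, seed=42):
    """Random mutant sampling: keep only a fraction of the generated mutants
    to reduce the cost of running the test suite against every mutant."""
    rng = random.Random(seed)
    k = max(1, int(len(mutants) * rate))
    return rng.sample(mutants, k)

# Hypothetical run: 200 generated mutants, 10% sampled, 15 killed, 2 equivalent.
selected = sample_mutants(list(range(200)), rate=0.1)
print(len(selected), mutation_score(killed=15, total=len(selected), equivalent=2))
```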

Keywords: equivalent mutant, killed mutant, mutation score, mutation testing, program code type, software testing

Procedia PDF Downloads 527
8040 Improvement of the Aerodynamic Behaviour of a Land Rover Discovery 4 in Turbulent Flow Using Computational Fluid Dynamics (CFD)

Authors: Ahmed Al-Saadi, Ali Hassanpour, Tariq Mahmud

Abstract:

The main objective of this study is to investigate ways to reduce the aerodynamic drag coefficient and to increase the stability of a full-size Sport Utility Vehicle using three-dimensional Computational Fluid Dynamics (CFD) simulation. The baseline model in the simulation was the Land Rover Discovery 4. Many aerodynamic devices and external design modifications were used in this study. These drag-reduction techniques were tested individually or in combination to obtain the best design. All new models have the same capacity and comfort as the baseline model. A uniform freestream air velocity at the inlet, ranging from 28 m/s to 40 m/s, was used. ANSYS Fluent software (version 16.0) was used to simulate all models. The drag coefficient obtained from ANSYS Fluent for the baseline model was validated against experimental data. It is found that the use of modern aerodynamic add-on devices and modifications has a significant effect in reducing the aerodynamic drag coefficient.
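
The abstract does not give the defining formula; for reference, the drag coefficient reported by such simulations is conventionally Cd = 2·Fd / (ρ·v²·A). The small Python helper below evaluates it for hypothetical values (none taken from the paper).

```python
def drag_coefficient(drag_force, air_density, velocity, frontal_area):
    """Conventional drag coefficient: Cd = 2*Fd / (rho * v^2 * A)."""
    return 2.0 * drag_force / (air_density * velocity**2 * frontal_area)

# Hypothetical values for an SUV-sized body at 40 m/s (illustrative only).
print(drag_coefficient(drag_force=950.0, air_density=1.225,
                       velocity=40.0, frontal_area=2.9))
```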

Keywords: aerodynamics, RANS, sport utility vehicle, turbulent flow

Procedia PDF Downloads 288
8039 Experimental, Computational Fluid Dynamics and Theoretical Study of Cyclone Performance Based on Inlet Velocity and Particle Loading Rate

Authors: Sakura Ganegama Bogodage, Andrew Yee Tat Leung

Abstract:

This paper describes an experimental, Computational Fluid Dynamics (CFD) and theoretical analysis of cyclone performance, operated at a solid loading rate of 1.0 g/m³ and at two different inlet velocities (5 m/s and 10 m/s). Comparing the experimental results with the theoretical and CFD simulation results, it is evident that the influence of solids on the process flow is more significant than expected. Experimental studies of the gas-solid flows in cyclone separators are complicated, as they require advanced, sensitive measuring techniques, especially for the flow characteristics. Thus, CFD modelling and theoretical analysis are economical ways of analyzing cyclone separator performance, but a detailed clarification of their application to cyclone separator performance evaluation has not yet been discussed. The present study shows the limitations of the influencing parameters of the CFD and theoretical approaches by comparing them with the experimental results and with the flow characteristics obtained from CFD modelling.
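
For reference only (the correlation and all numbers below are generic assumptions, not the paper's data), a common theoretical estimate of cyclone pressure drop uses an Euler number, ΔP = Eu·ρ·v_in²/2, which makes explicit why doubling the inlet velocity roughly quadruples the pressure drop.

```python
def cyclone_pressure_drop(euler_number, gas_density, inlet_velocity):
    """Theoretical pressure drop estimate: dP = Eu * rho * v_in^2 / 2 (Pa)."""
    return euler_number * gas_density * inlet_velocity**2 / 2.0

# Hypothetical Euler number for a generic cyclone geometry (not from the paper).
for v in (5.0, 10.0):  # the two inlet velocities studied in the abstract
    print(v, cyclone_pressure_drop(euler_number=6.0, gas_density=1.2, inlet_velocity=v))
```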

Keywords: cyclone performance, inlet velocity, pressure drop, solid loading rate

Procedia PDF Downloads 202
8038 Comparative Study and Parallel Implementation of Stochastic Models for Pricing of European Options Portfolios Using Monte Carlo Methods

Authors: Vinayak Bassi, Rajpreet Singh

Abstract:

Over the years, with the emergence of sophisticated computers and algorithms, finance has been quantified using computational prowess. Asset valuation has been one of the key components of quantitative finance. In fact, it has become one of the embryonic steps in determining the risk related to a portfolio, the main goal of quantitative finance. This study draws a comparison between the valuation outputs generated by two stochastic dynamic models, namely the Black-Scholes model and Dupire’s bi-dimensional model. Both of these models are formulated for computing the valuation function for a portfolio of European options using Monte Carlo simulation methods. Although Monte Carlo algorithms have a slower convergence rate than calculus-based simulation techniques (like FDM), they work quite effectively for high-dimensional dynamic models. A fidelity gap is analyzed between the static (historical) and stochastic inputs for a sample portfolio of underlying assets. In order to enhance the performance efficiency of the model, the study emphasizes the use of variance reduction methods and customized random number generators to implement parallelization. An attempt has been made to further implement Dupire’s model on a GPU to achieve higher computational performance. Furthermore, ideas are discussed around performance enhancement and bottleneck identification related to the implementation of option-pricing models on GPUs.
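
The abstract does not reproduce the pricing code; as a minimal, hedged sketch of the general technique it describes (not the authors' parallel implementation), the Python snippet below prices a single European call under Black-Scholes dynamics with Monte Carlo simulation and antithetic variates as a simple variance reduction method. All contract parameters are hypothetical.

```python
import numpy as np

def mc_european_call(s0, strike, rate, sigma, maturity, n_paths=100_000, seed=0):
    """Monte Carlo price of a European call under geometric Brownian motion,
    using antithetic variates as a basic variance reduction technique."""
    rng = np.random.default_rng(seed)
    z = rng.standard_normal(n_paths // 2)
    z = np.concatenate([z, -z])                      # antithetic pairs
    drift = (rate - 0.5 * sigma**2) * maturity
    st = s0 * np.exp(drift + sigma * np.sqrt(maturity) * z)
    payoff = np.maximum(st - strike, 0.0)
    return np.exp(-rate * maturity) * payoff.mean()

# Hypothetical contract (spot 100, strike 105, 3% rate, 20% vol, 1 year).
print(mc_european_call(s0=100.0, strike=105.0, rate=0.03, sigma=0.2, maturity=1.0))
```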

Keywords: monte carlo, stochastic models, computational finance, parallel programming, scientific computing

Procedia PDF Downloads 135
8037 Perspectives of Computational Modeling in Sanskrit Lexicons

Authors: Baldev Ram Khandoliyan, Ram Kishor

Abstract:

India has a classical tradition of Sanskrit lexicons, and research work has been done on the study of Indian lexicography. India has seen amazing strides in Information and Communication Technology (ICT) applications for Indian languages in general and for Sanskrit in particular. Since machine translation from Sanskrit to other Indian languages is often the desired goal, traditional Sanskrit lexicography has attracted a lot of attention from the ICT and computational linguistics community. From the Nighaṇṭu and Nirukta to the Amarakośa and Medinīkośa, Sanskrit owns a rich history of lexicography. As these kośas do not follow the same typology or standard in the selection and arrangement of the words and of the information related to them, several types of kośa styles have emerged in this tradition. The model of grammar given by the Aṣṭādhyāyī is well appreciated by Indian and Western linguists and grammarians, but the different models provided by the lexicographic tradition are also important. The general usefulness of traditional Sanskrit kośas, that is, of most of the material made available in the texts, has been discussed by some scholars, and some have also discussed the arrangement of the lexica. This paper aims to discuss further uses of the different models of Sanskrit lexicography, focusing especially on their computational modeling and its use in different computational operations.

Keywords: computational lexicography, Sanskrit lexicons, nighaṇṭu, kośa, Amarakośa

Procedia PDF Downloads 133
8036 Quasi-Periodicity of Tonic Intervals in the Octave and Innovation of Themes in Music Compositions

Authors: R. C. Tyagi

Abstract:

Quasi-periodicity of the frequency intervals observed in the Shruti-based Absolute Scale of Music has been used to graphically identify the anchor notes ‘Vadi’ and ‘Samvadi’, which are the nodal points for the expansion, elaboration and iteration of the emotional theme represented by the characteristic tonic arrangement in Raga compositions. This analysis leads to defining the tonic parameters of the octave, including the key-note frequency, the anchor notes of the tonic intervals, and the onset and range of quasi-periodicities as exponents of 2. Such uniformity in the representation of characteristic data would facilitate the computational analysis and synthesis of music compositions and also help develop noise suppression techniques. Criteria for the tuning of strings, for compatibility with the placement of frets on fingerboards, are discussed. Natural rhythmic cycles in music compositions are analytically shown to lie between 3 and 126 beats.
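
Purely as an illustration of expressing intervals in the octave as exponents of 2 (using a simple 12-fold equal division rather than the Shruti-based scale analyzed in the paper, and a hypothetical key-note frequency), the sketch below writes each note frequency as the key note times 2 raised to a fractional exponent.

```python
def note_frequency(keynote_hz, interval_index, divisions_per_octave=12):
    """Frequency of an interval expressed as keynote * 2**(k / divisions):
    every note in the octave is an exponent of 2 relative to the key note."""
    return keynote_hz * 2.0 ** (interval_index / divisions_per_octave)

# Hypothetical key note of 240 Hz; the octave (exponent 1) is exactly twice the key note.
for k in range(13):
    print(k, round(note_frequency(240.0, k), 2))
```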

Keywords: absolute scale, anchor notes, computational analysis, frets, innovation, noise suppression, quasi-periodicity, rhythmic cycle, tonic interval, Shruti

Procedia PDF Downloads 283
8035 Simulation of Photocatalytic Degradation of Rhodamine B in Annular Photocatalytic Reactor

Authors: Jatinder Kumar, Ajay Bansal

Abstract:

Simulation of a photocatalytic reactor helps in understanding the complex behavior of photocatalytic degradation. Simulation also aids the design and optimization of the photocatalytic reactor. The lack of simulation strategies is a huge hindrance to the commercialization of photocatalytic technology. With the increased performance of computational resources and the development of simulation software, computational fluid dynamics (CFD) is becoming an affordable engineering tool to simulate and optimize reactor designs. In the present paper, a CFD model for simulating the performance of an immobilized titanium dioxide-based annular photocatalytic reactor was developed. The computational model integrates hydrodynamics, species mass transport, and chemical reaction kinetics using the commercial CFD code Fluent 6.3.26. The CFD model was based on the intrinsic kinetic parameters determined experimentally in a perfectly mixed batch reactor. Rhodamine B, a complex organic compound, was selected as the test pollutant for photocatalytic degradation. It was observed that CFD could become a valuable tool to understand and improve photocatalytic systems.

Keywords: simulation, computational fluid dynamics (CFD), annular photocatalytic reactor, titanium dioxide

Procedia PDF Downloads 557
8034 An Overview of Heating and Cooling Techniques Used in Green Buildings

Authors: Umesh Kumar Soni, Suresh Kumar Soni, S. R. Awasthi

Abstract:

The biggest difficulties worldwide are climate change, the future availability of fossil fuels, and the economic feasibility of renewable energy. They force us to use renewable energy to a greater extent and to develop suitable hybrid renewable systems. Building heating and cooling consumes a significant amount of energy, which can be conserved by the use of proper heating and cooling techniques. This paper reviews and critically analyzes various active, passive and hybrid heating/cooling techniques used in green buildings.

Keywords: natural ventilation, energy conservation, hybrid ventilation techniques, climate change

Procedia PDF Downloads 575
8033 Computational Homogenization of Thin Walled Structures: On the Influence of the Global vs Local Applied Plane Stress Condition

Authors: M. Beusink, E. W. C. Coenen

Abstract:

The increased application of novel structural materials, such as high-grade asphalt, concrete and laminated composites, has sparked the need for a better understanding of the often complex, non-linear mechanical behavior of such materials. The effective macroscopic mechanical response is generally dependent on the applied load path. Moreover, it is also significantly influenced by the microstructure of the material, e.g. embedded fibers, voids and/or grain morphology. At present, multiscale techniques are widely adopted to assess micro-macro interactions in a numerically efficient way. Computational homogenization techniques have been successfully applied over a wide range of engineering cases, e.g. cases involving first-order and second-order continua, thin shells and cohesive zone models. Most of these homogenization methods rely on Representative Volume Elements (RVE), which model the relevant microstructural details in a confined volume. Imposed through kinematical constraints or boundary conditions, an RVE can be subjected to a microscopic load sequence. This provides the RVE's effective stress-strain response, which can serve as constitutive input for macroscale analyses. Simultaneously, such a study of an RVE gives insight into fine scale phenomena such as microstructural damage and its evolution. It has been reported by several authors that the type of boundary conditions applied to the RVE affects the resulting homogenized stress-strain response. As a consequence, dedicated boundary conditions have been proposed to appropriately deal with this concern. For the specific case of a planar assumption for the analyzed structure, e.g. plane strain, axisymmetric or plane stress, this assumption needs to be addressed consistently at all considered scales. Although in many multiscale studies a planar condition has been employed, the related impact on the multiscale solution has not been explicitly investigated. This work therefore focuses on the influence of the planar assumption for multiscale modeling. In particular, the plane stress case is highlighted by proposing three different implementation strategies which are compatible with a first-order computational homogenization framework. The first method consists of applying classical plane stress theory at the microscale, whereas with the second method a generalized plane stress condition is assumed at the RVE level. For the third method, the plane stress condition is applied at the macroscale by requiring that the resulting macroscopic out-of-plane forces are equal to zero. These strategies are assessed through a numerical study of a thin-walled structure and the resulting effective macroscale stress-strain response is compared. It is shown that there is a clear influence of the length scale at which the planar condition is applied.
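
For reference only (not taken from the paper), the classical plane stress assumption used in the first strategy replaces the 3D isotropic elasticity tensor by the familiar 3x3 plane stress stiffness matrix; a minimal sketch with hypothetical material parameters is given below.

```python
import numpy as np

def plane_stress_stiffness(youngs_modulus, poisson_ratio):
    """Classical isotropic plane stress stiffness matrix relating
    [s_xx, s_yy, s_xy] to [e_xx, e_yy, gamma_xy] (sigma_zz = 0 assumed)."""
    e, nu = youngs_modulus, poisson_ratio
    factor = e / (1.0 - nu**2)
    return factor * np.array([[1.0, nu, 0.0],
                              [nu, 1.0, 0.0],
                              [0.0, 0.0, (1.0 - nu) / 2.0]])

# Hypothetical material parameters (E in MPa), illustrative only.
print(plane_stress_stiffness(youngs_modulus=210e3, poisson_ratio=0.3))
```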

Keywords: first-order computational homogenization, planar analysis, multiscale, microstructures

Procedia PDF Downloads 205
8032 Artificial Intelligence for Generative Modelling

Authors: Shryas Bhurat, Aryan Vashistha, Sampreet Dinakar Nayak, Ayush Gupta

Abstract:

As technology advances towards high computational resources, there is a paradigm shift in the usage of these resources to optimize the design process. This paper discusses the use of ‘Generative Design using Artificial Intelligence’ to build better models that apply operations like selection, mutation, and crossover to generate results. The human mind thinks of the simplest approach while designing an object, but the artificial intelligence learns from the past and designs complex, optimized CAD models. Generative design takes the boundary conditions and iterates over multiple solutions to come up with a sturdy design with the most optimal parameters, saving huge amounts of time and resources. The new production techniques at our disposal allow us to use additive manufacturing, 3D printing, and other innovative manufacturing techniques to save resources and design artistically engineered CAD models. This paper also discusses the genetic algorithm and the non-domination technique for choosing the right results, using biomimicry that has evolved over millions of years. The computer uses parametric models to generate newer models in an iterative approach and uses cloud computing to store these iterative designs. The later part of the paper compares generative design with the topology optimization technology that has previously been used to generate CAD models. Finally, this paper shows the performance of the algorithms and how they help in designing resource-efficient models.
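
The abstract names selection, mutation, and crossover but gives no code; a minimal, generic genetic algorithm sketch of those three operations (maximizing a toy fitness function, not the authors' CAD workflow) is shown below.

```python
import random

def genetic_algorithm(fitness, n_genes=8, pop_size=30, generations=50,
                      mutation_rate=0.05, seed=1):
    """Toy GA: tournament selection, one-point crossover, bit-flip mutation."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n_genes)] for _ in range(pop_size)]
    for _ in range(generations):
        new_pop = []
        for _ in range(pop_size):
            # Selection: keep the fitter of two randomly chosen individuals, twice.
            a, b = rng.sample(pop, 2)
            p1 = a if fitness(a) >= fitness(b) else b
            c, d = rng.sample(pop, 2)
            p2 = c if fitness(c) >= fitness(d) else d
            # Crossover: one-point recombination of the two parents.
            cut = rng.randrange(1, n_genes)
            child = p1[:cut] + p2[cut:]
            # Mutation: flip each bit with a small probability.
            child = [1 - g if rng.random() < mutation_rate else g for g in child]
            new_pop.append(child)
        pop = new_pop
    return max(pop, key=fitness)

# Hypothetical fitness: maximize the number of ones in the bit string.
print(genetic_algorithm(fitness=sum))
```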

Keywords: genetic algorithm, biomimicry, generative modeling, non-domination techniques

Procedia PDF Downloads 122
8031 Developing New Algorithm and Its Application on Optimal Control of Pumps in Water Distribution Network

Authors: R. Rajabpour, N. Talebbeydokhti, M. H. Ahmadi

Abstract:

In recent years, new techniques for solving complex problems in engineering have been proposed. One of these techniques is the JPSO algorithm. With innovative changes in the nature of the jump operation in JPSO, it is possible to construct a graph-based solution with a new algorithm called G-JPSO. In this paper, a new algorithm was evaluated for solving the Fletcher-Powell optimal control problem and the optimal control of pumps in a water distribution network. Optimal control of the pumps comprises the optimum operating timetable (on/off status) for each of the pumps over the desired time interval. The maximum number of on/off switches for each pump was imposed on the objective function as another constraint. To determine the optimal operation of the pumps, a model-based optimization-simulation algorithm was developed based on the G-JPSO and JPSO algorithms. The results of the proposed algorithm compared well with those of the ant colony, genetic and JPSO algorithms. This shows the robustness of the proposed algorithm in finding near-optimum solutions at a reasonable computational cost.
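
No code is given in the abstract; for orientation only, the sketch below shows a standard particle swarm optimization loop on a simple continuous test function. The jump-based JPSO and graph-based G-JPSO variants evaluated in the paper modify this basic scheme, and the objective here is a hypothetical sphere function, not the pump scheduling model.

```python
import random

def pso(objective, dim=2, n_particles=20, iterations=100, seed=3):
    """Minimal standard PSO: velocity update with inertia, cognitive and social terms."""
    rng = random.Random(seed)
    w, c1, c2 = 0.7, 1.5, 1.5
    x = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(n_particles)]
    v = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in x]
    gbest = min(pbest, key=objective)
    for _ in range(iterations):
        for i in range(n_particles):
            for d in range(dim):
                v[i][d] = (w * v[i][d]
                           + c1 * rng.random() * (pbest[i][d] - x[i][d])
                           + c2 * rng.random() * (gbest[d] - x[i][d]))
                x[i][d] += v[i][d]
            if objective(x[i]) < objective(pbest[i]):
                pbest[i] = x[i][:]
        gbest = min(pbest, key=objective)
    return gbest

# Hypothetical objective: a simple sphere function minimized at the origin.
print(pso(lambda p: sum(c * c for c in p)))
```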

Keywords: G-JPSO, operation, optimization, pumping station, water distribution networks

Procedia PDF Downloads 375
8030 Prediction of Compressive Strength Using Artificial Neural Network

Authors: Vijay Pal Singh, Yogesh Chandra Kotiyal

Abstract:

Structures are a combination of various load-carrying members which transfer the loads from the superstructure to the foundation safely. At the design stage, the loading of the structure is defined and appropriate material choices are made based upon their properties, mainly related to strength. The strength of materials keeps reducing with time because of many factors, like environmental exposure and deformation caused by unpredictable external loads. Hence, to predict the strength of materials used in structures, various techniques are used. Among these techniques, Non-Destructive Techniques (NDT) are the ones that can be used to predict the strength without damaging the structure. In the present study, the compressive strength of concrete has been predicted using an Artificial Neural Network (ANN). The predicted strength was compared with the experimentally obtained actual compressive strength of concrete, and equations were developed for different models. A good correlation has been obtained between the strength predicted by these models and the experimental values. Further, a correlation has been developed using two NDT techniques for the prediction of strength by regression analysis. It was found that the percentage error in the predicted strength is reduced by using the combined techniques in place of single techniques.
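
As a hedged illustration of the general approach (not the authors' network architecture or dataset), the sketch below trains a small feed-forward ANN in scikit-learn to map two NDT measurements, rebound number and ultrasonic pulse velocity, to compressive strength; the data are synthetic.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Synthetic stand-in data: [rebound number, pulse velocity (km/s)] -> strength (MPa).
rng = np.random.default_rng(0)
X = np.column_stack([rng.uniform(20, 50, 200), rng.uniform(3.5, 5.0, 200)])
y = 0.9 * X[:, 0] + 8.0 * X[:, 1] - 25.0 + rng.normal(0, 2.0, 200)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
ann = make_pipeline(StandardScaler(),
                    MLPRegressor(hidden_layer_sizes=(10, 10), max_iter=2000, random_state=0))
ann.fit(X_train, y_train)
print("R^2 on held-out data:", round(ann.score(X_test, y_test), 3))
```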

Keywords: rebound, ultrasonic pulse, penetration, ANN, NDT, regression

Procedia PDF Downloads 400
8029 Evaluating Machine Learning Techniques for Activity Classification in Smart Home Environments

Authors: Talal Alshammari, Nasser Alshammari, Mohamed Sedky, Chris Howard

Abstract:

With the widespread adoption of Internet-connected devices and the prevalence of Internet of Things (IoT) applications, there is an increased interest in machine learning techniques that can provide useful and interesting services in the smart home domain. The areas that machine learning techniques can help advance are varied and ever-evolving. Classifying smart home inhabitants’ Activities of Daily Living (ADLs) is one prominent example. The ability of a machine learning technique to find meaningful spatio-temporal relations in high-dimensional data is an important requirement as well. This paper presents a comparative evaluation of state-of-the-art machine learning techniques for classifying ADLs in the smart home domain. Forty-two synthetic datasets and two real-world datasets with multiple inhabitants are used to evaluate and compare the performance of the identified machine learning techniques. Our results show significant performance differences between the evaluated techniques, such as AdaBoost, the Cortical Learning Algorithm (CLA), decision trees, the Hidden Markov Model (HMM), the Multi-layer Perceptron (MLP), the Structured Perceptron and Support Vector Machines (SVM). Overall, neural network based techniques have shown superiority over the other tested techniques.
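
As a minimal, hedged sketch of this kind of comparative evaluation (using a synthetic dataset and only three of the listed classifiers as available in scikit-learn, not the paper's smart home datasets), the snippet below cross-validates several models on the same data.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import cross_val_score
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Synthetic stand-in for sensor-derived activity features (not the paper's datasets).
X, y = make_classification(n_samples=500, n_features=20, n_classes=4,
                           n_informative=8, random_state=0)

models = {
    "AdaBoost": AdaBoostClassifier(random_state=0),
    "MLP": make_pipeline(StandardScaler(), MLPClassifier(max_iter=2000, random_state=0)),
    "SVM": make_pipeline(StandardScaler(), SVC()),
}
for name, model in models.items():
    scores = cross_val_score(model, X, y, cv=5)
    print(f"{name}: mean accuracy {scores.mean():.3f}")
```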

Keywords: activities of daily living, classification, internet of things, machine learning, prediction, smart home

Procedia PDF Downloads 317
8028 All-or-None Principle and Weakness of Hodgkin-Huxley Mathematical Model

Authors: S. A. Sadegh Zadeh, C. Kambhampati

Abstract:

Mathematical and computational modelling are necessary tools for reviewing, analysing, and predicting processes and events across the wide spectrum of scientific fields. Therefore, in a field as rapidly developing as neuroscience, the combination of these two kinds of modelling can play a significant role in helping to guide the direction the field takes. This paper combines mathematical and computational modelling to demonstrate a weakness in a highly valued model in neuroscience. It is intended to analyse the all-or-none principle in the Hodgkin-Huxley mathematical model. By implementing a computational version of the Hodgkin-Huxley model and applying the concept of the all-or-none principle, an investigation of this mathematical model has been performed. The results clearly show that the Hodgkin-Huxley mathematical model does not observe this fundamental law of neurophysiology when generating action potentials. This study shows that further mathematical studies of the Hodgkin-Huxley model are needed in order to create a model without this weakness.
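
For readers unfamiliar with the model, a compact, standard forward-Euler implementation of the Hodgkin-Huxley equations is sketched below (classical squid-axon parameters with the membrane potential measured from rest). This is a generic implementation, not the authors' code or their analysis of the all-or-none principle.

```python
import numpy as np

def hodgkin_huxley(i_ext=10.0, t_max=50.0, dt=0.01):
    """Forward-Euler integration of the classical Hodgkin-Huxley equations."""
    c_m, g_na, g_k, g_l = 1.0, 120.0, 36.0, 0.3      # uF/cm^2, mS/cm^2
    e_na, e_k, e_l = 115.0, -12.0, 10.613            # mV (relative to rest)
    v, m, h, n = 0.0, 0.05, 0.6, 0.32                # initial conditions
    trace = []
    for _ in range(int(t_max / dt)):
        a_m = 0.1 * (25 - v) / (np.exp((25 - v) / 10) - 1)
        b_m = 4.0 * np.exp(-v / 18)
        a_h = 0.07 * np.exp(-v / 20)
        b_h = 1.0 / (np.exp((30 - v) / 10) + 1)
        a_n = 0.01 * (10 - v) / (np.exp((10 - v) / 10) - 1)
        b_n = 0.125 * np.exp(-v / 80)
        i_ion = (g_na * m**3 * h * (v - e_na) + g_k * n**4 * (v - e_k)
                 + g_l * (v - e_l))
        v += dt * (i_ext - i_ion) / c_m
        m += dt * (a_m * (1 - m) - b_m * m)
        h += dt * (a_h * (1 - h) - b_h * h)
        n += dt * (a_n * (1 - n) - b_n * n)
        trace.append(v)
    return trace

# A supra-threshold stimulus of 10 uA/cm^2 produces repetitive action potentials.
print(max(hodgkin_huxley()))  # peak depolarization in mV
```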

Keywords: all-or-none, computational modelling, mathematical model, transmembrane voltage, action potential

Procedia PDF Downloads 587
8027 Information Visualization Methods Applied to Nanostructured Biosensors

Authors: Osvaldo N. Oliveira Jr.

Abstract:

The control of molecular architecture inherent in some experimental methods to produce nanostructured films has had a great impact on devices of various types, including sensors and biosensors. The self-assembled monolayer (SAM) and electrostatic layer-by-layer (LbL) techniques, for example, are now routinely used to produce tailored architectures for biosensing, where biomolecules are immobilized with long-lasting preserved activity. Enzymes, antigens, antibodies, peptides and many other molecules serve as the molecular recognition elements for detecting an equally wide variety of analytes. The principles of detection are also varied, including electrochemical methods, fluorescence spectroscopy and impedance spectroscopy. In this presentation, an overview will be provided of biosensors made with nanostructured films to detect antibodies associated with tropical diseases and HIV, in addition to the detection of analytes of medical interest such as cholesterol and triglycerides. Because large amounts of data are generated in the biosensing experiments, use has been made of computational and statistical methods to optimize performance. Multidimensional projection techniques such as Sammon's mapping have been shown to be more efficient than traditional multivariate statistical analysis in identifying small concentrations of anti-HIV antibodies and in distinguishing between blood serum samples of animals infected with two tropical diseases, namely Chagas disease and leishmaniasis. Optimization of biosensing may include a combination of another information visualization method, the Parallel Coordinate technique, with artificial intelligence methods in order to identify the most suitable frequencies for reaching higher sensitivity using impedance spectroscopy. Also discussed will be the possible convergence of technologies, through which machine learning and other computational methods may be used to treat data from biosensors within an expert system for clinical diagnosis.

Keywords: clinical diagnosis, information visualization, nanostructured films, layer-by-layer technique

Procedia PDF Downloads 306
8026 General Purpose Graphic Processing Units Based Real Time Video Tracking System

Authors: Mallikarjuna Rao Gundavarapu, Ch. Mallikarjuna Rao, K. Anuradha Bai

Abstract:

Real-time video tracking is a challenging task for computing professionals. The performance of video tracking techniques is greatly affected by the background detection and elimination process. Local regions of the image frame contain vital information about the background and foreground. However, pixel-level processing of local regions consumes a large amount of computational time and memory space with traditional approaches. In our approach, we have explored the concurrent computational ability of General Purpose Graphic Processing Units (GPGPU) to address this problem. The Gaussian Mixture Model (GMM) with adaptive weighted kernels is used for detecting the background. The weights of the kernels are influenced by local regions and are updated by the inter-frame variations of these corresponding regions. The proposed system has been tested with GPU devices such as the GeForce GTX 280 and Quadro K2000. The results are encouraging, with a maximum speed-up of 10X compared to the sequential approach.
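
A hedged illustration of GMM-based background subtraction using OpenCV's stock CPU implementation (MOG2) is given below; the paper's adaptive weighted-kernel GMM and its GPU implementation differ from this routine, and the video path is hypothetical.

```python
import cv2

# MOG2 is OpenCV's Gaussian Mixture Model background subtractor; it adapts the
# per-pixel mixture weights from inter-frame variations.
subtractor = cv2.createBackgroundSubtractorMOG2(history=500, varThreshold=16,
                                                detectShadows=True)

cap = cv2.VideoCapture("input_video.avi")  # hypothetical input file
while True:
    ok, frame = cap.read()
    if not ok:
        break
    foreground_mask = subtractor.apply(frame)        # 0 = background, 255 = foreground
    # Connected components give candidate moving regions for the tracker.
    n_labels, labels = cv2.connectedComponents(foreground_mask)
    print("foreground blobs:", n_labels - 1)
cap.release()
```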

Keywords: connected components, embrace threads, local weighted kernel, structuring elements

Procedia PDF Downloads 412
8025 Development of Computational Approach for Calculation of Hydrogen Solubility in Hydrocarbons for Treatment of Petroleum

Authors: Abdulrahman Sumayli, Saad M. AlShahrani

Abstract:

For the hydrogenation process, knowing the solubility of hydrogen (H2) in hydrocarbons is critical for improving the efficiency of the process. We investigated the computation of H2 solubility in four heavy crude oil feedstocks using machine learning techniques. Temperature, pressure, and feedstock type were considered as the inputs to the models, while the hydrogen solubility was the sole response. Specifically, we employed three different models: Support Vector Regression (SVR), Gaussian Process Regression (GPR), and Bayesian Ridge Regression (BRR). To achieve the best performance, the hyper-parameters of these models were optimized using the Whale Optimization Algorithm (WOA). We evaluated the models using a dataset of solubility measurements in various feedstocks, and we compared their performance based on several metrics. Our results show that the WOA-tuned SVR model achieves the best performance overall, with an RMSE of 1.38 × 10⁻² and an R-squared of 0.991. These findings suggest that machine learning techniques can provide accurate predictions of hydrogen solubility in different feedstocks, which could be useful in the development of hydrogen-related technologies. In addition, the solubility of hydrogen in the four heavy oil fractions is estimated over temperature and pressure ranges of 150 °C–350 °C and 1.2 MPa–10.8 MPa, respectively.

Keywords: temperature, pressure variations, machine learning, oil treatment

Procedia PDF Downloads 44
8024 CFD Study on the Effect of Primary Air on Combustion of Simulated MSW Process in the Fixed Bed

Authors: Rui Sun, Tamer M. Ismail, Xiaohan Ren, M. Abd El-Salam

Abstract:

Incineration of municipal solid waste (MSW) is one of the key areas in the global clean energy strategy. A computational fluid dynamics (CFD) model was established in order to reveal the features of the combustion process in a fixed porous bed of MSW. Transport equations and process rate equations of the waste bed were modeled and set up to describe the incineration process, according to the local thermal conditions and the waste property characteristics. Gas-phase turbulence was modeled using the k-ε turbulence model, and the particle phase was modeled using the kinetic theory of granular flow. The heterogeneous reaction rates were determined using the Arrhenius-eddy dissipation and the Arrhenius-diffusion reaction rates. The effects of the primary air flow rate and temperature on the burning process of simulated MSW are investigated experimentally and numerically. The simulation results for the bed agree well with the experimental data. The model provides detailed information on the burning processes in the fixed bed, which is otherwise very difficult to obtain by conventional experimental techniques.
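
As a reference sketch only (all coefficients are hypothetical, not the paper's kinetics), the snippet below shows the Arrhenius rate expression and the common practice of limiting it by a mixing-controlled eddy dissipation rate, taking the smaller of the two as the effective reaction rate.

```python
import math

R_GAS = 8.314  # J/(mol K)

def arrhenius_rate(pre_exponential, activation_energy, temperature):
    """Kinetic (Arrhenius) rate constant: k = A * exp(-Ea / (R T))."""
    return pre_exponential * math.exp(-activation_energy / (R_GAS * temperature))

def effective_rate(kinetic_rate, eddy_dissipation_rate):
    """Finite-rate / eddy dissipation closure: the slower mechanism controls."""
    return min(kinetic_rate, eddy_dissipation_rate)

# Hypothetical values for a heterogeneous reaction step at two bed temperatures (K).
for t in (900.0, 1200.0):
    k = arrhenius_rate(pre_exponential=1.0e7, activation_energy=1.3e5, temperature=t)
    print(t, effective_rate(k, eddy_dissipation_rate=5.0))
```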

Keywords: computational fluid dynamics (CFD) model, waste incineration, municipal solid waste (MSW), fixed bed, primary air

Procedia PDF Downloads 381
8023 Aerodynamic Heating Analysis of Hypersonic Flow over Blunt-Nosed Bodies Using Computational Fluid Dynamics

Authors: Aakash Chhunchha, Assma Begum

Abstract:

The qualitative aspects of hypersonic flow over a range of blunt bodies have been extensively analyzed in the past. It is well known that the curvature of a body’s geometry in the sonic region predominantly dictates the bow shock shape and its standoff distance from the body, while the surface pressure distribution depends on both the sonic region and the local body shape. The present study is an extension that analyzes the hypersonic flow characteristics over several blunt-nosed bodies using modern Computational Fluid Dynamics (CFD) tools to determine the shock shape and its effect on the heat flux around the body. Four blunt-nosed models with cylindrical afterbodies were analyzed for a flow at a Mach number of 10, corresponding to standard atmospheric conditions at an altitude of 50 km. The nose radii of curvature of the models range from a hemispherical nose to a flat nose. The appropriate numerical models and the supplementary convergence techniques that were implemented for the CFD analysis are thoroughly described. The flow contours are presented, highlighting the key characteristics of shock wave shape, shock standoff distance and the sonic point shift on the shock. The variation of heat flux due to the different shock detachments of the various models is comprehensively discussed. It is observed that the blunter the nose radius, the farther the shock stands from the body and, consequently, the lower the surface heating at the nose. The results obtained from the CFD analyses are compared with approximate theoretical engineering correlations. Overall, a satisfactory agreement is observed between the two.
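
One widely used engineering correlation for this regime (not necessarily the one used by the authors) is the Sutton-Graves stagnation-point heating estimate, q ≈ k·sqrt(ρ/Rn)·V³, which makes the inverse dependence of nose heating on nose radius explicit; a short sketch with hypothetical freestream values follows.

```python
import math

def sutton_graves_heat_flux(density, velocity, nose_radius, k=1.7415e-4):
    """Sutton-Graves stagnation-point convective heating estimate (SI units, W/m^2):
    q = k * sqrt(rho / R_n) * V^3. Heating drops as the nose radius grows."""
    return k * math.sqrt(density / nose_radius) * velocity**3

# Hypothetical freestream roughly corresponding to Mach 10 at high altitude.
rho, v = 1.0e-3, 3300.0          # kg/m^3, m/s (illustrative only)
for r_n in (0.05, 0.10, 0.20):   # nose radii in metres (illustrative only)
    print(r_n, round(sutton_graves_heat_flux(rho, v, r_n) / 1e4, 1), "W/cm^2")
```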

Keywords: aero-thermodynamics, blunt-nosed bodies, computational fluid dynamics (CFD), hypersonic flow

Procedia PDF Downloads 120
8022 Data Centers’ Temperature Profile Simulation Optimized by Finite Elements and Discretization Methods

Authors: José Alberto García Fernández, Zhimin Du, Xinqiao Jin

Abstract:

Nowadays, the data center industry faces strong challenges in increasing its speed and data processing capacity while at the same time trying to keep its devices at a suitable working temperature without penalizing that capacity. Consequently, the cooling systems of this kind of facility use a large amount of energy to dissipate the heat generated inside the servers, and developing new cooling techniques or perfecting those already existing would be a great advance in this type of industry. The installation of a temperature sensor matrix distributed in the structure of each server would provide the information needed to obtain a temperature profile inside them instantly. However, the number of temperature probes required to obtain the temperature profiles with sufficient accuracy is very high, and such an installation is expensive. Therefore, other less intrusive techniques are employed, in which each point that characterizes the server temperature profile is obtained by solving differential equations through simulation methods, simplifying data collection but increasing the time needed to obtain results. In order to reduce these calculation times, complicated and slow computational fluid dynamics simulations are replaced by simpler and faster finite element method simulations, which solve the Burgers' equations by backward, forward and central discretization techniques after simplifying the energy and enthalpy conservation differential equations. The discretization methods employed for solving the first and second order derivatives of the Burgers' equation obtained after these simplifications are the key to obtaining results with greater or lesser accuracy, regardless of the characteristic truncation error.
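
For orientation, the sketch below discretizes the one-dimensional viscous Burgers' equation, u_t + u·u_x = ν·u_xx, with a forward difference in time, a backward (upwind) difference for the first derivative and a central difference for the second derivative. It is a generic finite-difference scheme with a hypothetical initial profile, not the authors' FEM model of a server.

```python
import numpy as np

def burgers_1d(nx=101, nt=500, length=2.0, nu=0.07):
    """1D viscous Burgers' equation u_t + u*u_x = nu*u_xx:
    forward in time, backward (upwind) first derivative, central second derivative."""
    dx = length / (nx - 1)
    dt = 0.2 * dx**2 / nu                      # small step for explicit stability
    x = np.linspace(0.0, length, nx)
    u = np.where(x < length / 2, 2.0, 1.0)     # hypothetical step initial profile
    for _ in range(nt):
        un = u.copy()
        u[1:-1] = (un[1:-1]
                   - un[1:-1] * dt / dx * (un[1:-1] - un[:-2])             # backward difference
                   + nu * dt / dx**2 * (un[2:] - 2 * un[1:-1] + un[:-2]))  # central difference
    return x, u

x, u = burgers_1d()
print(round(u.max(), 3), round(u.min(), 3))
```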

Keywords: Burgers' equations, CFD simulation, data center, discretization methods, FEM simulation, temperature profile

Procedia PDF Downloads 134
8021 Bio-Guided Isolation of New Active Alkaloids from Alstonia brassii: Toxicity, Antitumour Activity in Silico and Molecular Modeling

Authors: Mesbah Khaled, Bouraoui Ouissal, Benkiniouar Rachid, Belkhiri Lotfi

Abstract:

Alstonia are tropical plants with a wide geographical distribution; the genus has been divided into different sections by different authors based on previous studies of several of its species. Monachino divides Alstonia into 5 sections, while Pichon divides it into 3 sections. Several plants belonging to this genus, such as Alstonia brassii, have been used in traditional folk medicine to treat ailments such as fever, malaria and dysentery. Previous studies focusing on the chemical composition of these plants have successfully identified indole alkaloids with cytotoxic, anti-diabetic and anti-inflammatory properties. The newly discovered monomers are structurally similar to the picraline, affinisine and macroline backbones. On the other hand, all recently isolated dimeric compounds have a macroline moiety. In this study, a computational analysis was performed on a series of novel molecules, including both monomeric and dimeric compounds with different structural frameworks. This investigation represents the first computational study of these molecules using an in silico approach incorporating 2D-QSAR data. The analysis involved various computational techniques, including 2D-QSAR modelling, molecular docking studies and subsequent validation by molecular dynamics simulation and assessment of ADMET properties. The chemical composition was identified by 1D and 2D NMR. Eight new alkaloids were isolated: 5 monomers and 3 dimers. In this work, we focus on the biological activity of 4 new alkaloids belonging to two different skeletons, including the affinisine skeleton.

Keywords: affinisine, talcarpine, macroline, cytotoxicity, alkaloids

Procedia PDF Downloads 216
8020 A Comparative Study between Different Techniques of Off-Page and On-Page Search Engine Optimization

Authors: Ahmed Ishtiaq, Maeeda Khalid, Umair Sajjad

Abstract:

In the fast-moving world, information is the key to success, and easily available information makes work easy. The Internet is the biggest collection and source of information nowadays; with every single day, the data on the Internet increases, and it becomes difficult to find the required data. Everyone wants his/her website to appear at the top of the search results. This is possible when SEO techniques have been applied inside or outside the application, corresponding to the two types of SEO: on-page and off-page SEO. SEO is an abbreviation of Search Engine Optimization, a set of techniques and methods used to increase the users of a website on the World Wide Web or to improve its ranking in search engine indexing. In this paper, we have compared different techniques of on-page and off-page SEO, suggested many things that should be changed inside and outside the web page, and mentioned some of the most powerful elements and techniques that search engines consider in both types of SEO in order to gain a high ranking on a search engine.

Keywords: auto-suggestion, search engine optimization, SEO, query, web mining, web crawler

Procedia PDF Downloads 120
8019 A Comparative Study of Virus Detection Techniques

Authors: Sulaiman Al amro, Ali Alkhalifah

Abstract:

The growing number of computer viruses and the detection of zero-day malware have been a concern for security researchers for a long period of time. Existing antivirus products (AVs) rely on detecting virus signatures, which do not provide a full solution to the problems associated with these viruses. The use of logic formulae to model the behaviour of viruses is one of the most encouraging recent developments in virus research, which provides alternatives to classic virus detection methods. In this paper, we propose a comparative study of different virus detection techniques. The paper presents the advantages and drawbacks of the different detection techniques and discusses which technique is more effective at detecting computer viruses.

Keywords: computer viruses, virus detection, signature-based, behaviour-based, heuristic-based

Procedia PDF Downloads 446
8018 Performativity and Valuation Techniques: Evidence from Investment Banks in the Wake of the Global Financial Crisis

Authors: Alicja Reuben, Amira Annabi

Abstract:

In this paper, we explore the relationship between the selection of valuation techniques by investment banks and the banks’ risk perceptions and performance in the context of the theory of performativity. We use inferential statistics to study these relationships by building a unique dataset based on the disclosure of 12 investment banks’ 2012-2015 annual financial statements. Moreover, we create two constructs, namely intensity of use and risk perception. We measure the intensity of use as a frequency metric of how often a particular bank adopts valuation techniques for a particular asset or liability. We measure risk perception based on disclosed ranges of values for unobservable inputs. Our results are twofold: we find a significant negative correlation between (1) intensity of use and investment bank performance and (2) intensity of use and risk perception. These results indicate that a performative process takes place, and the valuation techniques are enacting their environment.

Keywords: language, linguistics, performativity, financial techniques

Procedia PDF Downloads 135
8017 Using the Cluster Computing to Improve the Computational Speed of the Modular Exponentiation in RSA Cryptography System

Authors: Te-Jen Chang, Ping-Sheng Huang, Shan-Ten Cheng, Chih-Lin Lin, I-Hui Pan, Tsung- Hsien Lin

Abstract:

The RSA system is a great contribution to encryption and decryption. It is based on modular exponentiation, which we may describe as “calculation with large numbers”. Operating on such large numbers is a very heavy burden for the CPU. To increase the computational speed, in addition to improving the algorithms themselves, such as the binary method, the sliding window method, the addition chain method, and so on, a cluster computer can be used. The cluster system is composed of the laboratory computers on which MPICH2 is installed. The parallel procedures of the modular exponentiation can be processed by combining the sliding window method with the addition chain method. This significantly reduces the computational time of modular exponentiations whose operands are longer than 512 bits and even longer than 1024 bits.
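
As a minimal single-node sketch of the sliding window method mentioned in the abstract (the window width and operands are hypothetical, and the parallel MPICH2 distribution is not shown), the following Python function computes b^e mod m by precomputing odd powers of the base and scanning the exponent bits in variable-length windows.

```python
def sliding_window_pow(base, exponent, modulus, window=4):
    """Sliding-window modular exponentiation: precompute odd powers of the base,
    then scan the exponent bits from the most significant end in windows
    of at most `window` bits that end in a 1 bit."""
    if exponent == 0:
        return 1 % modulus
    base %= modulus
    b2 = base * base % modulus
    odd_powers = {1: base}
    for k in range(3, 1 << window, 2):
        odd_powers[k] = odd_powers[k - 2] * b2 % modulus
    bits = bin(exponent)[2:]
    result, i = 1, 0
    while i < len(bits):
        if bits[i] == '0':
            result = result * result % modulus
            i += 1
        else:
            # Take the longest window (<= `window` bits) ending in a 1 bit.
            j = min(i + window, len(bits))
            while bits[j - 1] == '0':
                j -= 1
            value = int(bits[i:j], 2)
            for _ in range(j - i):
                result = result * result % modulus
            result = result * odd_powers[value] % modulus
            i = j
    return result

# Check against Python's built-in three-argument pow on a 512-bit-scale exponent.
e = (1 << 512) + 12345
print(sliding_window_pow(7, e, 10**20 + 39) == pow(7, e, 10**20 + 39))
```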

Keywords: cluster system, modular exponentiation, sliding window, addition chain

Procedia PDF Downloads 499
8016 Contextual Distribution for Textual Alignment

Authors: Yuri Bizzoni, Marianne Reboul

Abstract:

Our program compares French and Italian translations of Homer’s Odyssey, from the XVIth to the XXth century. We focus on the third point, showing how distributional semantics systems can be used both to improve alignment between different French translations and to align the Greek text with a French translation. Although we focus on French examples, the techniques we present are completely language-independent.
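
As a hedged, much simplified illustration of the alignment idea (using bag-of-words vectors in place of distributional embeddings, and invented toy sentences rather than the authors' corpus), the sketch below aligns each sentence of one translation to the most cosine-similar sentence of another.

```python
import math
from collections import Counter

def cosine(a, b):
    """Cosine similarity between two sparse bag-of-words vectors."""
    dot = sum(a[w] * b[w] for w in a if w in b)
    norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def align(sentences_a, sentences_b):
    """Align each sentence of translation A to the most similar sentence of translation B."""
    vecs_a = [Counter(s.lower().split()) for s in sentences_a]
    vecs_b = [Counter(s.lower().split()) for s in sentences_b]
    return [max(range(len(vecs_b)), key=lambda j: cosine(va, vecs_b[j])) for va in vecs_a]

# Invented toy example: two short "translations" with the sentence order swapped.
a = ["ulysse parle aux dieux", "le navire quitte le rivage"]
b = ["le bateau quitte le rivage", "ulysse parle aux dieux immortels"]
print(align(a, b))  # expected: [1, 0]
```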

Keywords: classical receptions, computational linguistics, distributional semantics, Homeric poems, machine translation, translation studies, text alignment

Procedia PDF Downloads 410
8015 A Comparative Study of Particle Image Velocimetry (PIV) and Particle Tracking Velocimetry (PTV) for Airflow Measurement

Authors: Sijie Fu, Pascal-Henry Biwolé, Christian Mathis

Abstract:

Among modern airflow measurement methods, Particle Image Velocimetry (PIV) and Particle Tracking Velocimetry (PTV), as visual and non-intrusive measurement techniques, are playing an increasingly important role. This paper conducts a comparative experimental study of airflow measurement employing both techniques under the same conditions. Velocity vector fields, velocity contour fields, vorticity profiles and turbulence profiles are selected as the comparison indexes. The results show that the performance of both the PIV and PTV techniques for airflow measurement is satisfactory, but some differences between the two techniques exist, which suggests that the selection of a measurement technique should be based on comprehensive consideration.

Keywords: airflow measurement, comparison, PIV, PTV

Procedia PDF Downloads 393
8014 Effect of Self-Compassion Techniques for Individuals with Depression: A Pilot Study

Authors: Piyanud Chompookard

Abstract:

This research aims to study the effect of self-compassion techniques for individuals with depression (a pilot study). A quasi-experimental pretest-posttest design is used in this work. The research includes 30 participants, divided into the experimental group (ten participants) and the control group (twenty participants). The experimental group received self-compassion techniques together with an appropriate treatment a total of six times. The control group received an appropriate treatment only. Measurement in this study used the Hamilton Rating Scale for Depression (Thai version). There were significant differences in levels of depression after receiving the self-compassion techniques with an appropriate treatment (p<.01), and there were significant differences in levels of depression between the experimental group and the control group (p<.01).

Keywords: depression, self-compassion techniques, psychotherapy, pilot study

Procedia PDF Downloads 112
8013 Overview of Time, Resource and Cost Planning Techniques in Construction Management Research

Authors: R. Gupta, P. Jain, S. Das

Abstract:

One way to approach the construction scheduling optimization problem is to focus on the individual aspects of planning, which can be broadly classified as time scheduling, crew and resource management, and cost control. During the last four decades, construction planning has seen a lot of research, but to date, no paper has attempted to summarize the available literature under these important heads. This paper addresses each of these aspects separately and presents the findings of an in-depth literature review of the various planning techniques. For techniques dealing with time scheduling, the authors have adopted a rough chronological documentation. For crew and resource management, the classification has been done on the basis of the different steps involved in the resource planning process. For cost control, techniques dealing with both the estimation of costs and the subsequent optimization of costs have been dealt with separately.

Keywords: construction planning techniques, time scheduling, resource planning, cost control

Procedia PDF Downloads 457