Search results for: physical function
9016 An Evolutionary Approach for Automated Optimization and Design of Vivaldi Antennas
Authors: Sahithi Yarlagadda
Abstract:
The design of an antenna is constrained by mathematical and geometrical parameters. Although there are diverse antenna structures with a wide range of feeds, many geometries remain to be tried that cannot be fitted into predefined computational methods. Antenna design and optimization are well suited to an evolutionary algorithmic approach, since the antenna parameter weights depend directly on geometric characteristics. An evolutionary algorithm can be stated simply for a given quality function to be maximized: a set of candidate solutions (elements of the function's domain) is created at random, and the quality function is applied as an abstract fitness measure. Based on this fitness, some of the better candidates are chosen to seed the next generation by applying recombination and mutation to them. In the conventional approach, the quality function is unaltered across iterations, but the antenna parameters and geometries are too wide to fit into a single function. So, weight coefficients are obtained for all possible antenna electrical parameters and geometries, and their variation is learnt by mining the data obtained for an optimized algorithm. The weight and covariant coefficients of the corresponding parameters are logged as datasets for learning and future use. This paper drafts an approach to gather the requirements to study and methodize the evolutionary approach to automated antenna design, using our past work on the Vivaldi antenna as the test candidate. Antenna parameters such as gain and directivity are directly governed by geometries, materials, and dimensions. The design equations are noted and evaluated for all possible conditions to obtain maxima and minima for a given frequency band; the boundary conditions are thus obtained prior to implementation, easing the optimization. The implementation is mainly aimed at studying the practical computational, processing, and design complexities incurred during simulation. HFSS is chosen for simulations and results.
MATLAB is used to generate the computations and combinations and to log data. MATLAB is also used to apply machine learning algorithms and to plot the data used to design the algorithm. The number of combinations is too large to be tested manually, so the HFSS API is used to call HFSS functions from MATLAB itself, and the MATLAB Parallel Computing Toolbox is used to run multiple simulations in parallel. The aim is to develop an add-in to antenna design software such as HFSS or CST, or a standalone application, to optimize pre-identified common parameters of the wide range of antennas available. In this paper, we have used MATLAB to calculate Vivaldi antenna parameters such as slot line characteristic impedance, stripline impedance, slot line width, flare aperture size, and dielectric constant; K-means and a Hamming window are applied to obtain the best test parameters. The HFSS API is used to calculate the radiation, bandwidth, directivity, and efficiency, and the data is logged for applying the evolutionary genetic algorithm in MATLAB. The paper demonstrates the computational weights and the machine learning approach to automated antenna optimization for the Vivaldi antenna.
Keywords: machine learning, Vivaldi, evolutionary algorithm, genetic algorithm
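The generate-evaluate-select-vary loop the abstract describes can be sketched generically. Everything below is an illustrative placeholder, not the paper's implementation: the quality function stands in for a simulated antenna metric, and the bounds, selection, and operator choices are textbook defaults.

```python
import random

def evolve(fitness, bounds, pop_size=30, generations=50,
           mutation_rate=0.1, seed=42):
    """Minimal generational EA: rank by fitness, keep the better half,
    recombine pairs of parents, and mutate within the given bounds."""
    rng = random.Random(seed)
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)           # rank by fitness
        parents = pop[:pop_size // 2]                 # truncation selection
        children = []
        while len(parents) + len(children) < pop_size:
            a, b = rng.sample(parents, 2)
            child = [(x + y) / 2 for x, y in zip(a, b)]   # arithmetic recombination
            for i, (lo, hi) in enumerate(bounds):         # bounded mutation
                if rng.random() < mutation_rate:
                    child[i] = rng.uniform(lo, hi)
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

# Toy quality function standing in for a simulated antenna metric,
# maximized at (1, 1, 1):
best = evolve(lambda v: -sum((x - 1.0) ** 2 for x in v),
              bounds=[(0.0, 2.0)] * 3)
```

In the setup the abstract sketches, `fitness` would wrap an HFSS simulation call and the bounds would come from the pre-computed design-equation maxima and minima.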
Procedia PDF Downloads 110
9015 Role of Zinc Administration in Improvement of Faltering Growth in Egyptian Children at Risk of Environmental Enteric Dysfunction
Authors: Ghada Mahmoud El Kassas, Maged Atta El Wakeel
Abstract:
Background: Environmental enteric dysfunction (EED) is an impending problem that has flared up in recent decades to become pervasive in infants and children. EED is an asymptomatic villous atrophy of the small bowel that is prevalent in the developing world and is associated with altered intestinal function and integrity. Evidence has suggested that supplementary zinc might ameliorate this damage by reducing gastrointestinal inflammation and may also benefit cognitive development. Objective: We tested whether zinc supplementation improves intestinal integrity, growth, and cognitive function in stunted children predicted to have EED. Methodology: This case-control prospective interventional study was conducted on 120 Egyptian stunted children aged 1-10 years recruited from the nutrition clinic of the National Research Centre, and 100 age- and gender-matched healthy children as controls. In the primary phase of the study, full history taking, clinical examination, and anthropometric measurements were done, and the standard deviation score (SDS) for all measurements was calculated. Serum markers such as zonulin, endotoxin core antibody (EndoCab), highly sensitive C-reactive protein (hsCRP), alpha-1-acid glycoprotein (AGP), and tumor necrosis factor (TNF), and fecal markers such as myeloperoxidase (MPO), neopterin (NEO), and alpha-1-antitrypsin (AAT) were measured as predictors of EED. Cognitive development was assessed (Bayley or Wechsler scores). Oral zinc at a dosage of 20 mg/d was supplemented to all cases, who were followed up for 6 months, after which the secondary phase of the study repeated the previous clinical, laboratory, and cognitive assessments. Results: Serum and fecal inflammatory markers were significantly higher in cases compared to controls. Zonulin (P < 0.01), EndoCab (P < 0.001), and AGP (P < 0.03) markedly decreased in cases by the end of the secondary phase. MPO, NEO, and AAT also showed a significant decline in cases by the end of the study (P < 0.001 for all).
A significant increase in mid-upper arm circumference (MUAC) (P < 0.01), weight-for-age z-score, and skinfold thicknesses (P < 0.05 for both) was detected at the end of the study, while height was not significantly affected. Cases also showed significant improvement of cognitive function in phase 2 of the study. Conclusion: The intestinal inflammatory state related to EED showed marked recovery after zinc supplementation. As a result, anthropometric and cognitive parameters showed obvious improvement with zinc supplementation.
Keywords: stunting, cognitive function, environmental enteric dysfunction, zinc
Procedia PDF Downloads 190
9014 Shuffled Structure for 4.225 GHz Antireflective Plates: A Proposal Proven by Numerical Simulation
Authors: Shin-Ku Lee, Ming-Tsu Ho
Abstract:
A newly proposed antireflective selector with a shuffled structure is reported in this paper. The proposed idea is made of two different quarter-wavelength (QW) slabs and is numerically supported by one-dimensional simulation results from the method of characteristics (MOC) to function as an antireflective selector. The two QW slabs are characterized by dielectric constants εᵣA and εᵣB and are uniformly divided into N and N+1 pieces respectively, which are then shuffled to form an antireflective plate with a B(AB)ᴺ structure, such that there is always one εᵣA piece between two εᵣB pieces. The other is an A(BA)ᴺ structure, where every εᵣB piece is sandwiched by two εᵣA pieces. Both proposed structures are numerically proved to function as QW plates. In order to allow maximum transmission through the proposed structures, the two dielectric constants are chosen to satisfy (εᵣA)² = εᵣB > 1. The advantage of the proposed structures over traditional anti-reflection coating techniques is that only two materials with two thicknesses are needed, shuffled to form new QW structures. The design wavelength used to validate the proposed idea is 71 mm, corresponding to a frequency of about 4.225 GHz. The computational results are shown in both time and frequency domains, revealing that the proposed structures produce minimum reflections around the frequency of interest.
Keywords: method of characteristics, quarter wavelength, anti-reflective plate, propagation of electromagnetic fields
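The reflectance of a layered dielectric stack like the one above can be computed with the standard characteristic (transfer) matrix method rather than the paper's MOC time-domain simulation; this is a generic sketch, and the material values (εᵣA = 2.25, N = 3) are made-up illustrations of the (εᵣA)² = εᵣB relation, not the paper's design.

```python
import cmath, math

def reflectance(layers, wavelength, n_in=1.0, n_out=1.0):
    """Normal-incidence reflectance of a lossless dielectric stack via the
    characteristic-matrix method. layers = [(refractive_index, thickness), ...]."""
    M = [[1.0, 0.0], [0.0, 1.0]]
    for n, d in layers:
        delta = 2 * math.pi * n * d / wavelength
        m = [[cmath.cos(delta), 1j * cmath.sin(delta) / n],
             [1j * n * cmath.sin(delta), cmath.cos(delta)]]
        M = [[M[0][0]*m[0][0] + M[0][1]*m[1][0], M[0][0]*m[0][1] + M[0][1]*m[1][1]],
             [M[1][0]*m[0][0] + M[1][1]*m[1][0], M[1][0]*m[0][1] + M[1][1]*m[1][1]]]
    num = n_in * M[0][0] + n_in * n_out * M[0][1] - M[1][0] - n_out * M[1][1]
    den = n_in * M[0][0] + n_in * n_out * M[0][1] + M[1][0] + n_out * M[1][1]
    return abs(num / den) ** 2

wl = 0.071  # 71 mm design wavelength (~4.225 GHz)

# Sanity check: a single quarter-wave layer with n = sqrt(n_substrate) is the
# classic antireflection coating, giving zero reflectance at the design frequency.
R = reflectance([(2.0, wl / (4 * 2.0))], wl, n_in=1.0, n_out=4.0)

# Shuffled B(AB)^N plate in air: the A slab is split into N pieces and the
# B slab into N+1 pieces, with n_B = n_A^2 (i.e. eps_B = eps_A^2).
nA = 1.5
nB = nA ** 2
N = 3
tA = wl / (4 * nA) / N        # each A piece
tB = wl / (4 * nB) / (N + 1)  # each B piece
stack = [(nB, tB)] + [(nA, tA), (nB, tB)] * N
R_stack = reflectance(stack, wl)
```

The same `reflectance` call, swept over wavelength, reproduces the kind of frequency-domain reflection curves the paper reports.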
Procedia PDF Downloads 146
9013 The Co-Simulation Interface SystemC/Matlab Applied in JPEG and SDR Application
Authors: Walid Hassairi, Moncef Bousselmi, Mohamed Abid
Abstract:
Functional verification is a major part of today's system design task. Several approaches are available for verification on a high abstraction level, where designs are often modeled using MATLAB/Simulink. However, the different approaches are a barrier to a unified verification flow. In this paper, we propose a co-simulation interface between SystemC and MATLAB/Simulink to enable functional verification of multi-abstraction-level designs. The resulting verification flow is tested on the JPEG compression algorithm. The required synchronization of both simulation environments, as well as data type conversion, is solved using the proposed co-simulation flow. We divided the JPEG encoder into two parts. The first, the DCT, is implemented in SystemC and represents the HW part. The second, consisting of quantization and entropy encoding, is implemented in MATLAB and represents the SW part. For communication and synchronization between these two parts, we use an S-Function and the MATLAB Engine in Simulink. With this research premise, this study introduces a new SystemC hardware implementation of the DCT. We compare the results of our co-simulation with a pure SW/SW implementation. We observe a reduction in simulation time of 88.15% in JPEG, and the design efficiency of the resulting design is 90% in SDR.
Keywords: hardware/software co-design, co-simulation, SystemC, MATLAB, S-Function, communication, synchronization
Procedia PDF Downloads 405
9012 Integrating Neural Linguistic Programming with Exergaming
Authors: Shyam Sajan, Kamal Bijlani
Abstract:
The widespread effects of digital media help people to explore the world more and get entertained with no effort, and people have become fond of this kind of sedentary lifestyle. The increase in sedentary time and the decrease in physical activity have negative impacts on human health. Even though the addiction to video games has been exploited in exergames to make people exercise and enjoy game challenges, the contribution is restricted only to physical wellness. This paper proposes the creation and implementation of a game with the help of digital media in a virtual environment. The game is designed by combining ideas from neural linguistic programming and the Stroop effect, and can also be used to identify a person's mental state, to improve concentration, and to eliminate various phobias. The multiplayer game is played in a virtual environment created with a Kinect sensor, to make the game more motivating and interactive.
Keywords: exergaming, Kinect sensor, neural linguistic programming, Stroop effect
Procedia PDF Downloads 436
9011 Constructing the Joint Mean-Variance Regions for Univariate and Bivariate Normal Distributions: Approach Based on the Measure of Cumulative Distribution Functions
Authors: Valerii Dashuk
Abstract:
The usage of confidence intervals in economics and econometrics is widespread. To investigate a random variable more thoroughly, joint tests are applied; one such example is the joint mean-variance test. A new approach for testing such hypotheses and constructing confidence sets is introduced. Exploring both the value of the random variable and its deviation with the help of this technique allows checking simultaneously the shift and the probability of that shift (i.e., portfolio risks). Another application is based on the normal distribution, which is fully defined by mean and variance and therefore can be tested using the introduced approach. The method is based on the difference of probability density functions. The starting point is two sets of normal distribution parameters that should be compared (whether they may be considered identical at a given significance level). Then the absolute difference in probabilities at each 'point' of the domain of these distributions is calculated. This measure is transformed into a function of cumulative distribution functions and compared to critical values. The critical values table was designed from simulations. The approach was compared with other techniques for the univariate case. It differs qualitatively and quantitatively in ease of implementation, computation speed, and accuracy of the critical region (theoretical vs. real significance level). Stable results when working with outliers and non-normal distributions, as well as scaling possibilities, are also strong sides of the method. The main advantage of this approach is the possibility to extend it to the infinite-dimensional case, which was not possible in most previous works. At the moment, the expansion to the two-dimensional case is done, and it allows testing jointly up to 5 parameters.
Therefore, the derived technique is equivalent to classic tests in standard situations but gives more efficient alternatives in nonstandard problems and on large amounts of data.
Keywords: confidence set, cumulative distribution function, hypotheses testing, normal distribution, probability density function
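The core quantity the abstract describes, the absolute difference in probabilities between two candidate normal densities aggregated over their domain, can be sketched with a simple midpoint-rule integration. This is only an illustration of the underlying measure; the paper's actual transformation to CDFs and its simulated critical values are not reproduced here.

```python
import math

def norm_pdf(x, mu, sigma):
    """Density of a normal distribution with mean mu and std sigma."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def abs_density_difference(mu1, s1, mu2, s2, grid=20000):
    """Numerically integrate |f1 - f2| over a range covering both supports;
    this equals twice the total variation distance between the two normals."""
    lo = min(mu1 - 8 * s1, mu2 - 8 * s2)
    hi = max(mu1 + 8 * s1, mu2 + 8 * s2)
    h = (hi - lo) / grid
    return sum(abs(norm_pdf(lo + (i + 0.5) * h, mu1, s1)
                   - norm_pdf(lo + (i + 0.5) * h, mu2, s2)) * h
               for i in range(grid))

d_same = abs_density_difference(0.0, 1.0, 0.0, 1.0)  # identical parameters: 0
d_diff = abs_density_difference(0.0, 1.0, 1.0, 2.0)  # shifted mean and variance
```

A joint mean-variance test in this spirit would compare such a statistic against critical values tabulated by simulation, rejecting identity when the measure is large.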
Procedia PDF Downloads 174
9010 A Study of Topical and Similarity of Sebum Layer Using Interactive Technology in Image Narratives
Authors: Chao Wang
Abstract:
Under the rapid innovation of information technology, the media plays a very important role in the dissemination of information, and each generation faces a totally different analogy. The involvement of narrative images, however, provides more possibilities for narrative text. "Images" are manufactured through the processes of aperture, camera shutter, and developable photosensitive material, recorded and stamped on paper or displayed on a computer screen, concretely saved. They exist in different forms of files, data, or evidence as the ultimate looks of events. Through the interface of media and network platforms and the special visual field of the viewer, a class body space exists and extends out as thin as a sebum layer, extremely soft and delicate, with real full tension. The physical space of the sebum layer confuses the fact that physical objects exist and needs to be established under a perceived consensus; as at the scene, the existing concepts and boundaries of physical perception are blurred. The physical simulation of the sebum layer shapes the "topical-similarity" immersion, leading contemporary social practice communities, groups, and network users into a kind of illusion without presence, i.e., a non-real illusion. From the investigation and discussion of the literature, the variability characteristics of time produced by digital movie editing (for example, slicing, rupture, setting, and resetting) are analyzed. The interactive eBook has a unique interaction in "Waiting-Greeting" and "Expectation-Response" that makes the operation of the image narrative structure more functionally interpretable. The works of digital editing and interactive technology are combined to further analyze concepts and results. After the digitization of interventional imaging and interactive technology, real events exist as linked, and the media handling cannot cut this relationship, as shown through movies, interactive art, and practical case discussion and analysis.
The audience needs more rational thinking about the authenticity of the text carried by images.
Keywords: sebum layer, topical and similarity, interactive technology, image narrative
Procedia PDF Downloads 389
9009 Improvement of Process Competitiveness Using Intelligent Reference Models
Authors: Julio Macedo
Abstract:
Several methodologies are now available to conceive the improvements of a process so that it becomes competitive, for example total quality, process reengineering, six sigma, and the define-measure-analyze-improve-control method. These improvements are of different natures and can be external to the process, represented by an optimization model or a discrete simulation model. In addition, the process stakeholders are several and have different desired performances for the process. Hence, the methodologies above do not offer a tool to aid in the conception of the required improvements. In order to fill this void, we suggest the use of intelligent reference models. A reference model is a set of qualitative differential equations and an objective function that minimizes the gap between the current and the desired performance indexes of the process. The reference models are intelligent, so when they receive the current state of the problematic process and the desired performance indexes, they generate the required improvements for that process. The reference models are fuzzy cognitive maps augmented with an objective function and trained using the improvements implemented by high-performance firms. Experiments with a group of students show that the reference models allow them to conceive more improvements than students who do not use these models.
Keywords: continuous improvement, fuzzy cognitive maps, process competitiveness, qualitative simulation, system dynamics
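The fuzzy cognitive map at the heart of the reference models can be sketched as a simple iterated update: each concept aggregates the weighted influence of the others and is squashed into [0, 1]. The three-concept map and its weights below are made up for illustration; they are not the trained maps from the paper.

```python
import math

def fcm_step(state, weights):
    """One update of a fuzzy cognitive map: concept i receives
    sum_j weights[j][i] * state[j], squashed by a sigmoid into [0, 1]."""
    sig = lambda x: 1.0 / (1.0 + math.exp(-x))
    n = len(state)
    return [sig(sum(weights[j][i] * state[j] for j in range(n)))
            for i in range(n)]

def fcm_run(state, weights, steps=50):
    """Iterate the map until (in practice) it settles to a fixed point."""
    for _ in range(steps):
        state = fcm_step(state, weights)
    return state

# Illustrative 3-concept map: concept 0 reinforces 1 (weight 0.8),
# 1 reinforces 2 (0.6), and 2 inhibits 0 (-0.5).
W = [[0.0, 0.8, 0.0],
     [0.0, 0.0, 0.6],
     [-0.5, 0.0, 0.0]]
final = fcm_run([0.9, 0.1, 0.1], W)
```

In the paper's setting, the trained weights encode the causal structure learnt from high-performance firms, and an objective function on the converged concept values measures the gap to the desired performance indexes.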
Procedia PDF Downloads 87
9008 Comparative Analysis of Islamic Bank in Indonesia and Malaysia with Risk Profile, Good Corporate Governance, Earnings, and Capital Method: Performance of Business Function and Social Function Perspective
Authors: Achsania Hendratmi, Nisful Laila, Fatin Fadhilah Hasib, Puji Sucia Sukmaningrum
Abstract:
This study aims to compare and identify the differences between Islamic banks in Indonesia and Islamic banks in Malaysia using the RGEC method (Risk Profile, Good Corporate Governance, Earnings, and Capital). It examines the comparison in business and social performance of eleven Islamic banks in Indonesia and fifteen Islamic banks in Malaysia. The research used a quantitative approach; data were collected from the annual reports of the sampled banks over the period 2011-2015. The results of the Independent Samples T-test and Mann-Whitney test showed that there were differences in the business performance of Islamic banks in Indonesia and Malaysia as seen from the aspects of risk profile (FDR), GCG, and earnings (ROA). There were also differences in business and social performance as seen from the earnings (ROE), capital (CAR), and sharia conformity indicator (PSR and ZR) aspects.
Keywords: business performance, Islamic banks, RGEC, social performance
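The independent-samples comparison used above can be illustrated with a minimal Welch's t statistic, which tests whether two group means differ without assuming equal variances. The ROA figures below are hypothetical samples invented for the example, not the paper's data.

```python
import math

def welch_t(a, b):
    """Welch's independent-samples t statistic and its approximate
    (Welch-Satterthwaite) degrees of freedom."""
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    va = sum((x - ma) ** 2 for x in a) / (len(a) - 1)   # sample variances
    vb = sum((x - mb) ** 2 for x in b) / (len(b) - 1)
    se2a, se2b = va / len(a), vb / len(b)               # squared standard errors
    t = (ma - mb) / math.sqrt(se2a + se2b)
    df = (se2a + se2b) ** 2 / (se2a ** 2 / (len(a) - 1) + se2b ** 2 / (len(b) - 1))
    return t, df

# Hypothetical ROA (%) samples for two groups of banks:
roa_group_1 = [1.2, 0.8, 1.5, 1.1, 0.9, 1.3]
roa_group_2 = [0.9, 1.0, 0.7, 1.1, 0.8, 0.6]
t, df = welch_t(roa_group_1, roa_group_2)
```

The statistic is then compared against the t distribution with `df` degrees of freedom; the Mann-Whitney test mentioned in the abstract is the rank-based nonparametric counterpart used when normality is doubtful.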
Procedia PDF Downloads 294
9007 Physical, Chemical and Mineralogical Characterization of Construction and Demolition Waste Produced in Greece
Authors: C. Alexandridou, G. N. Angelopoulos, F. A. Coutelieris
Abstract:
The construction industry in Greece consumes annually more than 25 million tons of natural aggregates, originating mainly from quarries. At the same time, more than 2 million tons of construction and demolition waste are deposited every year, usually without control, increasing the environmental impact of this sector. A potential alternative for saving natural resources and minimizing landfilling could be the recycling and re-use of Construction and Demolition Waste (CDW) in concrete production. Moreover, in order to conform to European legislation, Greece is obliged to recycle non-hazardous construction and demolition waste to a minimum of 70% by 2020. In this paper, characterization of recycled materials (commercially and laboratory produced, coarse and fine Recycled Concrete Aggregates, RCA) has been performed. Namely, X-ray fluorescence and X-ray diffraction (XRD) analyses were used for chemical and mineralogical analysis respectively. Physical properties such as particle density, water absorption, sand equivalent, and resistance to fragmentation were also determined. This study, the first of its kind in Greece, aims at outlining the differences between RCA and natural aggregates and evaluating their possible influence on concrete performance. Results indicate that RCA's chemical composition is enriched in Si, Al, and alkali oxides compared to natural aggregates. The XRD analyses indicated the presence of calcite, quartz, and minor peaks of mica and feldspars. Of all the evaluated physical properties of coarse RCA, only water absorption and resistance to fragmentation seem to have a direct influence on the properties of concrete. Low sand equivalent and significantly high water absorption values indicate that fine fractions of RCA cannot be used for concrete production unless further processed. The chemical properties of RCA in terms of water-soluble ions are similar to those of natural aggregates.
Four different concrete mixtures were produced and examined, replacing natural coarse aggregates with RCA at ratios of 0%, 25%, 50%, and 75% respectively. Results indicate that concrete mixtures containing recycled concrete aggregates show a minor deterioration of their properties (3-9% lower compressive strength at 28 days) compared to conventional concrete containing the same cement quantity.
Keywords: chemical and physical characterization, compressive strength, mineralogical analysis, recycled concrete aggregates, waste management
Procedia PDF Downloads 234
9006 Local Spectrum Feature Extraction for Face Recognition
Authors: Muhammad Imran Ahmad, Ruzelita Ngadiran, Mohd Nazrin Md Isa, Nor Ashidi Mat Isa, Mohd Zaizu Ilyas, Raja Abdullah Raja Ahmad, Said Amirul Anwar Ab Hamid, Muzammil Jusoh
Abstract:
This paper presents two techniques, local feature extraction using the image spectrum and low-frequency spectrum modelling using GMM, to capture the underlying statistical information and improve the performance of a face recognition system. Local spectrum features are extracted using overlapping sub-block windows mapped onto the face image. For each block, the spatial domain is transformed to the frequency domain using the DFT. Low-frequency coefficients are preserved by discarding high-frequency coefficients, applying a rectangular mask on the spectrum of the facial image. Low-frequency information is non-Gaussian in the feature space, and by using a combination of several Gaussian functions with different statistical properties, the best feature representation can be modelled using a probability density function. The recognition process is performed using the maximum likelihood value computed from pre-calculated GMM components. The method is tested using the FERET data sets and achieves 92% recognition rates.
Keywords: local features modelling, face recognition system, Gaussian mixture models, FERET
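The maximum likelihood recognition step described above can be sketched in one dimension: each class (person) has a pre-trained Gaussian mixture, and a test sample is assigned to the class whose mixture gives it the highest log-likelihood. The mixtures and feature values below are toy stand-ins, not trained FERET models.

```python
import math

def gmm_pdf(x, components):
    """Density of a 1-D Gaussian mixture; components = [(weight, mean, std), ...]."""
    return sum(w * math.exp(-0.5 * ((x - m) / s) ** 2) / (s * math.sqrt(2 * math.pi))
               for w, m, s in components)

def classify(features, class_gmms):
    """Pick the class whose GMM gives the highest log-likelihood of the features."""
    def loglik(gmm):
        return sum(math.log(gmm_pdf(x, gmm)) for x in features)
    return max(class_gmms, key=lambda name: loglik(class_gmms[name]))

# Toy per-class mixtures standing in for trained low-frequency spectrum models:
gmms = {"person_A": [(0.5, 0.0, 1.0), (0.5, 3.0, 1.0)],
        "person_B": [(0.5, 6.0, 1.0), (0.5, 9.0, 1.0)]}
label = classify([0.2, 2.8, 3.1], gmms)
```

In the paper's pipeline the features would be the masked low-frequency DFT coefficients of each sub-block, and the mixtures would be fitted per subject with EM before this scoring step.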
Procedia PDF Downloads 667
9005 Deepnic, A Method to Transform Each Variable into Image for Deep Learning
Authors: Nguyen J. M., Lucas G., Brunner M., Ruan S., Antonioli D.
Abstract:
Deep learning based on convolutional neural networks (CNN) is a very powerful technique for classifying information from an image. We propose a new method, DeepNic, to transform each variable of a tabular dataset into an image where each pixel represents a set of conditions that allow the variable to make an error-free prediction. The contrast of each pixel is proportional to its prediction performance, and the color of each pixel corresponds to a sub-family of NICs. NICs are probabilities that depend on the number of inputs to each neuron and the range of coefficients of the inputs. Each variable can therefore be expressed as a function of a matrix of 2 vectors corresponding to an image whose pixels express predictive capabilities. Our objective is to transform each variable of tabular data into an image that can be analysed by CNNs, unlike other methods which use all the variables to construct an image. We analyse the NIC information of each variable and express it as a function of the number of neurons and the range of coefficients used. The predictive value and the category of the NIC are expressed by the contrast and the color of the pixel. We have developed a pipeline to implement this technology and have successfully applied it to genomic expressions on an Affymetrix chip.
Keywords: tabular data, deep learning, perfect trees, NICs
Procedia PDF Downloads 90
9004 Parameter Identification Analysis in the Design of Rock Fill Dams
Authors: G. Shahzadi, A. Soulaimani
Abstract:
This research work aims to identify the physical parameters of the constitutive soil model in the design of a rockfill dam by inverse analysis. The best parameters of the constitutive soil model are those that minimize the objective function, defined as the difference between the measured and numerical results. The finite element code Plaxis has been utilized for numerical simulation. Polynomial and neural-network-based response surfaces have been generated to analyze the relationship between soil parameters and displacements, and the performance of the surrogate models has been analyzed and compared by evaluating the root mean square error. A comparative study has been done based on objective functions and optimization techniques. Objective functions are categorized by considering measured data with and without uncertainty in instruments, defined by the least squares method, which estimates the norm between the predicted displacements and the measured values. Hydro-Québec provided the data sets for the measured values of the Romaine-2 dam. Stochastic optimization, an approach that can overcome local minima and solve non-convex and non-differentiable problems with ease, is used to obtain an optimum value. Genetic Algorithm (GA), Particle Swarm Optimization (PSO), and Differential Evolution (DE) were compared for the minimization problem; although all these techniques take time to converge to an optimum value, PSO provided the best convergence and the best soil parameters. Overall, parameter identification analysis can be effectively used for the rockfill dam application and has the potential to become a valuable tool for geotechnical engineers for assessing dam performance and dam safety.
Keywords: rockfill dam, parameter identification, stochastic analysis, regression, Plaxis
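The PSO minimization favored above can be sketched generically: particles explore the parameter space while tracking their personal best and the swarm's global best. The objective below is a toy squared misfit standing in for the displacement-based objective function; the bounds, coefficients, and "measured" values are illustrative, not the Romaine-2 setup.

```python
import random

def pso(objective, bounds, n_particles=20, iters=100,
        w=0.7, c1=1.5, c2=1.5, seed=1):
    """Minimal particle swarm optimization minimizing `objective` over `bounds`."""
    rng = random.Random(seed)
    dim = len(bounds)
    pos = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]                       # personal bests
    pbest_val = [objective(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]      # global best
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                vel[i][d] = (w * vel[i][d]
                             + c1 * rng.random() * (pbest[i][d] - pos[i][d])
                             + c2 * rng.random() * (gbest[d] - pos[i][d]))
                pos[i][d] = min(max(pos[i][d] + vel[i][d],
                                    bounds[d][0]), bounds[d][1])
            val = objective(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

# Stand-in misfit: squared difference between "measured" and predicted values.
measured = [2.0, 5.0]
misfit = lambda p: sum((m - x) ** 2 for m, x in zip(measured, p))
params, err = pso(misfit, bounds=[(0.0, 10.0)] * 2)
```

In the inverse-analysis workflow, `objective` would call the response surface (or Plaxis itself) and return the least-squares norm between predicted and measured dam displacements.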
Procedia PDF Downloads 146
9003 Generalized Extreme Value Regression with Binary Dependent Variable: An Application for Predicting Meteorological Drought Probabilities
Authors: Retius Chifurira
Abstract:
The logistic regression model is the most used regression model to predict meteorological drought probabilities. When the dependent variable is extreme, the logistic model fails to adequately capture drought probabilities. In order to adequately predict drought probabilities, we use the generalized linear model (GLM) with the quantile function of the generalized extreme value distribution (GEVD) as the link function. Maximum likelihood estimation is used to estimate the parameters of the generalized extreme value (GEV) regression model. We compare the performance of the logistic and the GEV regression models in predicting drought probabilities for Zimbabwe. The performance of the regression models is assessed using goodness-of-fit tests, namely the relative root mean square error (RRMSE) and the relative mean absolute error (RMAE). Results show that the GEV regression model performs better than the logistic model, thereby providing a good alternative candidate for predicting drought probabilities. This paper provides the first application of a GLM derived from extreme value theory to predict drought probabilities for a drought-prone country such as Zimbabwe.
Keywords: generalized extreme value distribution, general linear model, mean annual rainfall, meteorological drought probabilities
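Using the GEVD quantile function as the link means the inverse link (the map from the linear predictor to a probability) is the GEV CDF, playing the role the logistic CDF plays in ordinary logistic regression. A sketch with standardized location and scale and an arbitrary shape parameter xi = 0.1 (the fitted coefficients and shape from the paper are not reproduced here):

```python
import math

def gev_cdf(x, mu=0.0, sigma=1.0, xi=0.1):
    """CDF of the generalized extreme value distribution (xi != 0 case)."""
    t = 1.0 + xi * (x - mu) / sigma
    if t <= 0:
        # Outside the support: below the lower endpoint for xi > 0,
        # above the upper endpoint for xi < 0.
        return 0.0 if xi > 0 else 1.0
    return math.exp(-t ** (-1.0 / xi))

def drought_probability(linear_predictor, xi=0.1):
    """Inverse link of a GEV-link GLM: map eta = X*beta to a probability."""
    return gev_cdf(linear_predictor, 0.0, 1.0, xi)

p_low = drought_probability(-2.0)   # unfavorable covariates: small probability
p_high = drought_probability(2.0)   # favorable covariates: large probability
```

Unlike the symmetric logistic curve, the GEV inverse link is asymmetric, which is what lets it track rare (extreme) binary outcomes better.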
Procedia PDF Downloads 200
9002 Cloning and Expression of Azurin: A Protein Having Antitumor and Cell Penetrating Ability
Authors: Mohsina Akhter
Abstract:
Cancer has become a widespread disease around the globe and takes many lives every year. Different treatments are being practiced, but all have potential side effects and somewhat limited specificity towards target sites. Pseudomonas aeruginosa is known to secrete a protein, azurin, with a special anti-cancer function. It has a unique cell-penetrating peptide comprising 18 amino acids that has the ability to enter cancer cells specifically. The reported function of azurin is to stabilize p53 inside tumor cells and induce apoptosis through Bax-mediated cytochrome c release from mitochondria. At laboratory scale, we have made recombinant azurin by cloning the rpTZ57R/T-azu vector into E. coli strain DH-5α and subcloning the rpET28-azu vector into E. coli BL21-CodonPlus (DE3). High expression was ensured with IPTG induction at different concentrations, and the expression level was then optimized at a 1 mM concentration of IPTG for 5 hours. Purification has been done using Ni+2 affinity chromatography. We conclude that azurin could be a remarkable improvement in cancer therapeutics if it is produced on a large scale. Azurin does not enter normal cells, so it should prove a safe and secure treatment for patients and protect them from hazardous anomalies.
Keywords: azurin, Pseudomonas aeruginosa, cancer, therapeutics
Procedia PDF Downloads 311
9001 Planning for Brownfield Regeneration in Malaysia: An Integrated Approach in Creating Sustainable Ex-Landfill Redevelopment
Authors: Mazifah Simis, Azahan Awang, Kadir Arifin
Abstract:
Brownfield regeneration is being implemented in developed countries. As a group 1 developing country in South East Asia, however, Malaysia's rapid development and increasing urban population have urged the need to incorporate brownfield regeneration into its physical planning development. The increasing number of urban ex-landfills is seen as a new resource that could overcome the issue of inadequate urban green space provision. With regard to this new development approach in urban planning, this perception study aims to identify a sustainable planning approach based on what the stakeholders have in mind. Respondents consisted of 375 members of local communities within four urban ex-landfill areas and 61 landscape architect and town planner officers in Malaysian local authorities. Three main objectives are set to be achieved: (i) to identify ex-landfill issues that need to be overcome prior to ex-landfill redevelopment, (ii) to identify the most suitable types of ex-landfill redevelopment, and (iii) to identify the priority function for ex-landfill redevelopment as public parks. From the data gathered through the survey method, the order of priorities based on stakeholders' perception was produced. The results show different perceptions among the stakeholders, but they agreed to the development of the public park as the main development. Hence, this study attempts to produce an integrated approach as a model for sustainable ex-landfill redevelopment that could be accepted by the stakeholders as a beneficial future development that could change the image of 296 ex-landfills in Malaysia into urban public parks by the year 2020.
Keywords: brownfield regeneration, ex-landfill redevelopment, integrated approach, stakeholders' perception
Procedia PDF Downloads 352
9000 Influences of Separation of the Boundary Layer in the Reservoir Pressure in the Shock Tube
Authors: Bruno Coelho Lima, Joao F.A. Martos, Paulo G. P. Toro, Israel S. Rego
Abstract:
The shock tube is a ground facility widely used in aerospace and aeronautics science and technology for studies of gas dynamic and chemical-physical processes in gases at high temperature, explosions, and the dynamic calibration of pressure sensors. A shock tube in its simplest form comprises two tubes of equal cross-section separated by a diaphragm. The diaphragm's function is to separate the two reservoirs at different pressures: the high-pressure reservoir is called the driver, and the low-pressure reservoir is called the driven section. When the diaphragm is broken by the pressure difference, a normal, non-stationary shock wave (named the incident shock wave) is formed at the diaphragm location and propagates toward the closed end of the driven section. When this shock wave reaches the closed end of the driven section, it is completely reflected. The reflected shock wave then interacts with the boundary layer created by the flow induced by the incident shock wave's passage, and this interaction forces the separation of the boundary layer. The aim of this paper is to analyse the influences of the separation of the boundary layer on the reservoir pressure in the shock tube. A comparison among CFD (Computational Fluid Dynamics), experimental tests, and analytical analysis was performed. For the analytical analysis, routines in Python were created; for the numerical simulations (CFD), Ansys Fluent was used; and for the experimental tests, the T1 shock tube located at IEAv (Institute of Advanced Studies) was used.
Keywords: boundary layer separation, moving shock wave, shock tube, transient simulation
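The incident shock the abstract describes is governed, in the ideal inviscid case, by the classic shock tube relation linking the diaphragm pressure ratio p4/p1 to the incident shock Mach number Ms. A sketch solving it by bisection, assuming the same ideal gas (gamma = 1.4) at the same temperature on both sides; this is a generic textbook relation, not the paper's Python routines:

```python
def shock_tube_mach(p4_over_p1, gamma=1.4, a1_over_a4=1.0):
    """Solve the ideal (inviscid, instant-rupture) shock tube equation for the
    incident shock Mach number Ms, given the diaphragm pressure ratio p4/p1."""
    g = gamma
    def p_ratio(ms):
        # Pressure jump across the incident shock ...
        strength = 1.0 + 2.0 * g / (g + 1.0) * (ms * ms - 1.0)
        # ... times the expansion-fan term raised to -2g/(g-1).
        expansion = 1.0 - (g - 1.0) / (g + 1.0) * a1_over_a4 * (ms - 1.0 / ms)
        return strength * expansion ** (-2.0 * g / (g - 1.0))
    # p_ratio is monotone increasing on (1, ~6.16) for gamma = 1.4; bisect there.
    lo, hi = 1.0, 6.0
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if p_ratio(mid) < p4_over_p1:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

ms = shock_tube_mach(10.0)  # diaphragm pressure ratio p4/p1 = 10
```

Viscous effects such as the boundary layer separation studied in the paper make the measured reservoir pressure depart from this ideal prediction, which is exactly the gap the CFD and experiments quantify.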
Procedia PDF Downloads 315
8999 Dynamic Process Model for Designing Smart Spaces Based on Context-Awareness and Computational Methods Principles
Authors: Heba M. Jahin, Ali F. Bakr, Zeyad T. Elsayad
Abstract:
A smart space can be defined as any working environment that integrates embedded computers, information appliances, and multi-modal sensors to remain focused on the interaction between the users, their activities, and their behavior in the space. A smart space must therefore be aware of its context and automatically adapt to contextual change, both by interacting with its physical environment through natural and multimodal interfaces and by serving information proactively. This paper suggests a dynamic framework for the architectural design process of the space, based on the principles of computational methods and context-awareness, to help create a field of changes and modifications: it generates possibilities and concerns about the physical, structural, and user contexts. The framework comprises five main processes: gathering and analyzing data to generate smart design scenarios, parameters, and attributes; transforming these by coding into four types of models; connecting those models together in an interaction model that represents the context-awareness system; transforming that model into a virtual and ambient environment representing the physical and real environments, to act as a linkage between the users and the activities taking place in the smart space; and, finally, a feedback phase from the users of that environment, to ensure the design of the smart space fulfills their needs. The generated design process will thus help in designing smart spaces that can be adapted and controlled to answer the users' defined goals, needs, and activities.
Keywords: computational methods, context-awareness, design process, smart spaces
Procedia PDF Downloads 331
8998 Effect of High Temperature on Residual Mechanical and Physical Properties of Brick Aggregate Concrete
Authors: Samia Hachemi, Abdelhafid Ounis, W. Heriheri
Abstract:
This paper presents an experimental investigation of the effect of high temperatures on normal and high-performance concrete made with natural coarse aggregates. The experimental results for physical and mechanical properties were compared with those obtained by replacing 30% of the natural coarse aggregates with recycled brick aggregates. The following parameters were examined: compressive strength, concrete mass loss, apparent density, and water porosity. The results show that concrete can be produced using recycled brick aggregates and reveal that, at high temperatures, the recycled aggregate concrete performed similarly to or even better than the natural aggregate concrete.
Keywords: high temperature, compressive strength, mass loss, recycled brick aggregate
Procedia PDF Downloads 246
8997 Date Palm Compreg: A High Quality Bio-Composite of Date Palm Wood
Authors: Mojtaba Soltani, Edi Suhaimi Bakar, Hamid Reza Naji
Abstract:
Date palm wood (DPW) specimens were impregnated with phenol formaldehyde (PF) resin at a 15% level using the vacuum/pressure method. Three levels of moisture content (MC) before the pressing stage (50%, 60%, and 70%) and three hot-pressing times (15, 20, and 30 minutes) were the variables. The boards were prepared at a 20% compression rate. The physical properties of the specimens, such as springback, thickness swelling, and water absorption, and the mechanical properties, including MOR and MOE, were studied and compared across the variables. The results indicated that the MC level before the compression set was the main factor affecting the properties of the date palm Compreg. The results also showed that this compregnation method can be used to make a high-quality bio-composite from date palm wood.
Keywords: date palm, phenol formaldehyde resin, high-quality bio-composite, physical and mechanical properties
Procedia PDF Downloads 351
8996 Hybrid Gravity Gradient Inversion-Ant Colony Optimization Algorithm for Motion Planning of Mobile Robots
Authors: Meng Wu
Abstract:
Motion planning is a common task that robots are required to fulfill. A strategy combining Ant Colony Optimization (ACO) and a gravity gradient inversion algorithm is proposed for the motion planning of mobile robots. In this paper, in order to realize an optimal motion planning strategy, the cost function in ACO is designed based on the gravity gradient inversion algorithm. The obstacles around a mobile robot cause gravity gradient anomalies, and a gradiometer is installed on the robot to detect them. After the anomalies are obtained, the gravity gradient inversion algorithm is employed to calculate the relative distance and orientation between the mobile robot and the obstacles. The relative distance and orientation deduced from the gravity gradient inversion algorithm are then used in the ACO cost function to realize motion planning. The proposed strategy is validated by simulation and experimental results.
Keywords: motion planning, gravity gradient inversion algorithm, ant colony optimization
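As an illustration of the strategy described above (a toy sketch only, not the paper's implementation), the following minimal ACO searches a small grid for a path whose cost penalizes proximity to obstacles; the obstacle-distance term stands in for the distances the paper derives from gravity gradient inversion, and all parameters (grid size, evaporation rate, ant count) are illustrative assumptions.

```python
import math, random

random.seed(1)
N = 8
start, goal = (0, 0), (N - 1, N - 1)
obstacles = {(3, 3), (3, 4), (4, 3)}

def neighbors(c):
    x, y = c
    for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
        nx, ny = x + dx, y + dy
        if 0 <= nx < N and 0 <= ny < N and (nx, ny) not in obstacles:
            yield (nx, ny)

def cost(path):
    # Path length plus a penalty that grows as cells get close to obstacles,
    # playing the role of the gravity-gradient-derived obstacle distances.
    penalty = sum(1.0 / (1e-3 + min(math.dist(c, o) for o in obstacles)) for c in path)
    return len(path) + 0.5 * penalty

pheromone = {}  # edge -> pheromone level

def walk():
    path, seen = [start], {start}
    while path[-1] != goal and len(path) < 4 * N * N:
        opts = [n for n in neighbors(path[-1]) if n not in seen] or list(neighbors(path[-1]))
        # Transition probability ~ pheromone * heuristic (inverse distance to goal).
        w = [pheromone.get((path[-1], n), 1.0) / (1.0 + math.dist(n, goal)) for n in opts]
        nxt = random.choices(opts, weights=w)[0]
        path.append(nxt); seen.add(nxt)
    return path

best = None
for _ in range(60):                      # colony iterations
    paths = [walk() for _ in range(20)]  # ants per iteration
    for e in pheromone:                  # evaporation
        pheromone[e] *= 0.9
    for p in paths:
        if p[-1] != goal:
            continue
        if best is None or cost(p) < cost(best):
            best = p
        for a, b in zip(p, p[1:]):       # deposit inversely proportional to cost
            pheromone[(a, b)] = pheromone.get((a, b), 1.0) + 10.0 / cost(p)
```

In the paper's setting, `cost` would be evaluated from the inverted gradiometer measurements rather than from known obstacle positions.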
Procedia PDF Downloads 137
8995 Optimizing the Public Policy Information System under the Environment of E-Government
Authors: Qian Zaijian
Abstract:
E-government is one of the hot issues in current academic research on public policy and management. As the organic integration of information and communication technology (ICT) and public administration, e-government is one of the most important areas in the contemporary information society. The policy information system is a basic subsystem of the public policy system; its operation affects the overall effect of the policy process and can exert a direct impact on the operation of a public policy and its success or failure. The basic principle of its operation is the collection, processing, analysis, and release of information for a specific purpose. The function of e-government for the public policy information system lies in promoting public access to policy information resources: information transmission through e-participation, e-consultation in the process of policy analysis and information processing, and electronic services for stored policy information, all of which promote the optimization of policy information systems. However, due to many factors, the capacity of e-government to promote policy information system optimization has practical limits. In building e-government in China, we should adhere to the principle of freedom of information, eliminate the information divide (gap), expand e-consultation, and break down information silos, so as to promote the optimization of public policy information systems.
Keywords: China, e-consultation, e-democracy, e-government, e-participation, ICTs, public policy information systems
Procedia PDF Downloads 865
8994 Multiscale Syntheses of Knee Collateral Ligament Stresses: Aggregate Mechanics as a Function of Molecular Properties
Authors: Raouf Mbarki, Fadi Al Khatib, Malek Adouni
Abstract:
Knee collateral ligaments play a significant role in restraining excessive frontal motion (varus/valgus rotations). In this investigation, a multiscale framework was developed based on the structural hierarchies of the collateral ligaments, starting from the bottom (the tropocollagen molecule) up to the fiber-reinforced structure. Experimental data from failure tensile tests were the principal driver of the developed model. The model was calibrated statistically using Bayesian calibration, owing to the high number of unknown parameters. It was then scaled up to fit the real structure of the collateral ligaments and simulated under realistic boundary conditions. Predictions were successful in describing the observed transient response of the collateral ligaments during tensile testing under pre- and post-damage loading conditions. The maximum stresses and strengths of the collateral ligaments were observed near the femoral insertions, a result that is in good agreement with experimental investigations. Also, for the first time, damage initiation and propagation were documented with this model as a function of the cross-link density between tropocollagen molecules.
Keywords: multiscale model, tropocollagen, fibrils, ligaments
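The Bayesian calibration step can be sketched in miniature (an assumption-laden toy, not the authors' multiscale model): a random-walk Metropolis sampler estimating a single stiffness parameter of a linear stress-strain model from synthetic "tensile test" data.

```python
import math, random

random.seed(0)
true_k, noise = 80.0, 2.0                   # illustrative stiffness and noise level
strains = [i / 10.0 for i in range(1, 11)]
data = [true_k * e + random.gauss(0.0, noise) for e in strains]

def log_post(k):
    # Gaussian likelihood of stress = k * strain, with a flat prior on k > 0.
    if k <= 0:
        return -math.inf
    return -sum((s - k * e) ** 2 for s, e in zip(data, strains)) / (2 * noise**2)

samples, k = [], 50.0                       # deliberately poor starting guess
for _ in range(20000):
    prop = k + random.gauss(0.0, 2.0)       # random-walk proposal
    if math.log(random.random()) < log_post(prop) - log_post(k):
        k = prop                            # accept with Metropolis probability
    samples.append(k)

posterior_mean = sum(samples[5000:]) / len(samples[5000:])  # discard burn-in
```

With many parameters, as in the ligament model, the same accept/reject logic applies but the proposal becomes a vector step, which is why the calibration burden grows quickly with model complexity.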
Procedia PDF Downloads 159
8993 Designing of Tooling Solution for Material Handling in Highly Automated Manufacturing System
Authors: Muhammad Umair, Yuri Nikolaev, Denis Artemov, Ighor Uzhinsky
Abstract:
A flexible manufacturing system is an integral part of the smart factory of Industry 4.0, in which every machine is interconnected and works autonomously. Robots are in the process of replacing humans in every industrial sector. As cyber-physical systems (CPS) and artificial intelligence (AI) advance, the manufacturing industry is becoming more dependent on computers than on human brains. This modernization has boosted production with high quality and accuracy and shifted the industry from classic production to smart manufacturing systems. However, material handling for such automated production is a challenge and needs to be addressed with the best possible solution. Conventional clamping systems are designed for manual work and are not suitable for highly automated production systems. Researchers and engineers are trying to find the most economical solution for loading/unloading and transporting workpieces from a warehouse to a machine shop for machining operations and back to the warehouse without human involvement. This work aims to propose an advanced multi-shape tooling solution for highly automated manufacturing systems. The results obtained so far show that it can function well with automated guided vehicles (AGVs) and modern conveyor belts. The proposed solution follows the requirements of being automation-friendly and universal across different part geometries and production operations. We used a bottom-up approach in this work, starting with studying different case scenarios and their limitations and finishing with the general solution.
Keywords: artificial intelligence, cyber-physical system, Industry 4.0, material handling, smart factory, flexible manufacturing system
Procedia PDF Downloads 132
8992 Effect of Low Level Laser Therapy versus Ultrasound on Musculoskeletal Conditions
Authors: Andrew Anis Fakhrey Mosaad
Abstract:
Musculoskeletal (MSK) conditions are a major contributing factor in disability. This becomes more challenging as the world witnesses an increase in the geriatric population. Various treatment strategies are being researched to provide the most effective and non-invasive approach. To date, low-level laser therapy (LLLT) is one of the emerging modalities for treating different musculoskeletal conditions in physical therapy practice. Physical therapy practice uses different modalities to control pain and inflammation. LLLT has been under research for the last two decades and has shown varying results. This literature review focuses on the effectiveness of LLLT in different musculoskeletal conditions. Using the search engines PubMed and Google Scholar, a number of articles were reviewed based on inclusion and exclusion criteria. LLLT shows promising results in treating different musculoskeletal conditions. However, clinicians need to ensure that they follow the recommended dosage parameters for specific musculoskeletal conditions.
Keywords: musculoskeletal conditions, low level laser therapy, ultrasound, wavelength, pain and inflammation
Procedia PDF Downloads 112
8991 An Application of Self-Health Risk Assessment among Populations Living in the Vicinity of a Fiber-Cement Roofing Factory
Authors: Phayong Thepaksorn
Abstract:
The objective of this study was to assess whether living in proximity to a roofing fiber cement factory in southern Thailand was associated with the physical, mental, social, and spiritual health domains measured in a self-reported health risk assessment (HRA) questionnaire. A cross-sectional study was conducted among community members divided into two groups: a near population (living within 0-2 km of the factory) and a far population (living within 2-5 km of the factory) (N=198). A greater proportion of those living far from the factory (65.34%) reported physical health problems than the near group (51.04%) (p=0.032). The near population group had higher proportions of participants with positive ratings on the mental assessment (30.34%) and social health impacts (28.42%) than the far population group (10.59% and 16.67%, respectively) (p<0.001). The near population group (29.79%) had a similar proportion of participants with positive ratings on spiritual health impacts compared with the far population group (27.08%). Among females, but not males, a higher proportion of the near population had a positive summative score for the self-HRA, which included all four health domains, compared to the far population (p<0.001 for females; p=0.154 for males). In conclusion, this self-HRA of the physical, mental, social, and spiritual health domains reflected the risk perceptions of populations living in the vicinity of the roofing fiber cement factory. This type of tool can bring attention to population concerns and complaints in the factory's surrounding community. Our findings may contribute to the future development of self-HRA for the HIA development procedure in Thailand.
Keywords: cement dust, health impact assessment, risk assessment, walk-through survey
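Group comparisons of the kind reported above (e.g. p=0.032 for physical health) are commonly produced by a two-proportion z-test. A minimal sketch follows; the near/far group sizes (96 and 102) and positive counts are assumptions chosen to match the reported percentages, since the actual split of the 198 respondents is not given.

```python
import math

def two_proportion_z(x1, n1, x2, n2):
    """Pooled two-proportion z statistic for H0: p1 == p2."""
    p1, p2 = x1 / n1, x2 / n2
    p = (x1 + x2) / (n1 + n2)                      # pooled proportion under H0
    se = math.sqrt(p * (1 - p) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

# Assumed split: 49/96 (~51%) near vs. 67/102 (~66%) far reporting physical problems.
z = two_proportion_z(49, 96, 67, 102)
```

A |z| just above 1.96 corresponds to a two-sided p just under 0.05, consistent with the reported p=0.032.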
Procedia PDF Downloads 376
8990 Hybrid Adaptive Modeling to Enhance Robustness of Real-Time Optimization
Authors: Hussain Syed Asad, Richard Kwok Kit Yuen, Gongsheng Huang
Abstract:
Real-time optimization has been considered an effective approach for improving the energy-efficient operation of heating, ventilation, and air-conditioning (HVAC) systems. In model-based real-time optimization, model mismatches cannot be avoided; when they are significant, the performance of the real-time optimization is impaired and the expected energy saving is reduced. In this paper, model mismatches in the real-time optimization of a chiller plant are considered. In such optimization, a simplified semi-physical or grey-box model of the chiller is always used, which must be identified from available operation data. To overcome the mismatches associated with the chiller model, a hybrid Genetic Algorithm (HGA) method is used for online real-time training of the model. The HGA combines a Genetic Algorithm (for global search) with a traditional optimization method (faster and more efficient for local search), avoiding the conventional trial-and-error process of GAs. The identification of the model parameters is cast as an optimization problem, with the objective function being the least-square error between the model output and the actual output of the chiller plant. A case study is used to illustrate the implementation of the proposed method. It is shown that the proposed approach provides reliability in decision making, enhances the robustness of the real-time optimization strategy, and improves energy performance.
Keywords: energy performance, hybrid adaptive modeling, hybrid genetic algorithms, real-time optimization, heating, ventilation, and air-conditioning
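The hybrid idea can be sketched as follows (a hedged toy, not the paper's implementation): a small GA performs the global search for two parameters of a toy grey-box model, and a simple stochastic hill-climb plays the role of the traditional local search, both minimizing the least-square error against synthetic operation data.

```python
import random

random.seed(3)
xs = [0.5 * i for i in range(1, 9)]            # synthetic operating points
true_a, true_b = 2.0, 0.3                      # "plant" parameters to recover
ys = [true_a * x + true_b * x**2 for x in xs]  # toy grey-box model: y = a*x + b*x^2

def sse(ind):
    a, b = ind
    return sum((y - (a * x + b * x**2)) ** 2 for x, y in zip(xs, ys))

def local_search(ind, step=0.05, iters=200):
    # Traditional-style local refinement: accept any small perturbation that improves SSE.
    best = list(ind)
    for _ in range(iters):
        cand = [g + random.uniform(-step, step) for g in best]
        if sse(cand) < sse(best):
            best = cand
    return best

pop = [[random.uniform(0, 5), random.uniform(-1, 1)] for _ in range(30)]
for _ in range(40):                            # GA generations (global search)
    pop.sort(key=sse)
    elite = pop[:10]                           # elitism: best survive unchanged
    children = []
    while len(children) < 20:
        p1, p2 = random.sample(elite, 2)
        child = [random.choice(g) for g in zip(p1, p2)]   # uniform crossover
        if random.random() < 0.3:                          # mutation
            child[random.randrange(2)] += random.gauss(0.0, 0.2)
        children.append(child)
    pop = elite + children

a_hat, b_hat = local_search(min(pop, key=sse))  # polish GA's best with local search
```

In the paper's setting the two stages would be rerun online as new chiller operation data arrive, keeping the grey-box model tracking the real plant.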
Procedia PDF Downloads 417
8989 "A Watched Pot Never Boils." Exploring the Impact of Job Autonomy on Organizational Commitment among New Employees: A Comprehensive Study of How Empowerment and Independence Influence Workplace Loyalty and Engagement in Early Career Stages
Authors: Atnafu Ashenef Wondim
Abstract:
In today's highly competitive business environment, employees are considered a source of competitive advantage. Researchers have looked into the effect of job autonomy on organizational commitment and declared that superior organizational performance depends strongly on the effort and commitment of employees. The purpose of this study was to explore the relationship between job autonomy and organizational commitment from the newcomers' point of view. The mediating role of employee engagement (physical, emotional, and cognitive) was also examined in the case of Ethiopian commercial banks. An exploratory survey research design with a mixed-method approach, including partial least squares structural equation modeling and fuzzy-set qualitative comparative analysis, was used with a sample of 348 new employees. In-depth interviews using purposive and convenience sampling were conducted with new employees (n=43). The results confirmed that job autonomy had positive, significant direct effects on physical engagement, emotional engagement, and cognitive engagement (path coeffs. = 0.874, 0.931, and 0.893). The results also showed that the employee engagement driver, physical engagement, had a positive, significant influence on affective commitment (path coeff. = 0.187) and normative commitment (path coeff. = 0.512) but no significant effect on continuance commitment. Employee engagement partially mediates the relationship between job autonomy and organizational commitment, supporting the indirect effects of job autonomy on affective, continuance, and normative commitment through physical engagement. The findings add new perspectives by positioning the study within a complex African organizational setting and by expanding the job autonomy and organizational commitment literature, which will benefit future research. Much of the existing literature has been produced within well-established organizational business contexts in Western developed countries. The findings provide fresh information on the enablers of job autonomy and organizational commitment that can assist in formulating better policies and strategies for their efficient adoption.
Keywords: employee engagement, job autonomy, organizational commitment, social exchange theory
Procedia PDF Downloads 29
8988 Transfer Function Model-Based Predictive Control for Nuclear Core Power Control in PUSPATI TRIGA Reactor
Authors: Mohd Sabri Minhat, Nurul Adilla Mohd Subha
Abstract:
The 1 MWth PUSPATI TRIGA Reactor (RTP) at the Malaysian Nuclear Agency has been operating for more than 35 years. The existing core power control uses a conventional controller known as the Feedback Control Algorithm (FCA). It is technically challenging to keep the core power output stable and within the acceptable error bands demanded by the safety requirements of the RTP. Currently, the system's power tracking performance could be considered unsatisfactory, and there is significant room for improvement. Hence, a new core power controller design is very important to improve on the current performance in tracking and regulating reactor power by controlling the movement of the control rods, suiting the demands of highly sensitive nuclear reactor power control. In this paper, a Model Predictive Control (MPC) law is applied to control the core power. The model for core power control is based on mathematical models of the reactor core, MPC, and a control rod selection algorithm. The mathematical models of the reactor core comprise a point kinetics model, thermal hydraulic models, and reactivity models. The proposed MPC is presented as a transfer function model of the reactor core derived from perturbation theory. The transfer function model-based predictive control (TFMPC) is developed to design the core power control with predictions based on a T-filter, towards real-time implementation of MPC on hardware. This paper introduces the sensitivity functions for the TFMPC feedback loop to reduce the impact on the input actuation signal and demonstrates the behaviour of TFMPC in terms of disturbance and noise rejection. Comparisons of both tracking and regulating performance between the conventional controller and TFMPC were made using MATLAB and analysed. In conclusion, the proposed TFMPC shows satisfactory performance in tracking and regulating core power for controlling a nuclear reactor with high reliability and safety.
Keywords: core power control, model predictive control, PUSPATI TRIGA reactor, TFMPC
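The point kinetics model underlying the core model can be sketched as follows (a one-delayed-group toy with generic textbook parameter values, not RTP data), integrated with a simple Euler scheme for a small step reactivity insertion.

```python
# One-delayed-group point kinetics:
#   dn/dt = ((rho - beta)/Lambda) * n + lam * C
#   dC/dt = (beta/Lambda) * n - lam * C
# where n is normalized power and C the delayed-neutron precursor concentration.

beta = 0.0065   # delayed neutron fraction (generic assumption)
lam = 0.08      # precursor decay constant, 1/s (generic assumption)
Lam = 1e-4      # neutron generation time, s (generic assumption)

n = 1.0
C = beta * n / (Lam * lam)  # start at equilibrium: dC/dt = 0
rho = 0.1 * beta            # +0.1 dollar step reactivity insertion

dt = 1e-4
for _ in range(int(10.0 / dt)):  # simulate 10 s
    dn = ((rho - beta) / Lam) * n + lam * C
    dC = (beta / Lam) * n - lam * C
    n += dt * dn
    C += dt * dC
```

The power exhibits the classic prompt jump to roughly beta/(beta - rho), then a slow rise on the stable period; it is this fast/slow structure that makes precise rod control, and hence the predictive controller, worthwhile.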
Procedia PDF Downloads 241
8987 Advances in Food Processing Using Extrusion Technology
Authors: Javeed Akhtar, R. K. Pandey, Z. R. Azaz Ahmad Azad
Abstract:
Extruded foods are produced using single- and twin-screw extruders in order to make diverse uses of food materials. Extrusion cooking is a useful and economical tool for the processing of novel foods. This high-temperature, short-time processing technology causes chemical and physical changes that alter the nutritional and physical quality of the product. Extrusion processing of food ingredients characteristically depends on the associated process conditions that influence product quality. The process parameters are optimized for the extrusion of food material in order to obtain the maximum nutritive value by inactivating anti-nutritional factors. Processing conditions such as moisture content, temperature, and time are controlled to avoid overheating or underheating, which would otherwise result in a product of lower nutritional quality.
Keywords: extrusion processing, single and twin extruder, operating conditions of extruders and extruded novel foods, food and agricultural engineering
Procedia PDF Downloads 382