Search results for: data reduction
28214 Ion Beam Polishing of Si in W/Si Multilayer X-Ray Analyzers
Authors: Roman Medvedev, Andrey Yakshin, Konstantin Nikolaev, Sergey Yakunin, Fred Bijkerk
Abstract:
Multilayer structures are used as spectroscopic elements in fluorescence analysis. These serve the purpose of analyzing soft x-ray emission spectra of materials upon excitation by x-rays or electrons. The analysis then allows quantitative determination of the x-ray emitting elements in the materials. The shorter wavelength range for this application, below 2.5 nm, can be covered by using short-period multilayers, with a period of 2.5 nm and lower. Thus, the detrimental effect on reflectivity of morphological roughness between the materials of the multilayers becomes increasingly pronounced. Ion beam polishing was previously shown to be effective in reducing roughness in some multilayer systems with Si. In this work, we explored W/Si multilayers with a period of 2.5 nm. Si layers were polished by Ar ions, employing low energy ions, 100 and 80 eV, with the etched Si thickness being in the range 0.1 to 0.5 nm. CuK X-ray diffuse scattering measurements revealed a significant reduction in the diffuse scattering in the polished multilayers. However, Grazing Incidence CuK X-ray measurements showed only a marginal reduction of the overall roughness of the systems. Still, measurements of the structures with Grazing Incidence Small Angle X-ray Scattering indicated that the vertical correlation length of roughness was strongly reduced in the polished multilayers. Together, these results suggest that polishing reduces the vertical propagation of roughness from layer to layer, while only slightly affecting the overall roughness. This phenomenon can be explained by ion-induced surface roughening inherently present in ion polishing methods. Alternatively, ion-induced densification of thin Si films should also be considered. Finally, a reflectivity of 40% at 0.84 nm at a grazing incidence of 9 degrees has been obtained in this work for W/Si multilayers. Analysis of the obtained results is expected to lead to further progress in reflectance.
Keywords: interface roughness, ion polishing, multilayer structures, W/Si
Procedia PDF Downloads 135
28213 Customer Data Analysis Model Using Business Intelligence Tools in Telecommunication Companies
Authors: Monica Lia
Abstract:
This article presents a customer data analysis model that uses business intelligence tools for data modelling, transformation, data visualization and dynamic report building. The analysis of an economic organization's customers is based on information from the transactional systems of the organization. The paper presents how to develop the data model starting from the data that companies hold inside their own operational systems. The owned data can be transformed into useful information about customers using business intelligence tools. For a mature market, knowing the information inside the data and making forecasts for strategic decisions become more important. Business intelligence tools are used in business organizations as support for decision-making.
Keywords: customer analysis, business intelligence, data warehouse, data mining, decisions, self-service reports, interactive visual analysis, and dynamic dashboards, use cases diagram, process modelling, logical data model, data mart, ETL, star schema, OLAP, data universes
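A rough illustration of the star-schema idea mentioned in the abstract: the sketch below joins a hypothetical transaction fact table to a customer dimension and aggregates it into a simple report. All table and column names are assumptions for illustration, not the model described in the paper.

```python
# Minimal sketch (not from the paper): joining a transaction fact table to a
# customer dimension in a star-schema layout and building a summary report.
import pandas as pd

dim_customer = pd.DataFrame({
    "customer_id": [1, 2, 3],
    "segment": ["prepaid", "postpaid", "postpaid"],
})
fact_usage = pd.DataFrame({
    "customer_id": [1, 1, 2, 3, 3, 3],
    "revenue": [10.0, 12.5, 40.0, 25.0, 30.0, 27.5],
})

# ETL-style step: join the fact table to the dimension and aggregate,
# mimicking what a BI tool would expose as a dynamic report.
report = (
    fact_usage.merge(dim_customer, on="customer_id")
    .groupby("segment")["revenue"]
    .agg(["count", "sum", "mean"])
)
print(report)
```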
Procedia PDF Downloads 434
28212 The Result of Suggestion for Low Energy Diet (1,000-1,200 kcal) in Obese Women to the Effect on Body Weight, Waist Circumference, and BMI
Authors: S. Kumchoo
Abstract:
This study examined the effect of a suggested low energy diet (1,000-1,200 kcal) on body weight, waist circumference and body mass index (BMI) in obese women. A quasi-experimental, one-group pretest-posttest design was used. The aim of the study was the reduction of body weight, waist circumference and BMI by using a low energy diet (1,000-1,200 kcal) in obese women. Fifteen obese women with a BMI ≥ 30 received the low energy diet (1,000-1,200 kcal) for 2 weeks. Data were collected before and after the intervention. The results showed that, on average, body weight decreased by 3.4 kilograms, waist circumference by 6.1 centimeters and BMI by 1.3 kg/m² compared with the values recorded before the experiment started. After the study, the volunteers were healthier and were able to choose suitable food for themselves. The data from this study can be developed further in future research.
Keywords: body weight, waist circumference, low energy diet, BMI
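A minimal numeric illustration of how a weight change maps to a BMI change (BMI = weight/height²); the height used below is an assumption for illustration, not a value from the study.

```python
# Minimal illustration (not the study data): relating a mean weight loss of 3.4 kg
# to a change in BMI = weight / height**2. The height value is hypothetical.
weight_before_kg, weight_after_kg = 85.0, 85.0 - 3.4   # mean reported loss of 3.4 kg
height_m = 1.58                                        # assumed height

bmi_before = weight_before_kg / height_m ** 2
bmi_after = weight_after_kg / height_m ** 2
print(f"BMI change: {bmi_before - bmi_after:.1f} kg/m^2")  # about 1.4 kg/m^2 at this height
```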
Procedia PDF Downloads 388
28211 Flow Duration Curves and Recession Curves Connection through a Mathematical Link
Authors: Elena Carcano, Mirzi Betasolo
Abstract:
This study helps Public Water Bureaus give reliable answers to water concession requests. Rapidly increasing water requests can be supported provided that further uses of a river course are not totally compromised and environmental features are protected as well. Strictly speaking, a water concession can be considered a continuous drawing from the source and causes a mean annual streamflow reduction. Therefore, deciding whether a water concession is appropriate or inappropriate seems easily solved by comparing the generic demand to the mean annual streamflow value at disposal. Still, the immediate shortcoming of such a comparison is that streamflow data are available only for a few catchments and, most often, limited to specific sites. Furthermore, comparing the generic water demand to the mean daily discharge is far from completely satisfactory, since the mean daily streamflow is greater than the water withdrawal for a long period of the year. Consequently, such a comparison appears to be of little significance for preserving the quality and the quantity of the river. In order to overcome this limit, this study aims to complete the information provided by flow duration curves by introducing a link between Flow Duration Curves (FDCs) and recession curves, and aims to show the chronological sequence of flows with a particular focus on low flow data. The analysis is carried out on 25 catchments located in North-Eastern Italy for which daily data are provided. The results identify groups of catchments as hydrologically homogeneous, having the lower part of the FDCs (the streamflow interval bounded by Q(300) and Q(335)) smoothly reproduced by a common recession curve. In conclusion, the results are useful to provide more reliable answers to water requests, especially for those catchments which show a similar hydrological response, and can be used for a focused regionalization approach on low flow data. A mathematical link between flow duration curves and recession curves is herein provided, thus furnishing flow duration curve information with a temporal sequence of data. In such a way, by introducing assumptions on recession curves, the chronological sequence of low flow data can also be attributed to FDCs, which are known to lack this information by nature.
Keywords: chronological sequence of discharges, recession curves, streamflow duration curves, water concession
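A minimal sketch (not the authors' code) of how an empirical flow duration curve is built from a daily discharge series and how the low-flow quantiles Q(300) and Q(335) can be read from it; Q(d) is interpreted here as the flow equalled or exceeded on d days per year, and the synthetic series stands in for the observed data.

```python
# Minimal sketch: empirical FDC from one year of daily discharges and the
# low-flow quantiles Q(300), Q(335). The discharge series is synthetic.
import numpy as np

rng = np.random.default_rng(0)
q_daily = rng.lognormal(mean=1.0, sigma=0.8, size=365)   # hypothetical discharges (m^3/s)

q_sorted = np.sort(q_daily)[::-1]                        # descending flows
duration_days = np.arange(1, len(q_sorted) + 1)          # days of exceedance

def q_at_duration(days):
    """Flow equalled or exceeded on the given number of days per year."""
    return np.interp(days, duration_days, q_sorted)

print("Q(300) =", round(q_at_duration(300), 3))
print("Q(335) =", round(q_at_duration(335), 3))
```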
Procedia PDF Downloads 189
28210 Multivariate Data Analysis for Automatic Atrial Fibrillation Detection
Authors: Zouhair Haddi, Stephane Delliaux, Jean-Francois Pons, Ismail Kechaf, Jean-Claude De Haro, Mustapha Ouladsine
Abstract:
Atrial fibrillation (AF) is considered the most common cardiac arrhythmia and a major public health burden associated with significant morbidity and mortality. Nowadays, telemedical approaches targeting cardiac outpatients place AF among the most challenging medical issues. Automatic, early, and fast AF detection is still a major concern for healthcare professionals. Several algorithms based on univariate analysis have been developed to detect atrial fibrillation. However, the published results do not show satisfactory classification accuracy. This work aimed at resolving this shortcoming by proposing multivariate data analysis methods for automatic AF detection. Four publicly accessible sets of clinical data (AF Termination Challenge Database, MIT-BIH AF, Normal Sinus Rhythm RR Interval Database, and MIT-BIH Normal Sinus Rhythm Databases) were used for assessment. All time series were segmented into 1-min windows of RR intervals, and four specific features were then calculated. Two pattern recognition methods, i.e., Principal Component Analysis (PCA) and a Learning Vector Quantization (LVQ) neural network, were used to develop classification models. PCA, as a feature reduction method, was employed to find important features to discriminate between AF and Normal Sinus Rhythm. Despite its very simple structure, the results show that the LVQ model performs better on the analyzed databases than existing algorithms, with high sensitivity and specificity (99.19% and 99.39%, respectively). The proposed AF detection holds several interesting properties and can be implemented with just a few arithmetical operations, which makes it a suitable choice for telecare applications.
Keywords: atrial fibrillation, multivariate data analysis, automatic detection, telemedicine
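A minimal sketch of the multivariate pipeline described above: PCA on per-window RR-interval features followed by a prototype-based classifier. scikit-learn offers no LVQ, so NearestCentroid is used here as a stand-in, and both the four features and the data are synthetic assumptions rather than the authors' setup.

```python
# Minimal sketch: feature reduction with PCA, then a prototype-based classifier.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.neighbors import NearestCentroid
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
# Four hypothetical features per 1-min window (e.g. mean RR, SDNN, RMSSD, pNN50)
normal = rng.normal(loc=[0.85, 0.05, 0.04, 0.10], scale=0.02, size=(200, 4))
af = rng.normal(loc=[0.70, 0.15, 0.18, 0.45], scale=0.05, size=(200, 4))
X = np.vstack([normal, af])
y = np.array([0] * 200 + [1] * 200)            # 0 = normal sinus rhythm, 1 = AF

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
model = make_pipeline(PCA(n_components=2), NearestCentroid())
model.fit(X_tr, y_tr)
print("test accuracy:", model.score(X_te, y_te))
```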
Procedia PDF Downloads 269
28209 Composite Distributed Generation and Transmission Expansion Planning Considering Security
Authors: Amir Lotfi, Seyed Hamid Hosseini
Abstract:
In the recent past, due to the increase in electrical energy demand and governmental resource constraints in creating additional capacity in generation, transmission, and distribution, privatization and restructuring of the electrical industry have been considered. Thus, in most countries, different parts of the electrical industry, such as generation, transmission, and distribution, have been separated in order to create competition. Considering these changes, environmental issues, energy growth, the investment of private equity in energy generation units, and the difficulties of transmission line expansion, distributed generation (DG) units have been used in power systems. Moreover, the reduced need for transmission and distribution, increased reliability, improved power quality, and reduced power losses have led to DG being placed in power systems. On the other hand, given the low liquidity requirement, private investors tend to invest in DGs. In this project, the main goal is to offer an algorithm for planning and placing DGs in order to reduce the need for the transmission and distribution network.
Keywords: planning, transmission, distributed generation, power security, power systems
Procedia PDF Downloads 481
28208 Investigation of the Cathodic Behavior of AA2024-T3 in Neutral Medium
Authors: Nisrine Benzbiria, Mohammed Azzi, Mustapha Zertoubi
Abstract:
The 2XXX series of aluminum alloys is widely employed in several applications, such as the beverage, automotive, and aerospace industries. However, these alloys are particularly prone to localized corrosion, such as pitting, often induced by the difference in corrosion potential measured between intermetallic phases and the pure metal. The galvanic cells comprising Al–Cu–Mn–Fe intermetallic phases control the dissolution rate cathodically, as oxygen reduction reaction kinetics are favoured on the Al–Cu–Mn–Fe particles. Hence, the properties of the cathodic sites and the processes involved must be understood. Our interest is to outline the cathodic behavior of AA2024-T3 in sodium sulfate solution using electrochemical techniques. The oxygen reduction reaction (ORR) was investigated in the mixed charge-transfer and mass-transport regime using the Koutecky-Levich approach. An environmentally benign inhibitor was considered to slow the ORR on the Cu-rich cathodic phases. The surface morphology of the electrodes was investigated with SEM/EDS and AFM. The obtained results are discussed accordingly.
Keywords: AA2024-T3, neutral medium, ORR kinetics, Koutecky-Levich, DFT
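A minimal sketch of the Koutecky-Levich treatment mentioned above, assuming a set of ORR currents measured at several rotation rates (the numbers are synthetic, not the authors' data): since 1/i = 1/i_k + 1/(B·ω^1/2), a linear fit of 1/i versus ω^(-1/2) yields the kinetic current from the intercept.

```python
# Minimal sketch: Koutecky-Levich analysis of currents at several rotation rates.
import numpy as np

omega_rpm = np.array([400, 900, 1600, 2500])   # rotation rates
i_meas = np.array([0.82, 1.05, 1.21, 1.32])    # measured currents (mA), hypothetical

omega_rad = omega_rpm * 2 * np.pi / 60.0
x = 1.0 / np.sqrt(omega_rad)                   # Levich variable, 1/sqrt(omega)
y = 1.0 / i_meas

slope, intercept = np.polyfit(x, y, 1)
i_kinetic = 1.0 / intercept                    # intercept = 1/i_k
print(f"kinetic current i_k ~ {i_kinetic:.2f} mA")
```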
Procedia PDF Downloads 53
28207 Language Errors Used in “The Space between Us” Movie and Their Effects on Translation Quality: Translation Study toward Discourse Analysis Approach
Authors: Mochamad Nuruz Zaman, Mangatur Rudolf Nababan, M. A. Djatmika
Abstract:
Both society and education teach people to communicate well in order to build up interpersonal skills. Everyone has the capacity to understand something new, either with good comprehension or with poor understanding. Poor understanding produces language errors when people interact for the first time and do not know each other beforehand because of the distance between them. The movie “The Space between Us” tells the love-adventure story of a boy from Mars and a girl from Earth. There are many misunderstood conversations because of the different climates and environments. Moviegoers must also focus on the subtitles in order to enjoy the movie fully. Furthermore, the Indonesian subtitles and the English dialogue in the movie still show overlapping understanding in the translation. Translation here consists of the source language -SL- (English dialogue) and the target language -TL- (Indonesian subtitles). This research gap is formulated in the research question of how the language errors occur in the movie and what their effects on translation quality are, analyzed in depth through a translation study with a discourse analysis approach. The research goal is to describe the language errors and their translation quality in order to create a good atmosphere in movie media. The study is embedded research in a qualitative design. The research locations consist of setting, participant, and event as the focused, determined boundary. The sources of data are the movie “The Space between Us” and informants (translation quality raters). The sampling is criterion-based sampling (purposive sampling). Data collection techniques use content analysis and questionnaires. Data validation applies data source and method triangulation. Data analysis delivers domain, taxonomy, componential, and cultural theme analysis. The language errors found in the movie are referential, register, society, textual, receptive, expressive, individual, group, analogical, transfer, local, and global errors. Their effects on translation quality are discussed in relation to the translation techniques identified in the data: amplification, borrowing, description, discursive creation, established equivalent, generalization, literal translation, modulation, particularization, reduction, substitution, and transposition.
Keywords: discourse analysis, language errors, The Space between Us movie, translation techniques, translation quality instruments
Procedia PDF Downloads 219
28206 Modelling Sudden Deaths from Myocardial Infarction and Stroke
Authors: Y. S. Yusoff, G. Streftaris, H. R Waters
Abstract:
Death within 30 days is an important factor to be looked into, as there is a significant risk of death immediately following, or soon after, a Myocardial Infarction (MI) or stroke. In this paper, we model deaths within 30 days following a Myocardial Infarction (MI) or stroke in the UK. We examine how the probabilities of sudden death from MI or stroke changed over the period 1981-2000. We model the sudden deaths using a Generalized Linear Model (GLM), fitted using the R statistical package, under a Binomial distribution for the number of sudden deaths. We parameterize our model using the extensive and detailed data from the Framingham Heart Study, adjusted to match UK rates. The results show that there is a reduction in sudden deaths following an MI over time but no significant improvement for sudden deaths following a stroke.
Keywords: sudden deaths, myocardial infarction, stroke, ischemic heart disease
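The abstract notes that the GLM was fitted in R; the sketch below reproduces the same idea in Python with statsmodels, a Binomial GLM for 30-day deaths out of yearly events with calendar year as a covariate. All counts are synthetic placeholders, not Framingham or UK data.

```python
# Minimal sketch: Binomial GLM for 30-day deaths, with year as the covariate.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

data = pd.DataFrame({
    "year": np.arange(1981, 2001),
    "events": np.full(20, 500),                        # MI events per year (hypothetical)
    "deaths": np.linspace(150, 110, 20).astype(int),   # 30-day deaths (hypothetical)
})
data["survived"] = data["events"] - data["deaths"]

# "deaths + survived ~ year" is the two-column (successes, failures) response form
model = smf.glm("deaths + survived ~ year", data=data,
                family=sm.families.Binomial()).fit()
print(model.summary())
```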
Procedia PDF Downloads 289
28205 Optimization of the Administration of Intravenous Medication by Reduction of the Residual Volume, Taking User-Friendliness, Cost Efficiency, and Safety into Account
Authors: A. Poukens, I. Sluyts, A. Krings, J. Swartenbroekx, D. Geeroms, J. Poukens
Abstract:
Introduction and Objectives: It has been known for many years that, with the administration of intravenous medication, a rather significant part of the infusion solution planned to be administered, the residual volume (the volume that remains in the IV line and/or infusion bag), does not reach the patient and is wasted. This could possibly result in underdosage and a diminished therapeutic effect. Despite the important impact on the patient, the reduction of the residual volume lacks attention. An optimized and clearly stated protocol concerning the reduction of the residual volume in an IV line is necessary for each hospital. As described in my Master's thesis for the degree of Master in Hospital Pharmacy, the administration of intravenous medication can be optimized by reduction of the residual volume. Effectiveness, user-friendliness, cost efficiency and safety were hereby taken into account. Material and Methods: By means of a literature study and an online questionnaire sent out to all Flemish hospitals and hospitals in the Netherlands (province of Limburg), current flush methods could be mapped out. In laboratory research, possible flush methods aiming to reduce the residual volume were measured. Furthermore, a self-developed experimental method to reduce the residual volume was added to the study. The current flush methods and the self-developed experimental method were compared to each other based on cost efficiency, user-friendliness and safety. Results: There is a major difference between the Flemish hospitals and the hospitals in the Netherlands (province of Limburg) concerning the approach and method of flushing IV lines after administration of intravenous medication. The residual volumes were measured, and laboratory research showed that if flushing was done with at least one time the equivalent of the residual volume, 95 percent of the glucose was flushed through. Based on the comparison, it became clear that flushing by use of a pre-filled syringe would be the most cost-efficient, user-friendly and safest method. According to the laboratory research, the self-developed experimental method is feasible and has the advantage that the remaining fraction of the medication can be administered to the patient in unchanged concentration without dilution. Furthermore, this technique can be applied regardless of the level of the residual volume. Conclusion and Recommendations: It is advisable to revise the current infusion systems and flushing methods in most hospitals. Aside from education of the hospital staff and alignment on a uniform, substantiated protocol, an optimized and clear policy on the reduction of the residual volume is necessary for each hospital. It is recommended to flush all IV lines with rinsing fluid of at least a volume equivalent to the residual volume. Further laboratory and clinical research on the self-developed experimental method is needed before this method can be implemented clinically in a broader setting.
Keywords: intravenous medication, infusion therapy, IV flushing, residual volume
Procedia PDF Downloads 136
28204 The Linkage of Urban and Energy Planning for Sustainable Cities: The Case of Denmark and Germany
Authors: Jens-Phillip Petersen
Abstract:
The reduction of GHG emissions in buildings is a focus area of national energy policies in Europe, because buildings are responsible for a major share of final energy consumption. It is at the local scale that policies to increase the share of renewable energies and energy efficiency measures get implemented. Municipalities, as local authorities and the entity responsible for land-use planning, have a direct influence on urban patterns and energy use, which makes them key actors in the transition towards sustainable cities. Hence, synchronizing urban planning with energy planning offers great potential to increase society's energy efficiency; this is highly significant for reaching GHG-reduction targets. In this paper, the actual linkage of urban planning and energy planning in Denmark and Germany was assessed; substantive barriers preventing their integration and driving factors that lead to successful transitions towards holistic urban energy planning procedures were identified.
Keywords: energy planning, urban planning, renewable energies, sustainable cities
Procedia PDF Downloads 353
28203 Inflation and Unemployment Rates as Indicators of the Transition European Union Countries Monetary Policy Orientation
Authors: Elza Jurun, Damir Piplica, Tea Poklepović
Abstract:
Numerous studies carried out in the developed western democratic countries have shown that the ideological framework of the governing party has a significant influence on monetary policy. An executive authority consisting of a left-wing party gives a higher weight to unemployment suppression, and the central bank implements a more expansionary monetary policy. On the other hand, a right-wing governing party considers monetary stability to be more important than unemployment suppression, and in such a political framework the main macroeconomic objective becomes the reduction of the inflation rate. The political framework conditions in the transition countries which are new European Union (EU) members are still highly specific in relation to the other EU member countries. The focus of this paper is the question of whether the same monetary policy principles that apply in developed western democratic EU member countries are also valid in these transition countries. The database consists of the inflation rate and unemployment rate for 11 transition EU member countries, covering the period from 2001 to 2012. The essential information for each of these 11 countries and for each year of the observed period is the right or left political orientation of the ruling party. In this paper, we use t-statistics to test our hypothesis that there are differences in inflation and unemployment between right and left political orientations of the governing party. To explore the influence of different countries, years and political orientations, descriptive statistics are used. Inflation and unemployment should be strongly negatively correlated through time, which is tested using the Pearson correlation coefficient. Depending on whether the governing authority consists of left- or right-oriented parties, the monetary authorities will adjust their policy, setting a higher priority on lower inflation or on unemployment reduction.
Keywords: inflation rate, monetary policy orientation, transition EU countries, unemployment rate
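A minimal sketch of the two statistics named above, a t-test comparing inflation under left- and right-oriented governments and a Pearson correlation between inflation and unemployment; the numbers are synthetic stand-ins for the 2001-2012 country panel.

```python
# Minimal sketch: t-test between two government orientations and Pearson correlation.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
infl_left = rng.normal(4.5, 1.5, size=40)    # country-years under left-oriented governments
infl_right = rng.normal(3.0, 1.5, size=40)   # country-years under right-oriented governments

t_stat, p_val = stats.ttest_ind(infl_left, infl_right)
print(f"t = {t_stat:.2f}, p = {p_val:.4f}")

inflation = np.concatenate([infl_left, infl_right])
unemployment = 12.0 - 0.8 * inflation + rng.normal(0, 1.0, size=inflation.size)
r, p_r = stats.pearsonr(inflation, unemployment)
print(f"Pearson r = {r:.2f}, p = {p_r:.4f}")
```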
Procedia PDF Downloads 443
28202 A Reduced Ablation Model for Laser Cutting and Laser Drilling
Authors: Torsten Hermanns, Thoufik Al Khawli, Wolfgang Schulz
Abstract:
In laser cutting, as well as in long-pulsed laser drilling of metals, it can be demonstrated that the ablation shape (the shape of the cut faces or, respectively, the hole shape) that is formed approaches a so-called asymptotic shape, such that it changes only slightly or not at all with further irradiation. These findings are already known from the ultrashort pulse (USP) ablation of dielectric and semiconducting materials. The explanation for the occurrence of an asymptotic shape in laser cutting and long-pulse drilling of metals is identified, and its underlying mechanism is numerically implemented, tested and clearly confirmed by comparison with experimental data. In detail, there now is a model that allows the simulation of the temporal (pulse-resolved) evolution of the hole shape in laser drilling as well as the final (asymptotic) shape of the cut faces in laser cutting. This simulation requires far fewer resources, such that it can even run on common desktop PCs or laptops. Individual parameters can be adjusted using sliders – the simulation result appears in an adjacent window and changes in real time. This is made possible by an application-specific reduction of the underlying ablation model. Because this reduction dramatically decreases the complexity of the calculation, it produces a result much more quickly. This means that the simulation can be carried out directly at the laser machine. Time-intensive experiments can be reduced and set-up processes can be completed much faster. The high speed of simulation also opens up a range of entirely different options, such as metamodeling. Suitable for complex applications with many parameters, metamodeling involves generating high-dimensional data sets with the parameters and several evaluation criteria for process and product quality. These sets can then be used to create individual process maps that show the dependency of individual parameter pairs. This advanced simulation makes it possible to find global and local extreme values through mathematical manipulation. Such simultaneous optimization of multiple parameters is scarcely possible by experimental means. This means that new methods in manufacturing, such as self-optimization, can be executed much faster. However, the software's potential does not stop there; time-intensive calculations exist in many areas of industry. In laser welding or laser additive manufacturing, for example, the simulation of thermally induced residual stresses still uses up considerable computing capacity or is not even possible. Transferring the principle of reduced models promises substantial savings there, too.
Keywords: asymptotic ablation shape, interactive process simulation, laser drilling, laser cutting, metamodeling, reduced modeling
Procedia PDF Downloads 216
28201 Vibration Based Damage Detection and Stiffness Reduction of Bridges: Experimental Study on a Small Scale Concrete Bridge
Authors: Mirco Tarozzi, Giacomo Pignagnoli, Andrea Benedetti
Abstract:
Structural systems are often subjected to degradation processes due to different kinds of phenomena, like unexpected loadings, ageing of the materials and fatigue cycles. This is especially true for bridges, whose safety evaluation is crucial for the purpose of planning maintenance. This paper discusses the experimental evaluation of the stiffness reduction from frequency changes due to a uniform damage scenario. For this purpose, a 1:4 scaled bridge has been built in the laboratory of the University of Bologna. It is made of concrete, and its cross section is composed of a slab linked to four beams. The concrete deck is 6 m long and 3 m wide, and its natural frequencies have been identified dynamically by exciting it with an impact hammer, a dropping weight, or by walking on it randomly. After that, a set of loading cycles has been applied to the bridge in order to produce a uniformly distributed crack pattern. During the loading phase, both the cracking moment and the yielding moment were reached. In order to define the relationship between frequency variation and loss in stiffness, the natural frequencies of the bridge were identified before and after the occurrence of the damage corresponding to each load step. The behavior of breathing cracks and its effect on the natural frequencies has been taken into account in the analytical calculations. By using an exponential function derived from a large number of experimental tests in the literature, it has been possible to predict the stiffness reduction from the frequency variation measurements. During the load tests, crack opening and midspan vertical displacement were also monitored.
Keywords: concrete bridge, damage detection, dynamic test, frequency shifts, operational modal analysis
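As a first-order illustration of the frequency-stiffness relation exploited above (not the exponential calibration used by the authors): if the mass is unchanged, natural frequencies scale with the square root of stiffness, so a global stiffness ratio can be estimated from measured frequency shifts. The frequencies below are hypothetical.

```python
# Minimal sketch: k_damaged / k_intact = (f_damaged / f_intact) ** 2 for unchanged mass.
f_intact_hz = [12.4, 38.1, 55.6]    # identified natural frequencies before loading (assumed)
f_damaged_hz = [11.3, 35.0, 52.0]   # identified frequencies after a load step (assumed)

for f0, fd in zip(f_intact_hz, f_damaged_hz):
    stiffness_ratio = (fd / f0) ** 2
    print(f"mode at {f0:.1f} Hz: stiffness reduced to {100 * stiffness_ratio:.1f}% of intact value")
```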
Procedia PDF Downloads 185
28200 Opening up Government Datasets for Big Data Analysis to Support Policy Decisions
Authors: K. Hardy, A. Maurushat
Abstract:
Policy makers are increasingly looking to make evidence-based decisions. Evidence-based decisions have historically relied on the rigorous methodologies of empirical studies by research institutes, as well as on less reliable immediate surveys/polls, often with limited sample sizes. As we move into the era of Big Data analytics, policy makers are looking to different methodologies to deliver reliable empirics in real time. The question is not why people did this for the last 10 years, but why they are doing it now, whether this is undesirable, and how we can have an impact to promote change immediately. Big data analytics rely heavily on government data that has been released into the public domain. The open data movement promises greater productivity and more efficient delivery of services; however, Australian government agencies remain reluctant to release their data to the general public. This paper considers the barriers to releasing government data as open data, and how these barriers might be overcome.
Keywords: big data, open data, productivity, data governance
Procedia PDF Downloads 372
28199 Model Order Reduction of Complex Airframes Using Component Mode Synthesis for Dynamic Aeroelasticity Load Analysis
Authors: Paul V. Thomas, Mostafa S. A. Elsayed, Denis Walch
Abstract:
Airframe structural optimization at different design stages results in new mass and stiffness distributions which modify the critical design loads envelope. Determination of aircraft critical loads is an extensive analysis procedure which involves simulating the aircraft at thousands of load cases as defined in the certification requirements. It is computationally prohibitive to use a Global Finite Element Model (GFEM) for the load analysis; hence reduced order structural models are required which closely represent the dynamic characteristics of the GFEM. This paper presents the implementation of the Component Mode Synthesis (CMS) method for the generation of high-fidelity Reduced Order Models (ROMs) of complex airframes. Here, a sub-structuring technique is used to divide the complex higher-order airframe dynamical system into a set of subsystems. Each subsystem is reduced to fewer degrees of freedom using matrix projection onto a carefully chosen reduced order basis subspace. The reduced structural matrices are assembled for all the subsystems through interface coupling, and the dynamic response of the total system is solved. The CMS method is employed to develop the ROM of a Bombardier Aerospace business jet, which is coupled with an aerodynamic model for dynamic aeroelasticity load analysis under gust turbulence. Another set of dynamic aeroelastic loads is also generated employing a stick model of the same aircraft. The stick model is the reduced order modelling methodology commonly used in the aerospace industry, based on stiffness generation by unitary loading application. The extracted aeroelastic loads from both models are compared against those generated employing the GFEM. Critical loads, modal participation factors, and modal characteristics of the different ROMs are investigated and compared against those of the GFEM. Results obtained show that the ROM generated using the Craig-Bampton CMS reduction process has superior dynamic characteristics compared to the stick model.
Keywords: component mode synthesis, craig bampton reduction method, dynamic aeroelasticity analysis, model order reduction
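A minimal numerical sketch of the Craig-Bampton reduction idea on a toy substructure (a 6-DOF spring-mass chain), not the airframe GFEM workflow: the interior DOFs are condensed onto static constraint modes plus a truncated set of fixed-interface normal modes, and the stiffness and mass matrices are projected onto that basis.

```python
# Minimal sketch: Craig-Bampton reduction of a 6-DOF spring-mass chain whose two
# end DOFs are kept as boundary DOFs and whose interior is condensed to 2 modes.
import numpy as np
from scipy.linalg import eigh

n, k, m = 6, 1.0e5, 2.0
K = 2 * k * np.eye(n)
K[0, 0] = K[-1, -1] = k
for j in range(n - 1):
    K[j, j + 1] = K[j + 1, j] = -k
M = m * np.eye(n)

bnd = [0, n - 1]                # boundary (interface) DOFs
intr = list(range(1, n - 1))    # interior DOFs
Kib = K[np.ix_(intr, bnd)]
Kii = K[np.ix_(intr, intr)]
Mii = M[np.ix_(intr, intr)]

# Static constraint modes and a truncated set of fixed-interface normal modes
Psi = -np.linalg.solve(Kii, Kib)
w2, Phi = eigh(Kii, Mii)
Phi_kept = Phi[:, :2]

# Craig-Bampton transformation: [u_b; u_i] = T @ [u_b; q_modal]
T = np.block([[np.eye(len(bnd)), np.zeros((len(bnd), 2))],
              [Psi, Phi_kept]])

order = bnd + intr
K_red = T.T @ K[np.ix_(order, order)] @ T
M_red = T.T @ M[np.ix_(order, order)] @ T
print("reduced model size:", K_red.shape)   # (4, 4) instead of (6, 6)
```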
Procedia PDF Downloads 210
28198 The Result of Suggestion for Low Energy Diet (1,000 kcal-1,200 kcal) in Obese Women to the effect on Body Weight, Waist Circumference, and BMI
Authors: S. Kumchoo
Abstract:
This study examined the effect of a suggested low energy diet (1,000-1,200 kcal) on body weight, waist circumference and body mass index (BMI) in obese women. A quasi-experimental, one-group pretest-posttest design was used. The aim of the study was the reduction of body weight, waist circumference and BMI by using a low energy diet (1,000-1,200 kcal) in obese women. Fifteen obese women with a BMI ≥ 30 received the low energy diet (1,000-1,200 kcal) for 2 weeks. Data were collected before and after the intervention. The results showed that, on average, body weight decreased by 3.4 kilograms, waist circumference by 6.1 centimeters and BMI by 1.3 kg/m² compared with the values recorded before the experiment started. After the study, the volunteers were healthier and were able to choose suitable food for themselves. The data from this study can be developed further in future research.
Keywords: body weight, waist circumference, BMI, low energy diet
Procedia PDF Downloads 456
28197 Homoleptic Complexes of a Tetraphenylporphyrinatozinc(II)-conjugated 2,2':6',6"-Terpyridine
Authors: Angelo Lanzilotto, Martin Kuss-Petermann, Catherine E. Housecroft, Edwin C. Constable, Oliver S. Wenger
Abstract:
We recently described the synthesis of a new tetraphenylporphyrinatozinc(II)-conjugated 2,2':6',6"-terpyridine (1) in which the tpy domain enables the molecule to act as a metalloligand. The synthetic route to 1 has been optimized; the importance of selecting a particular sequence of synthetic steps will be discussed. Three homoleptic complexes have been prepared, [Zn(1)₂]²⁺, [Fe(1)₂]²⁺ and [Ru(1)₂]²⁺, and have been isolated as the hexafluoridophosphate salts. Spectroelectrochemical measurements have been performed, and the spectral changes ascribed to redox processes are partitioned onto either the porphyrin or the terpyridine units. Compound 1 undergoes a reversible one-electron oxidation/reduction. The removal/gain of a second electron leads to a further irreversible chemical transformation. For the homoleptic [M(1)₂]²⁺ complexes, a suitable potential can be chosen at which both the oxidation and the reduction of the {ZnTPP} core are reversible. When the homoleptic complex contains a redox-active metal such as Fe or Ru, spectroelectrochemistry has been used to investigate the metal-to-ligand charge transfer (MLCT) transition. The latter is sensitive to the oxidation state of the metal, and electrochemical oxidation of the metal center suppresses it. Detailed spectroelectrochemical studies will be presented.
Keywords: homoleptic complexes, spectroelectrochemistry, tetraphenylporphyrinatozinc(II), 2,2':6',6"-terpyridine
Procedia PDF Downloads 221
28196 Anticandidal and Antibacterial Silver and Silver(Core)-Gold(Shell) Bimetallic Nanoparticles by Fusarium graminearum
Authors: Dipali Nagaonkar, Mahendra Rai
Abstract:
Nanotechnology has experienced significant developments in engineered nanomaterials with a core-shell arrangement. Nanomaterials having nanolayers of silver and gold are of primary interest due to their wide applications in the catalytic and biomedical fields. Further, the mycosynthesis of nanoparticles has proved to be a sustainable synthetic approach in nanobiotechnology. In this context, we have synthesized silver and silver(core)-gold(shell) bimetallic nanoparticles by sequential reduction using a fungal extract of Fusarium graminearum. The core-shell deposition of the nanoparticles was confirmed by the red shift of the surface plasmon resonance from 434 nm to 530 nm with the aid of a UV-Visible spectrophotometer. The mean particle sizes of the Ag and Ag-Au nanoparticles were confirmed by nanoparticle tracking analysis as 37 nm and 50 nm, respectively. Quite polydisperse and spherical nanoparticles are evident from TEM analysis. These mycosynthesized bimetallic nanoparticles were tested against some pathogenic bacteria and Candida sp. The antimicrobial analysis confirmed the enhanced anticandidal and antibacterial potential of the bimetallic nanoparticles over their monometallic counterparts.
Keywords: bimetallic nanoparticles, core-shell arrangement, mycosynthesis, sequential reduction
Procedia PDF Downloads 574
28195 A Review on Existing Challenges of Data Mining and Future Research Perspectives
Authors: Hema Bhardwaj, D. Srinivasa Rao
Abstract:
Technology for analysing, processing, and extracting meaningful data from enormous and complicated datasets can be termed "big data." The techniques of big data mining and big data analysis are extremely helpful for business activities such as making decisions, building organisational plans, researching the market efficiently, improving sales, etc., because typical management tools cannot handle such complicated datasets. Big data brings special computational and statistical issues, such as measurement errors, noise accumulation, spurious correlation, and storage and scalability limitations. These unique problems call for new computational and statistical paradigms. This research paper offers an overview of the literature on big data mining and its process, along with problems and difficulties, with a focus on the unique characteristics of big data. Organizations face several difficulties when undertaking data mining, which has an impact on their decision-making. Every day, terabytes of data are produced, yet only around 1% of that data is actually analyzed. This study presents the ideas of data mining and analysis and of knowledge discovery techniques that have recently been developed, together with practical application systems. The article's conclusion also includes a list of issues and difficulties for further research in the area. The report discusses management's main big data and data mining challenges.
Keywords: big data, data mining, data analysis, knowledge discovery techniques, data mining challenges
Procedia PDF Downloads 110
28194 A Systematic Review on Challenges in Big Data Environment
Authors: Rimmy Yadav, Anmol Preet Kaur
Abstract:
Big Data has demonstrated vast potential in streamlining operations, supporting decision-making, and spotting business trends in different fields, for example, manufacturing, finance, and information technology. This paper gives a multi-disciplinary overview of the research issues in big data and of its procedures, tools, and systems related to privacy, data storage management, network and energy utilization, fault tolerance, and data representation. Beyond this, the challenges and opportunities available in the Big Data platform are presented.
Keywords: big data, privacy, data management, network and energy consumption
Procedia PDF Downloads 313
28193 Evaluation of Nanoparticle Application to Control Formation Damage in Porous Media: Laboratory and Mathematical Modelling
Authors: Gabriel Malgaresi, Sara Borazjani, Hadi Madani, Pavel Bedrikovetsky
Abstract:
Suspension-colloidal flow in porous media occurs in numerous engineering fields, such as industrial water treatment, the disposal of industrial wastes into aquifers with the propagation of contaminants, and low salinity water injection into petroleum reservoirs. The main effects are particle mobilization and capture by the porous rock, which can cause pore plugging and permeability reduction, known as formation damage. Various factors such as fluid salinity, pH, temperature, and rock properties affect particle detachment. Formation damage is particularly unfavorable near injection and production wells. One way to control formation damage is pre-treatment of the rock with nanoparticles. Adsorption of nanoparticles on fines and rock surfaces alters the zeta-potential of the surfaces and enhances the attachment force between the rock and fine particles. The main objective of this study is to develop a two-stage mathematical model for (1) flow and adsorption of nanoparticles on the rock in the pre-treatment stage and (2) fines migration and permeability reduction during the water production after the pre-treatment. The model accounts for adsorption and desorption of nanoparticles, fines migration, and the kinetics of particle capture. The system of equations allows for an exact solution. The non-self-similar wave-interaction problem was solved by the Method of Characteristics. The analytical model is new in two ways: first, it accounts for the specific boundary and initial conditions describing the injection of nanoparticles and production from the pre-treated porous media; second, it contains the effect of nanoparticle sorption hysteresis. The derived analytical model contains explicit formulae for the concentration fronts along with the pressure drop. The solution is used to determine the optimal injection concentration of nanoparticles to avoid formation damage. The mathematical model was validated via an innovative laboratory program. The laboratory study includes two sets of core-flood experiments: (1) production of water without nanoparticle pre-treatment; (2) pre-treatment of a similar core with nanoparticles followed by water production. Positively charged alumina nanoparticles with an average particle size of 100 nm were used for the rock pre-treatment. The core was saturated with the nanoparticles and then flushed with low salinity water; the pressure drop across the core and the outlet fine concentration were monitored and used for model validation. The results of the analytical modeling showed a significant reduction in the fine outlet concentration and formation damage. This observation was in great agreement with the core-flood data. The exact solution accurately describes fine particle breakthroughs and evaluates the positive effect of nanoparticles on formation damage. We show that the adsorbed concentration of nanoparticles strongly affects the permeability of the porous media. For the laboratory case presented, the reduction of permeability after 1 PVI of production in the pre-treated scenario is 50% lower than in the reference case. The main outcome of this study is to provide a validated mathematical model to evaluate the effect of nanoparticles on formation damage.
Keywords: nano-particles, formation damage, permeability, fines migration
Procedia PDF Downloads 623
28192 Audit of Intraoperative Ventilation Strategy in Prolonged Abdominal Surgery
Authors: Prabir Patel, Eugene Ming Han Lim
Abstract:
Introduction: Current literature shows that postoperative pulmonary complications following abdominal surgery may be reduced by using lower-than-conventional tidal volumes intraoperatively together with moderate levels of positive end expiratory pressure (PEEP). Recent studies demonstrated a significant reduction in major complications in elective abdominal surgery through the use of lower tidal volumes (6-8 ml/kg predicted body weight), PEEP of 5 cmH2O and recruitment manoeuvres, compared to higher 'conventional' volumes (10-12 ml/kg PBW) without lung recruitment. Our objective was to retrospectively audit current practice for patients undergoing major abdominal surgery in Sir Charles Gairdner Hospital. Methods: Patients over 18 undergoing elective general surgery lasting more than 3 hours and intubated for the duration of the procedure were included in this audit. Data were collected over a 6-month period. Patients who had hepatic surgery, procedures necessitating one-lung ventilation, transplant surgery, or a documented history of pulmonary or intracranial hypertension were excluded. Results: 58 suitable patients were identified, and notes were available for 54 patients. Key findings: Average peak airway pressure was 21 cmH2O (+4); peak airway pressure was less than 30 cmH2O in all patients and less than 25 cmH2O in 80% of the cases. PEEP was used in 81% of the cases. Where PEEP was used, 75% used PEEP greater than or equal to 5 cmH2O. Average tidal volume per actual body weight was 7.1 ml/kg (+1.6). Average tidal volume per predicted body weight (PBW) was 8.8 ml/kg (+1.5). Average tidal volume was less than 10 ml/kg PBW in 90% of cases and 6-8 ml/kg PBW in 40% of the cases. There was no recorded use of recruitment manoeuvres in any case. Conclusions: In the vast majority of patients undergoing prolonged abdominal surgery, a lung-protective strategy using moderate levels of PEEP, peak airway pressures of less than 30 cmH2O and tidal volumes of less than 10 ml/kg PBW was utilised. A recent randomised controlled trial demonstrated benefit from utilising even lower volumes (6-8 ml/kg), based on findings in critical care patients, but this was compared to volumes of 10-12 ml/kg. Volumes of 6-8 ml/kg PBW were utilised in 40% of cases in this audit. Although theoretically beneficial, the clinical benefit of volumes lower than what is currently practised in this institution remains to be seen. The incidence of pulmonary complications was much lower than in the other cited studies, and a larger data set would be required to investigate any benefit from lower tidal volume ventilation. The volumes used are comparable to results from published local and international data, but PEEP utilisation was higher in this audit. Strategies that may potentially be implemented to ensure and maintain best practice include pre-operative recording of predicted body weight, adjustment of default ventilator settings and education/updates on current evidence.
Keywords: anaesthesia, intraoperative ventilation, PEEP, tidal volume
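A minimal illustration (not from the audit data) of converting a set tidal volume into ml/kg of predicted body weight, as recommended in the conclusions. PBW here uses the common ARDSNet/Devine-type formula; the patient values are hypothetical.

```python
# Minimal sketch: tidal volume expressed per kg of predicted body weight (PBW).
def predicted_body_weight_kg(height_cm: float, male: bool) -> float:
    """ARDSNet/Devine-type PBW: 50 kg (male) or 45.5 kg (female) + 0.91*(height - 152.4)."""
    base = 50.0 if male else 45.5
    return base + 0.91 * (height_cm - 152.4)

height_cm, male, tidal_volume_ml = 170.0, False, 480.0   # hypothetical patient and setting
pbw = predicted_body_weight_kg(height_cm, male)
print(f"PBW = {pbw:.1f} kg, tidal volume = {tidal_volume_ml / pbw:.1f} ml/kg PBW")
```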
Procedia PDF Downloads 765
28191 Utilization Of Guar Gum As Functional Fat Replacer In Goshtaba, A Traditional Indian Meat Product
Authors: Sajad A. Rather, F. A. Masoodi, Rehana Akhter, S. M. Wani, Adil Gani
Abstract:
The modern trend towards convenience foods has resulted in increased production and consumption of restructured meat products, which are of great importance to the meat industry. In meat products, fat plays an important role in cooking properties, texture and sensory scores; however, high fat contents, in particular animal fats, provide high amounts of saturated fatty acids and cholesterol and are associated with several types of non-communicable diseases such as obesity, hypertension and coronary heart disease. Thus, fat reduction has generally been seen as an important strategy to produce healthier meat products. This study examined the effects of reducing the fat level from 20% to 10% and substituting mutton back fat with guar gum (0.5%, 1% and 1.5%) on the cooking properties, proximate composition, lipid and protein oxidation, texture, microstructure and sensory characteristics of goshtaba, a traditional meat product of J&K, India, compared with the high-fat counterpart. Reduced-fat goshtaba samples containing guar gum had significantly (p ≤ 0.05) higher yield, less shrinkage, more moisture retention and more protein content than the control sample. TBARS and protein oxidation (carbonyl content) values of the control were significantly (p ≤ 0.05) higher than those of the reduced-fat goshtaba samples and showed a positive correlation between lipid and protein oxidation. Hardness, gumminess and chewiness of the control (20%) were significantly higher than those of the reduced-fat goshtaba samples. Microstructural differences were significant (p ≤ 0.05) between the control and treated samples due to the increased moisture content in the reduced-fat samples. Sensory evaluation showed a significant (p ≤ 0.05) reduction in the texture, flavour and overall acceptability scores of the treatment products; however, the scores for the 0.5% and 1% treated samples were in the range of acceptability. Guar gum may also be used as a source of soluble dietary fibre in food products, and a number of clinical studies have shown a reduction in postprandial glycemia and insulinemia on consumption of guar gum, with the mechanism being attributed to an increased transit time in the stomach and small intestine, which may be due to the viscosity of the meal hindering the access of glucose to the epithelium.
Keywords: goshtaba, guar gum, traditional, fat reduction, acceptability
Procedia PDF Downloads 280
28190 Anti-Inflammatory Activity of Lavandula antineae Maire from Algeria
Authors: Soumeya Krimat, Tahar Dob, Aicha Kesouri, Ahmed Nouasri, Hafidha Metidji
Abstract:
Lavandula antineae Maire is an endemic medicinal plant of Algeria which is traditionally used for the treatment of chills, bruises, oedema and rheumatism. The objective of this study was to evaluate, for the first time, the anti-inflammatory activity of the hydromethanolic extract of the aerial parts of Lavandula antineae using the carrageenan-induced paw edema and croton oil-induced ear edema models. The plant extract, at a dose of 200 mg/kg, showed significant anti-inflammatory activity (P˂0.05) in the carrageenan-induced edema test in mice, producing an 80.74% reduction in paw thickness, comparable to that produced by the standard drug aspirin (83.44%) at 4 h. When it was applied topically at dosages of 1 and 2 mg per ear, the percent edema reduction in treated mice was 29.45% and 74.76%, respectively. These results demonstrate that Lavandula antineae Maire extract possesses remarkable anti-inflammatory activity, supporting the folkloric usage of the plant to treat various inflammatory and pain diseases.
Keywords: lavandula antineae maire, medicinal plant, anti-inflammatory activity, carrageenan-paw edema, croton oil-ear edema
Procedia PDF Downloads 392
28189 Survey on Big Data Stream Classification by Decision Tree
Authors: Mansoureh Ghiasabadi Farahani, Samira Kalantary, Sara Taghi-Pour, Mahboubeh Shamsi
Abstract:
Nowadays, the development of computer technology and its recent applications provide access to new types of data that have not been considered by traditional data analysts. Two particularly interesting characteristics of such data sets are their huge size and streaming nature. Incremental learning techniques have been used extensively to address the data stream classification problem. This paper presents a concise survey of the obstacles and requirements involved in classifying data streams using decision trees. The most important issue is to maintain a balance between accuracy and efficiency; the algorithm should provide good classification performance with a reasonable response time.
Keywords: big data, data streams, classification, decision tree
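A minimal sketch, not one of the surveyed algorithms, of one simple way to keep a decision tree up to date on a stream: retrain it periodically on a sliding window of the most recent labelled examples. Dedicated stream learners (e.g. Hoeffding trees) avoid full retraining; this is only to illustrate the accuracy/efficiency trade-off mentioned above, on a synthetic stream.

```python
# Minimal sketch: sliding-window retraining of a decision tree on a data stream.
from collections import deque
import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(3)
window = deque(maxlen=500)           # bounded memory: the stream itself is unbounded
tree = DecisionTreeClassifier(max_depth=5)

for t in range(5_000):               # synthetic stream of (x, y) pairs
    x = rng.normal(size=2)
    y = int(x[0] + x[1] > 0)
    window.append((x, y))
    if t % 500 == 499:               # periodic retraining keeps response time bounded
        X = np.array([xi for xi, _ in window])
        Y = np.array([yi for _, yi in window])
        tree.fit(X, Y)

print("tree depth after last retrain:", tree.get_depth())
```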
Procedia PDF Downloads 522
28188 Effect of Rural Entrepreneurship in Rural Development in Nigeria: A Study of Selected Entrepreneurs in Ikwuano Local Government Area, Abia State, Nigeria
Authors: Ifeanyi Charles Otuokere, Victoria Nneoma Nnochiri
Abstract:
Entrepreneurship in general, and specifically within the rural communities of Nigeria, is a fast means of bringing development to those communities. This is made possible by the maximization and good management of available local resources to develop rural areas. This study draws on the rural development paradigm and integrated rural development theories to examine the knowledge of rural entrepreneurs regarding rural economic development. The research made use of surveys and descriptive analysis. The accessible population for the study, which was randomly selected, is 100 rural entrepreneurs from ten rural communities within the Ikwuano Local Government Area of Abia State. The study made use of both primary and secondary sources of data collection, with much emphasis on primary sources, although secondary data such as journals, textbooks and electronic sources were also utilised. A carefully structured questionnaire drafted to extract raw data was administered to the selected entrepreneurs. The findings of the study showed that development within rural communities can only be achieved through rural entrepreneurship. This is evidenced in increased output, job creation and, most importantly, a reduction of rural-to-urban migration, among other things. Recommendations were also made based on these findings; the researchers recommended that infrastructural developments should be made available in the rural communities and that government policies should create enabling environments, along with other assistance, to help these rural entrepreneurs achieve their sole aim.
Keywords: economic developments, rural communities, rural development, rural entrepreneurship
Procedia PDF Downloads 233
28187 Robust and Dedicated Hybrid Cloud Approach for Secure Authorized Deduplication
Authors: Aishwarya Shekhar, Himanshu Sharma
Abstract:
Data deduplication is one of the important data compression techniques for eliminating duplicate copies of repeating data, and it has been widely used in cloud storage to reduce the amount of storage space and save bandwidth. In this process, duplicate data is expunged, leaving only one copy, i.e. a single instance of the data, to be stored. However, indexing of each piece of data is still maintained. Data deduplication is an approach for minimizing the amount of storage space an organization needs to retain its data. In most companies, the storage systems contain identical copies of numerous pieces of data. Deduplication eliminates these additional copies by saving just one copy of the data and replacing the other copies with pointers that lead back to the primary copy. To avoid this duplication of data and to preserve confidentiality in the cloud, we apply the concept of a hybrid cloud. A hybrid cloud is a fusion of at least one public and one private cloud. As a proof of concept, we implement Java code which provides security as well as removing all types of duplicated data from the cloud.
Keywords: confidentiality, deduplication, data compression, hybridity of cloud
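The abstract describes a Java proof of concept; the sketch below only illustrates the single-instance idea in Python, not the authors' implementation: each block is stored once under its content hash, and duplicate blocks become pointers to that hash.

```python
# Minimal sketch: content-hash-based deduplication with single-instance storage.
import hashlib

store = {}       # content-addressed block store: hash -> bytes
pointers = []    # per-block references kept in the file index

def put_block(data: bytes) -> str:
    digest = hashlib.sha256(data).hexdigest()
    if digest not in store:      # store only the first copy of identical data
        store[digest] = data
    pointers.append(digest)      # duplicates become pointers to the stored copy
    return digest

for block in [b"report-2024", b"invoice-01", b"report-2024", b"report-2024"]:
    put_block(block)

print(f"{len(pointers)} blocks referenced, {len(store)} unique blocks stored")
```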
Procedia PDF Downloads 384
28186 Surface Nanocrystalline and Hardening Effects of Ti–Al–V Alloy by Electropulsing Ultrasonic Shock
Authors: Xiaoxin Ye, Guoyi Tang
Abstract:
The effect of electropulsing ultrasonic shock (EUS) on the surface hardening and microstructure of Ti6Al4V alloy was studied. It was found that electropulsing improved the microhardness dramatically, both in the affected depth and in the maximum value, compared with the sample treated only by ultrasonic shock. A refined surface layer with nanocrystalline grains and improved microhardness was obtained on account of severe surface plastic deformation, dynamic recrystallization (DRX) and phase change, which took place at relatively low temperature and high strain rate/capacity due to the coupling of the thermal and athermal effects of EUS. This differs from conventional experiments and theory. The positive contributions of electropulsing to the thermodynamics and kinetics of microstructure and property change are attributed to the reduction of the nucleation energy barrier and the acceleration of atomic diffusion. Therefore, EUS is proposed to be an energy-saving and highly efficient surface treatment technique assisted by high-energy electropulses, which is promising for cost reduction in surface engineering and energy management.
Keywords: titanium alloys, electropulsing, ultrasonic shock, microhardness, nanocrystalline
Procedia PDF Downloads 292
28185 Intelligent Algorithm-Based Tool-Path Planning and Optimization for Additive Manufacturing
Authors: Efrain Rodriguez, Sergio Pertuz, Cristhian Riano
Abstract:
Tool-path generation is an essential step in process planning for FFF (Fused Filament Fabrication)-based Additive Manufacturing (AM). In the manufacture of a mechanical part using additive processes, high resource consumption and prolonged production times are inherent drawbacks, mainly due to non-optimized tool-path generation. In this work, we propose a heuristic-search, intelligent algorithm-based approach for optimized tool-path generation for FFF-based AM. The main benefit of this approach is a significant reduction of travel moves without material deposition, i.e. moves the AM machine performs without any extrusion. The optimization method reduces the number of travels without extrusion in comparison with commercial software such as Slic3r or Cura Engine, which means a reduction in production time.
Keywords: additive manufacturing, tool-path optimization, fused filament fabrication, process planning
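A minimal sketch of the general idea of reducing non-printing travel, not the authors' heuristic-search algorithm: extrusion segments within a layer are greedily reordered so that each segment starts near where the previous one ended; the segment coordinates are hypothetical.

```python
# Minimal sketch: greedy nearest-neighbour reordering of extrusion segments
# to shorten travel moves between the end of one segment and the start of the next.
import math

segments = [((0, 0), (10, 0)), ((40, 5), (50, 5)), ((12, 1), (20, 1)), ((52, 6), (60, 6))]

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def travel_length(order):
    return sum(dist(order[k][1], order[k + 1][0]) for k in range(len(order) - 1))

def nearest_neighbour(segs):
    remaining, ordered = segs[1:], [segs[0]]
    while remaining:
        nxt = min(remaining, key=lambda s: dist(ordered[-1][1], s[0]))
        remaining.remove(nxt)
        ordered.append(nxt)
    return ordered

print("travel, file order     :", round(travel_length(segments), 1))
print("travel, greedy reorder :", round(travel_length(nearest_neighbour(segments)), 1))
```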
Procedia PDF Downloads 443