Search results for: mixture density function.
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 3485

275 The Role of Chemerin and Myostatin after Physical Activity

Authors: M. J. Pourvaghar, M. E. Bahram

Abstract:

Obesity and overweight are among the most common metabolic disorders in industrialized and developing countries. Consequences of pathological obesity include cardiovascular disease and metabolic syndrome. Chemerin is an adipokine that plays a role in regulating adipocyte function and glucose metabolism in the liver and musculoskeletal system. Most likely, chemerin is involved in obesity-related disorders such as type 2 diabetes and cardiovascular disease. Aerobic exercise reduces chemerin levels as well as macrophage infiltration into fat cells and inflammatory factors. Several efforts have been made to clarify the cellular and molecular mechanisms of muscular hypertrophy and atrophy. Myostatin, a new member of the TGF-β family, is a transforming growth factor whose expression negatively regulates skeletal muscle growth; increased levels of this hormone have been observed in conditions of muscular atrophy, whereas its levels decrease in response to muscle overload after the atrophy period. TGF-β is the most important cytokine in skeletal muscle development. Myostatin plays an important role in muscle control, and animal and human studies show a negative role of myostatin in skeletal muscle growth. Secretion of myostatin from the Golgi begins on the ninth day of the onset period and continues at all stages of muscle growth until birth. Higher levels of myostatin are found in obese people. Ten weeks of resistance training can reduce plasma myostatin levels.

Keywords: Chemerin, myostatin, obesity, physical activity.

274 Hybrid Approach for Software Defect Prediction Using Machine Learning with Optimization Technique

Authors: C. Manjula, Lilly Florence

Abstract:

Software technology is developing rapidly, which leads to the growth of various industries. Nowadays, software-based applications have been widely adopted for business purposes. For any software industry, developing reliable software is a challenging task because a faulty software module may be harmful to the growth of the industry and business. Hence, there is a need for techniques that can be used for early prediction of software defects. Due to the complexity of manual prediction, automated software defect prediction techniques have been introduced. These techniques learn patterns from previous software versions and find the defects in the current version. They have attracted researchers due to their significant impact on industrial growth by identifying bugs in software. Several studies have been carried out on this basis, but achieving desirable defect prediction performance is still challenging. To address this issue, we present a machine learning based hybrid technique for software defect prediction. First, a Genetic Algorithm (GA) is presented in which an improved fitness function is used for better optimization of features in the data sets. These features are then processed through a Decision Tree (DT) classification model. Finally, an experimental study is presented in which results from the proposed GA-DT hybrid approach are compared with those from the DT classification technique. The results show that the proposed hybrid approach achieves better classification accuracy.
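
As an illustration of the pipeline described above, the sketch below pairs a simple genetic algorithm over binary feature masks with a decision-tree classifier. It is a minimal sketch assuming a generic tabular defect data set `X`, `y`; the fitness function, population size and operators are generic placeholders, not the authors' improved fitness function.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

def fitness(mask, X, y):
    # Score a candidate feature subset by cross-validated decision-tree accuracy.
    if mask.sum() == 0:
        return 0.0
    clf = DecisionTreeClassifier(random_state=0)
    return cross_val_score(clf, X[:, mask.astype(bool)], y, cv=5).mean()

def ga_select(X, y, pop_size=20, generations=30, p_mut=0.05):
    n_feat = X.shape[1]
    pop = rng.integers(0, 2, size=(pop_size, n_feat))        # random binary feature masks
    for _ in range(generations):
        scores = np.array([fitness(ind, X, y) for ind in pop])
        parents = pop[np.argsort(scores)][-pop_size // 2:]    # keep the fitter half (truncation selection)
        children = []
        while len(parents) + len(children) < pop_size:
            a, b = parents[rng.integers(len(parents), size=2)]
            cut = rng.integers(1, n_feat)                     # one-point crossover
            child = np.concatenate([a[:cut], b[cut:]])
            flip = rng.random(n_feat) < p_mut                 # bit-flip mutation
            children.append(np.where(flip, 1 - child, child))
        pop = np.vstack([parents, np.array(children)])
    scores = np.array([fitness(ind, X, y) for ind in pop])
    return pop[np.argmax(scores)].astype(bool)

# Usage sketch: mask = ga_select(X, y); DecisionTreeClassifier().fit(X[:, mask], y)
```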

Keywords: Decision tree, genetic algorithm, machine learning, software defect prediction.

273 Electroencephalography Activity during Sensory Organization Balance Test

Authors: Tariq Ali Gujar, Anita Hökelmann

Abstract:

Postural balance plays an essential role in daily activities throughout life. Somatosensory, visual and vestibular inputs play fundamental roles in maintaining body equilibrium and postural balance. The aim of this study was to determine the electroencephalography (EEG) responses of young people during the Sensory Organization Balance Test. The outcome of this study will help to create fitness and neurorehabilitation plans. 25 young people (25 ± 3.1 years) were analyzed on a Balance Master NeuroCom® coupled with a Brain Vision 32-electrode wireless EEG system during the Sensory Organization Test. The results show that the balance score of the sample is significantly higher under the influence of somatosensory input than under visual and vestibular input (p < 0.05). Comparing somatosensory and visual input for balancing the posture, alpha and beta activities were significantly higher (p < 0.05) during somatosensory input in the somatosensory, attention and visual functions of the cortex, whereas the executive and motor functions of the cerebral cortex showed significantly higher (p < 0.05) alpha EEG activity during visual input. The results suggest that the somatosensory and attention functions of the cerebral cortex show high alpha and beta activity, respectively, during somatosensory and vestibular input while maintaining balance. In patients with balance impairments, both physical and cognitive training, including neurofeedback, will be helpful to improve balance abilities.

Keywords: Balance, electroencephalography activity, somatosensory, visual, vestibular.

272 Computing Entropy for Ortholog Detection

Authors: Hsing-Kuo Pao, John Case

Abstract:

Biological sequences from different species are called orthologs if they evolved from a sequence of a common ancestor species and they have the same biological function. Approximations of Kolmogorov complexity or entropy of biological sequences are already well known to be useful in extracting similarity information between such sequences - in the interest, for example, of ortholog detection. As is well known, the exact Kolmogorov complexity is not algorithmically computable. In practice one can approximate it by computable compression methods. However, such compression methods do not provide a good approximation to Kolmogorov complexity for short sequences. Herein is suggested a new approach to overcome the problem that compression approximations may not work well on short sequences. This approach is inspired by new, conditional computations of Kolmogorov entropy. A main contribution of the empirical work described shows the new set of entropy-based machine learning attributes provides good separation between positive (ortholog) and negative (non-ortholog) data - better than with good, previously known alternatives (which do not employ some means to handle short sequences well). Also empirically compared are the new entropy-based attribute set and a number of other, more standard similarity attribute sets commonly used in genomic analysis. The various similarity attributes are evaluated by cross validation, through boosted decision tree induction C5.0, and by Receiver Operating Characteristic (ROC) analysis. The results point to the conclusion: the new, entropy-based attribute set by itself is not the one giving the best prediction; however, it is the best attribute set for use in improving the other, standard attribute sets when conjoined with them.
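
A minimal, runnable illustration of the compression approximation discussed above is the normalized compression distance (NCD), a standard compression-based similarity measure; it stands in for, and is not necessarily identical to, the attribute set used in the paper, and zlib is merely a convenient placeholder compressor.

```python
import zlib

def clen(s: bytes) -> int:
    # Compressed length as a computable stand-in for Kolmogorov complexity.
    return len(zlib.compress(s, 9))

def ncd(x: bytes, y: bytes) -> float:
    # Normalized compression distance: near 0 for closely related strings, near 1 for unrelated ones.
    cx, cy, cxy = clen(x), clen(y), clen(x + y)
    return (cxy - min(cx, cy)) / max(cx, cy)

# Two hypothetical short coding sequences; note that, as the abstract stresses,
# generic compressors approximate entropy poorly on strings this short.
a = b"ATGGCCATTGTAATGGGCCGCTGAAAGGGTGCCCGATAG"
b = b"ATGGCCATTGTAATGGGCCGTTGAAAGGGTGCCCGATAG"
print(ncd(a, b))
```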

Keywords: compression, decision tree, entropy, ortholog, ROC.

271 Effect of Heat Treatment on Mechanical Properties and Wear Behavior of Al7075 Alloy Reinforced with Beryl and Graphene Hybrid Metal Matrix Composites

Authors: Shanawaz Patil, Mohamed Haneef, K. S. Narayanaswamy

Abstract:

In recent years, aluminum metal matrix composites have been widely used, finding applications in various fields such as automobiles, aerospace and defense, due to their outstanding mechanical properties: low density, light weight, exceptionally high strength and stiffness, wear resistance, high temperature resistance, low coefficient of thermal expansion and good formability. In the present work, an effort is made to study the effect of heat treatment on the mechanical properties of aluminum 7075 alloy reinforced with a constant weight percentage of the naturally occurring mineral beryl and varying weight percentages of graphene. The hybrid composites are developed with 0.5 wt.%, 1 wt.%, 1.5 wt.% and 2 wt.% of graphene and 6 wt.% of beryl by the stir casting liquid metallurgy route. Cast specimens of the unreinforced aluminum alloy and the hybrid composite samples were prepared for the heat treatment process and subjected to solutionizing treatment (T6) at a temperature of 490 ± 5 °C for 8 hours in a muffle furnace, followed by quenching in boiling water. The microstructures of as-cast and heat-treated hybrid composite specimens are examined by scanning electron microscopy (SEM). The tensile strength and hardness of the unreinforced aluminum alloy and the hybrid composites are examined, and the wear behavior is examined on a pin-on-disc apparatus. The results for as-cast and heat-treated specimens were compared. The heat-treated Al7075-beryl-graphene hybrid composite had better properties, with significantly improved ultimate tensile strength and hardness and reduced wear loss compared to the aluminum alloy and the as-cast hybrid composites.

Keywords: Beryl, graphene, heat treatment, mechanical properties.

270 Performance Analysis of HSDPA Systems using Low-Density Parity-Check (LDPC) Coding as Compared to Turbo Coding

Authors: K. Anitha Sheela, J. Tarun Kumar

Abstract:

HSDPA is a new feature introduced in the Release 5 specifications of the 3GPP WCDMA/UTRA standard to realize higher data rates together with lower round-trip times. Moreover, the HSDPA concept offers an outstanding improvement in packet throughput and also significantly reduces the packet call transfer delay as compared to the Release 99 DSCH. Until now, the HSDPA system has used turbo coding, a coding technique that closely approaches the Shannon limit. However, the main drawbacks of turbo coding are high decoding complexity and high latency, which make it unsuitable for some applications such as satellite communications, since the transmission distance itself introduces latency due to the finite speed of light. Hence, in this paper it is proposed to use LDPC coding in place of turbo coding for the HSDPA system, which decreases the latency and decoding complexity, although LDPC coding increases the encoding complexity. Though the complexity of the transmitter increases at the Node B, the end user is at an advantage in terms of receiver complexity and bit error rate. In this paper, the LDPC encoder is implemented using a sparse parity-check matrix H to generate a codeword, and the belief propagation algorithm is used for LDPC decoding. Simulation results show that with LDPC coding the BER drops sharply as the number of iterations increases, with only a small increase in Eb/No, which is not possible with turbo coding. Also, the same BER was achieved using fewer iterations, and hence the latency and receiver complexity decreased for LDPC coding. HSDPA increases the downlink data rate within a cell to a theoretical maximum of 14 Mbps, with 2 Mbps on the uplink. The changes that HSDPA enables include better quality and more reliable and more robust data services. In other words, while realistic data rates are only a few Mbps, the actual quality and the number of users achieved will improve significantly.
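
For reference, the code construction mentioned above rests on the standard LDPC definitions below (textbook relations, not the specific HSDPA code parameters used in the paper):

```latex
H\,c^{\mathsf{T}} = \mathbf{0} \pmod{2}, \qquad
R \;=\; \frac{k}{n} \;=\; \frac{n-m}{n} \quad \text{(for a full-rank } m \times n \text{ parity-check matrix } H\text{)}
```

A binary vector c of length n is a valid codeword exactly when all parity checks are satisfied. Belief-propagation decoding passes probabilistic messages back and forth along the edges of the bipartite Tanner graph defined by the non-zero entries of H, which is why the BER improves as the number of decoding iterations grows.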

Keywords: AMC, HSDPA, LDPC, WCDMA, 3GPP.

269 Utilizing Analytic Hierarchy Process to Analyze Consumers' Purchase Evaluation Factors of Smartphones

Authors: Yi-Chung Hu, Yu-Lin Liao

Abstract:

Due to the fast development of technology, competition among technological products is turbulent; therefore, it is important to understand the market trend and consumers' demands and preferences. As smartphones are prevalent, the main purpose of this paper is to utilize the Analytic Hierarchy Process (AHP) to analyze consumers' purchase evaluation factors for smartphones. Through an AHP expert questionnaire, smartphones' main functions are classified into five aspects - "user interface", "mobile commerce functions", "hardware and software specifications", "entertainment functions" and "appearance and design" - to analyze the weights. Then four evaluation criteria are evaluated under each aspect to rank the weights. The analysis of the data shows that the factors consumers consider when purchasing are, in order, "hardware and software specifications", "user interface", "appearance and design", "mobile commerce functions" and "entertainment functions". The "hardware and software specifications" aspect obtains a weight of 33.18%; it is the most important factor that consumers take into account. In addition, the most important evaluation criteria are, in order, central processing unit, operating system, touch screen, and battery function. The results of the study can be adopted as reference data for mobile phone manufacturers in future design and marketing strategy to satisfy the voice of the customer.
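
A small sketch of the principal-eigenvector weighting step that AHP relies on, applied to an illustrative 5x5 pairwise-comparison matrix over the five aspects named above; the judgment values, and hence the resulting weights, are made-up examples and not the study's questionnaire data.

```python
import numpy as np

# Illustrative reciprocal pairwise-comparison matrix for the five aspects
# (user interface, mobile commerce, hardware/software specs, entertainment, appearance).
A = np.array([
    [1,   3,   1/2, 4,   2  ],
    [1/3, 1,   1/4, 2,   1/2],
    [2,   4,   1,   5,   3  ],
    [1/4, 1/2, 1/5, 1,   1/3],
    [1/2, 2,   1/3, 3,   1  ],
])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)                 # principal eigenvalue (lambda_max)
w = np.abs(eigvecs[:, k].real)
w = w / w.sum()                             # normalized priority weights

n = A.shape[0]
ci = (eigvals.real[k] - n) / (n - 1)        # consistency index
cr = ci / 1.12                              # consistency ratio, random index RI = 1.12 for n = 5
print("weights:", np.round(w, 3), "CR:", round(cr, 3))
```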

Keywords: Analytic Hierarchy Process (AHP), evaluation criteria, purchase evaluation factors, smartphone.

268 A Biomimetic Structural Form: Developing a Paradigm to Attain Vital Sustainability in Tall Architecture

Authors: Osama Al-Sehail

Abstract:

This paper argues for sustainability as a necessity in the evolution of tall architecture. It provides a different mode of dealing with sustainability in tall architecture, taking into consideration the speciality of its typology. To this end, the article develops a Biomimetic Structural Form as a paradigm to attain Vital Sustainability. A Biomimetic Structural Form is derived from the amalgamation of biomimicry, as an approach to sustainability that defines nature as a source of knowledge and inspiration in solving human problems, with a Structural Form as a catalyst for evolving tall architecture; it is a dynamic paradigm emerging from a conceptualizing and morphological process. A Biomimetic Structural Form is a flow system whose different forces and functions tend to be "better", more "fit", to "survive", and to be efficient. Through geometry and function, the two aspects of knowledge extracted from nature, the attributes of the Biomimetic Structural Form are formulated. Vital Sustainability is the survival level of sustainability in natural systems, through which a system enhances the performance of its internal working and its interaction with the external environment. A Biomimetic Structural Form, in this context, is a medium for evolving tall architecture to emulate natural models in their ways of coexistence with the environment. As an integral part of this article, the sustainable super-tall building 3Ts is discussed as a case study of applying the Biomimetic Structural Form.

Keywords: Biomimicry, design in nature, high-rise buildings, sustainability, structural form, tall architecture, vital sustainability.

267 3D Modelling and Numerical Analysis of Human Inner Ear by Means of Finite Elements Method

Authors: C. Castro-Egler, A. Durán-Escalante, A. García-González

Abstract:

This paper presents a method to generate a finite element model of the human auditory inner ear system. The geometric model has been realized using 2D images from a virtual model of temporal bones. A point cloud has been obtained manually from those images to construct a whole mesh of hexahedral elements. The main difference from predecessor models is the spiral shape of the cochlea with its three scalae completely defined: scala tympani, scala media and scala vestibuli, which are separated by the basilar membrane and Reissner's membrane. To validate this model, numerical simulations have been realised with two models: an isolated inner ear and a whole model of the human auditory system. Ideal displacement conditions are applied over the oval window in the isolated inner ear model. The whole model is made up of the outer auditory channel, the tympanic membrane, the ossicular chain, and the inner ear. The boundary condition for the whole model is 1 Pa over the auditory channel entrance. The numerical simulations by FEM have been done using a harmonic analysis over a frequency range of 100-10,000 Hz with an interval of 100 Hz. The following results have been obtained: the basilar membrane displacement; the scala media pressure along the cochlea length; and the transfer function of the middle ear normalized with the pressure at the tympanic membrane. The basilar membrane displacements and the pressure in the scala media make it possible to validate the frequency response of the basilar membrane.

Keywords: Finite elements method, human auditory system model, numerical analysis, 3D modelling cochlea.

266 An Approach for Ensuring Data Flow in Freight Delivery and Management Systems

Authors: Aurelija Burinskienė, Dalė Dzemydienė, Arūnas Miliauskas

Abstract:

This research aims at developing an approach for more effective freight delivery and transportation process management. Road congestion and the identification of its causes are important, as are the recognition and management of context information. Measuring many parameters during the transportation period and properly controlling driver work has become a problem. The number of vehicles passing a given point per unit of time can be evaluated for drivers in some situations. The collected data are mainly used to establish new trips. The flow of data is more complex in urban areas. Herein, the movement of freight is reported in detail, including information at street level. When traffic density is extremely high in congestion cases and the traffic speed is very low, data transmission reaches its peak. Different data sets are generated, depending on the type of freight delivery network. There are three types of networks: long-distance delivery networks, last-mile delivery networks and mode-based delivery networks; the last includes different modes, in particular railways and other networks. When freight delivery is switched from one of the above network types to another, more data may be included for reporting purposes, and vice versa. In this case, a significant amount of these data is used for control operations, and the problem requires an integrated methodological approach. The paper presents an approach for providing e-services for drivers by including the assessment of the multi-component infrastructure needed for the delivery of freight according to the network type. The construction of such a methodology is required to evaluate data flow conditions and overloads, and to minimize the time gaps in data reporting. The results obtained show the potential of the proposed methodological approach to support management and decision-making processes, with the functionality of incorporating networking specifics, by helping to minimize overloads in data reporting.

Keywords: Transportation networks, freight delivery, data flow, monitoring, e-services.

265 Heat and Mass Transfer Modelling of Industrial Sludge Drying at Different Pressures and Temperatures

Authors: L. Al Ahmad, C. Latrille, D. Hainos, D. Blanc, M. Clausse

Abstract:

A two-dimensional finite volume axisymmetric model is developed to predict the simultaneous heat and mass transfers during the drying of industrial sludge. The simulations were run using COMSOL Multiphysics 3.5a. The input parameters of the numerical model were acquired from preliminary experimental work. The results permit establishing correlations describing the evolution of the various parameters as a function of the drying temperature and the sludge water content. The selection and coupling of the equations are validated against drying kinetics acquired experimentally over a temperature range of 45-65 °C and an absolute pressure range of 200-1000 mbar. The model, incorporating the heat and mass transfer mechanisms at different operating conditions, yields simulated values of temperature and water content. The simulated results are found to be concordant with the experimental values only at the first and last drying stages, where sludge shrinkage is insignificant. Simulated and experimental results show that sludge drying is favored at high temperatures and low pressure. As observed experimentally, the drying time is reduced by 68% for drying at 65 °C compared to 45 °C under 1 atm. At 65 °C, a 200-mbar absolute pressure vacuum leads to an additional reduction in drying time estimated at 61%. However, the drying rate is underestimated in the intermediate stage. This rate underestimation could be improved in the model by considering the shrinkage phenomena that occur during sludge drying.

Keywords: Industrial sludge drying, heat transfer, mass transfer, mathematical modelling.

264 Neural Network Models for Actual Cost and Actual Duration Estimation in Construction Projects: Findings from Greece

Authors: Panagiotis Karadimos, Leonidas Anthopoulos

Abstract:

Predicting the actual cost and actual duration of construction projects is a continuing problem for the construction sector. This paper addresses this problem with modern methods and data available from past public construction projects. 39 bridge projects constructed in Greece, with a similar type of available data, were examined. Considering each project's attributes together with the actual cost and the actual duration, correlation analysis is performed and the most appropriate predictive project variables are defined. Additionally, the most efficient subgroup of variables is selected with the WEKA application, through its attribute selection function. The selected variables are used as input neurons for the neural network models. For constructing the neural network models, the FANN Tool application is used. The optimum neural network model for predicting the actual cost produced a mean squared error of 3.84886e-05 and was based on the budgeted cost and the quantity of deck concrete. The optimum neural network model for predicting the actual duration produced a mean squared error of 5.89463e-05 and was also based on the budgeted cost and the quantity of deck concrete.
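
A hedged sketch of the kind of small feed-forward regression network described above; scikit-learn's MLPRegressor stands in for the FANN Tool, the two inputs (budgeted cost, deck-concrete quantity) follow the abstract, and the training values are invented placeholders rather than the Greek bridge data.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

# Placeholder training data: [budgeted cost, deck concrete quantity] -> actual cost.
X = np.array([[1.2e6, 450.0], [2.5e6, 900.0], [0.8e6, 300.0], [3.1e6, 1100.0]])
y = np.array([1.35e6, 2.80e6, 0.85e6, 3.40e6])

model = make_pipeline(
    StandardScaler(),                                   # scale inputs before training
    MLPRegressor(hidden_layer_sizes=(8,), solver="lbfgs",
                 max_iter=5000, random_state=0),        # one small hidden layer
)
model.fit(X, y)
print(model.predict([[1.5e6, 520.0]]))                  # predicted actual cost for a new project
```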

Keywords: Actual cost and duration, attribute selection, bridge projects, neural networks, predicting models, FANN TOOL, WEKA.

263 Assessing the Sheltering Response in the Middle East: Studying Syrian Camps in Jordan

Authors: Lara A. Alshawawreh, R. Sean Smith, John B. Wood

Abstract:

This study focuses on the sheltering response in the Middle East, specifically by reviewing two Syrian refugee camps in Jordan: Zaatari and Azraq. The Zaatari camp involved the rapid deployment of tents and shelters over a very short period of time, whereas Azraq was purpose-built and pre-planned over a longer period. At present, both camps collectively host more than 133,000 occupants. Field visits were made to both camps, and the main issues and problems in the sheltering response were highlighted through focus group discussions with camp occupants and inspection of shelter habitats. This provided both subjective and objective research data sources. While every case has its own significance and is deployed to meet humanitarian needs, there are some common requirements irrespective of geographical region. The results suggest that there is a gap between the required habitat needs and what has been provided. It is recommended that the global international response and support could be improved in relation to habitat form, construction type, layout, function and, critically, the cultural aspects. Services, health and hygiene are key elements of the shelter habitat provision. The study also identified the amendments to shelters undertaken by the beneficiaries, providing insight into their key requirements. The outcomes from this study could provide an important learning opportunity to develop improved habitat responses for future shelters.

Keywords: Culture, post-disaster, refugees, shelters.

262 Memory Effects in Randomly Perturbed Nematic Liquid Crystals

Authors: Amid Ranjkesh, Milan Ambrožič, Samo Kralj

Abstract:

We study the typical domain size and configuration character of a randomly perturbed system exhibiting continuous symmetry breaking. As a model system we use rod-like objects within a cubic lattice interacting via a Lebwohl-Lasher-type interaction. We describe their local direction with a headless unit director field. Examples of such systems are nematic liquid crystals (LCs) and nanotubes. We further introduce impurities of concentration p, which impose random-anisotropy-field-type disorder on the directors. We study the domain-type pattern of molecules as a function of p, the anchoring strength w between a neighboring director and an impurity, the temperature, and the history of the samples. In the simulations, we quenched the directors either from a random or from a homogeneous initial configuration. Our results show that the history of the system strongly influences: i) the average domain coherence length; and ii) the range of ordering in the system. In the random case the obtained order is always short-ranged (SR). On the contrary, in the homogeneous case, SR order is obtained only for strong enough anchoring and a large enough concentration p. In other cases, the ordering is either quasi-long-range (QLR) or long-range (LR). We further studied memory effects for the random initial configuration. With an increasing external ordering field B, either QLR or LR is realized.
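
For orientation, the Lebwohl-Lasher-type interaction referred to above, together with a generic random-anisotropy anchoring term of the kind used to model the impurities, can be written as follows; this is the standard textbook form, and the precise disorder term implemented in the simulations may differ in detail.

```latex
H \;=\; -\,\epsilon \sum_{\langle i,j\rangle} P_2\!\left(\cos\theta_{ij}\right)
\;-\; w \sum_{i\,\in\,\mathrm{imp}} P_2\!\left(\mathbf{n}_i\cdot\mathbf{e}_i\right),
\qquad
P_2(x) \;=\; \tfrac{1}{2}\left(3x^{2}-1\right)
```

Here θij is the angle between neighbouring directors n_i and n_j, ε > 0 is the nearest-neighbour coupling, w is the anchoring strength and e_i is the random easy axis imposed at an impurity site.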

Keywords: Lebwohl-Lasher model, liquid crystals, disorder, memory effect, orientational order.

261 Modeling Decentralized Source-Separation Systems for Urban Waste Management

Authors: Bernard J.H. Ng, Apostolos Giannis, Victor Chang, Rainer Stegmann, Jing-Yuan Wang

Abstract:

Decentralized eco-sanitation systems are a promising and sustainable alternative to the century-old centralized conventional sanitation system. The decentralized concept relies on an environmentally and economically sound management of water, nutrient and energy fluxes. Source-separation systems for urban waste management collect different solid waste and wastewater streams separately to facilitate the recovery of valuable resources (energy, nutrients) from wastewater. A resource recovery centre serving 20,000 people acts as the functional unit for the treatment of the urban waste of a high-density population community, like Singapore. The decentralized system includes urine treatment, co-digestion of faeces and food waste, and treatment of horticultural waste and the organic fraction of municipal solid waste in composting plants. A design model is developed to estimate the inputs and outputs in terms of materials and energy. The inputs of urine (yellow water, YW) and faeces (brown water, BW) are calculated by considering the daily mean production of urine and faeces by humans and the water consumption of a no-mix vacuum toilet (0.2 and 1 L of flushing water for urine and faeces, respectively). The food waste (FW) production is estimated to be 150 g wet weight/person/day. The YW is collected and discharged by gravity into a tank. It was found that two days are required for urine hydrolysis and struvite precipitation. The maximum nitrogen (N) and phosphorus (P) recoveries are 150-266 kg/day and 20-70 kg/day, respectively. In contrast, BW and FW are mixed for co-digestion in a thermophilic acidification tank, and a decentralized/centralized methanogenic reactor is later used for biogas production. It is determined that 6.16-15.67 m3/h of methane is produced, which is equivalent to 0.07-0.19 kWh/ca/day. The digestion residues are treated with horticultural waste and the organic fraction of municipal waste in co-composting plants.
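
A back-of-the-envelope check of two figures quoted above for the 20,000-person functional unit; the food-waste total follows directly from the abstract's 150 g/person/day, while the per-capita energy line simply re-expresses the reported methane range using an assumed heating value of roughly 10 kWh per cubic metre of CH4.

```python
population = 20_000
fw_per_capita_kg = 0.150                      # wet food waste, kg/person/day (from the abstract)
print(population * fw_per_capita_kg)          # -> 3000 kg of food waste per day

ch4_lhv_kwh_per_m3 = 10.0                     # assumed lower heating value of methane
for ch4_m3_per_h in (6.16, 15.67):            # reported methane production range
    kwh_per_capita_day = ch4_m3_per_h * 24 * ch4_lhv_kwh_per_m3 / population
    print(round(kwh_per_capita_day, 2))       # -> roughly 0.07 and 0.19 kWh/ca/day
```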

Keywords: Decentralization, ecological sanitation, material flow analysis, source-separation

260 Entropic Measures of a Probability Sample Space and Exponential Type (α, β) Entropy

Authors: Rajkumar Verma, Bhu Dev Sharma

Abstract:

Entropy is a key measure in studies related to information theory and its many applications. Campbell was the first to recognize that the exponential of Shannon's entropy is just the size of the sample space when the distribution is uniform. Here the idea is to study exponentials of Shannon's entropy and of those other entropy generalizations that involve a logarithmic function, for a probability distribution in general. In this paper, we introduce a measure of the sample space, called the ‘entropic measure of a sample space’, with respect to the underlying distribution. It is shown in both the discrete and continuous cases that this new measure depends on the parameters of the distribution on the sample space - the same sample space having different ‘entropic measures’ depending on the distributions defined on it. It is noted that Campbell's idea applies to Rényi's parametric entropy of a given order as well. Knowing that parameters play a role in providing suitable choices and extended applications, the paper also studies parametric entropic measures of sample spaces. Exponential entropies related to Shannon's entropy and to those generalizations that have logarithmic functions, i.e., are additive, have been studied for wider understanding and applications. We propose and study exponential entropies corresponding to non-additive entropies of type (α, β), which include the Havrda and Charvát entropy as a special case.
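
As a concrete anchor for Campbell's observation recalled above, the exponentials of Shannon's entropy and of Rényi's entropy of order α both reduce to the size n of the sample space under a uniform distribution; these are only the standard forms, and the paper's new (α, β)-type measure generalizes beyond them.

```latex
H(P) = -\sum_{i=1}^{n} p_i \ln p_i , \qquad
H_{\alpha}(P) = \frac{1}{1-\alpha}\,\ln\!\Bigl(\sum_{i=1}^{n} p_i^{\alpha}\Bigr),
\quad \alpha > 0,\ \alpha \neq 1,
\qquad\text{and}\qquad
e^{H(P)} = e^{H_{\alpha}(P)} = n \ \text{ when } p_i = \tfrac{1}{n} \text{ for all } i .
```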

Keywords: Sample space, Probability distributions, Shannon's entropy, Rényi's entropy, Non-additive entropies.

259 Landfill Failure Mobility Analysis: A Probabilistic Approach

Authors: Ali Jahanfar, Brajesh Dubey, Bahram Gharabaghi, Saber Bayat Movahed

Abstract:

Ever-increasing population growth of major urban centers and environmental challenges in siting new landfills have resulted in a growing trend in the design of mega-landfills, some with extraordinary heights and dangerously steep slopes. Landfill failure mobility risk analysis is one of the most uncertain types of dynamic rheology modeling due to the very large inherent variability in the shear strength properties of the heterogeneous solid waste material. The waste flows of three historic dumpsite failures and two landfill failures were back-analyzed using run-out modeling with the DAN-W model. The travel distances of the waste flow during landfill failures were calculated by taking into account the variability in material shear strength properties. The probability distribution functions for the shear strength properties of the waste material were grouped into four major classes based on waste material compaction (landfills versus dumpsites) and composition (high versus low quantity of high-shear-strength waste materials such as wood, metal, plastic, paper and cardboard in the waste). This paper presents a probabilistic method for estimating the spatial extent of waste avalanches after a potential landfill failure, to create maps of vulnerability scores to inform property owners and residents of the level of risk.

Keywords: Landfill failure, waste flow, Voellmy rheology, friction coefficient, waste compaction and type.

258 Component Based Framework for Authoring and Multimedia Training in Mathematics

Authors: Ion Smeureanu, Marian Dardala, Adriana Reveiu

Abstract:

The new programming technologies allow for the creation of components which can be automatically or manually assembled to reach a new experience in knowledge understanding and mastering or in getting skills for a specific knowledge area. The project proposes an interactive framework that permits the creation, combination and utilization of components that are specific to mathematical training in high schools. The main framework's objectives are:
• authoring lessons by the teacher or the students; all they need are simple operating skills for Equation Editor (or something similar, or LaTeX); the rest are just drag & drop operations, inserting data into a grid, or navigating through menus;
• allowing sonorous presentations of mathematical texts and solving hints (more easily understood by the students);
• offering graphical representations of a mathematical function edited in Equation;
• storing learning objects in a database;
• storing predefined lessons (efficient for expressions and commands, the rest being calculations; this allows a high compression);
• viewing and/or modifying predefined lessons, according to the curricula.
The whole thing is focused on a mathematical expression minicompiler, storing the code that will later be used for different purposes (tables, graphics, and optimisations). Programming technologies used: a Visual C# .NET implementation is proposed. New and innovative digital learning objects for mathematics will be developed; they are capable of interpreting, contextualizing and reacting depending on the architecture where they are assembled.

Keywords: Adaptor, automatic assembly learning component and user control.

257 Stability Optimization of Functionally Graded Pipes Conveying Fluid

Authors: Karam Y. Maalawi, Hanan E.M EL-Sayed

Abstract:

This paper presents an exact analytical model for optimizing the stability of thin-walled, composite, functionally graded pipes conveying fluid. The critical flow velocity at which divergence occurs is maximized for a specified total structural mass in order to ensure the economic feasibility of the attained optimum designs. The composition of the material of construction is optimized by defining the spatial distribution of volume fractions of the material constituents using piecewise variations along the pipe length. The major aim is to tailor the material distribution in the axial direction so as to avoid the occurrence of divergence instability without the penalty of increased structural mass. Three types of boundary conditions have been examined, namely Hinged-Hinged, Clamped-Hinged and Clamped-Clamped pipelines. The resulting optimization problem has been formulated as a nonlinear mathematical programming problem solved by invoking the MATLAB optimization toolbox routines, which implement the constrained function minimization routine "fmincon" interacting with the associated eigenvalue problem routines. In fact, the proposed mathematical models have succeeded in maximizing the critical flow velocity without a mass penalty, producing efficient and economic designs with enhanced stability characteristics as compared with the baseline designs.
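
A schematic transcription of the constrained formulation above into code, under loudly stated assumptions: SciPy's SLSQP stands in for MATLAB's fmincon, the `critical_velocity` function is a mere placeholder for the fluid-structure eigenvalue analysis, and the two-constituent mass model and four-segment volume-fraction layout are illustrative only.

```python
import numpy as np
from scipy.optimize import minimize

def critical_velocity(v):
    # Placeholder for the divergence-speed eigenvalue analysis of the graded pipe;
    # in the actual model this value comes from solving the fluid-structure eigenproblem.
    return 10.0 + 4.0 * np.mean(v) - 2.0 * np.var(v)

def structural_mass(v, rho=(2700.0, 7800.0), segment_volume=1.0):
    # Mass of a pipe whose wall mixes two constituents with volume fractions v and 1 - v per segment.
    v = np.asarray(v)
    return segment_volume * np.sum(rho[0] * v + rho[1] * (1.0 - v))

target_mass = structural_mass(np.full(4, 0.5))            # hold total mass at the baseline value

res = minimize(
    lambda v: -critical_velocity(v),                      # maximize V_cr  <=>  minimize -V_cr
    x0=np.full(4, 0.5),
    method="SLSQP",
    bounds=[(0.0, 1.0)] * 4,                              # volume fractions stay physical
    constraints=[{"type": "eq", "fun": lambda v: structural_mass(v) - target_mass}],
)
print(res.x, -res.fun)                                    # optimized fractions and critical velocity
```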

Keywords: Functionally graded materials, pipe flow, optimum design, fluid-structure interaction.

256 Thermo-Mechanical Approach to Evaluate Softening Behavior of Polystyrene: Validation and Modeling

Authors: Salah Al-Enezi, Rashed Al-Zufairi, Naseer Ahmad

Abstract:

A thermo-mechanical technique was developed to determine the softening point temperature/glass transition temperature (Tg) of polystyrene exposed to high pressures. The design utilizes the ability of carbon dioxide to lower the glass transition temperature of polymers and to act as a plasticizer. In this apparatus, the sorption of carbon dioxide to induce softening of polymers as a function of temperature/pressure is performed, and the extent of softening is measured in three-point flexural bending mode. The polymer strip was placed in the cell in contact with a linear variable differential transformer (LVDT). CO2 was pumped into the cell from a supply cylinder to reach high pressure. The results clearly showed the full softening point of the samples, accompanied by a large deformation of the polymer strip. The deflection curves are initially relatively flat and then undergo a dramatic increase as the temperature is elevated. It was found that increasing the pressure of CO2 causes the temperature curves to shift lower by about 45 K over the pressure range of 0-120 bar. The obtained experimental Tg values were validated against the values reported in the literature. Finally, it is concluded that the deflection model fits the generated experimental results consistently; the model attempts to describe in more detail how the central deflection of a thin polymer strip is affected by CO2 diffusion in the polymeric samples.

Keywords: Softening, high-pressure, polystyrene, CO2 diffusions.

255 Enhanced GA-Fuzzy OPF under both Normal and Contingent Operation States

Authors: Ashish Saini, A.K. Saxena

Abstract:

Genetic algorithm (GA) based solution techniques are found suitable for optimization because of their ability to perform a simultaneous multidimensional search. Many GA variants have been tried in the past to solve optimal power flow (OPF), one of the nonlinear problems of the electric power system. Issues such as the convergence speed and accuracy of the optimal solution obtained after a number of generations using GA techniques, and the handling of system constraints in OPF, are subjects of discussion. The results obtained for GA-Fuzzy OPF on various power systems have shown faster convergence and lower generation costs as compared to other approaches. This paper presents an enhanced GA-Fuzzy OPF (EGA-OPF) using penalty factors to handle line flow constraints and load bus voltage limits, for both the normal network and a contingency case with congestion. In addition to a crossover and mutation rate adaptation scheme that adapts the crossover and mutation probabilities for each generation based on the fitness values of previous generations, a block swap operator is also incorporated in the proposed EGA-OPF. The line flow limits and load bus voltage magnitude limits are handled by incorporating line overflow and load voltage penalty factors, respectively, in each chromosome's fitness function. The effects of different penalty factor settings are also analyzed under the contingent state.
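
A compact sketch of how line-overflow and voltage-limit penalty factors of the kind described above can enter a chromosome's fitness evaluation; the quadratic penalty form, the limits and the weights are illustrative placeholders rather than the paper's tuned values.

```python
def fitness(generation_cost, line_flows, flow_limits, bus_voltages,
            v_min=0.95, v_max=1.05, k_flow=1e4, k_volt=1e4):
    """Penalized fitness: lower is better (generation cost plus constraint-violation penalties)."""
    flow_penalty = sum(max(0.0, abs(f) - lim) ** 2
                       for f, lim in zip(line_flows, flow_limits))
    volt_penalty = sum(max(0.0, v_min - v) ** 2 + max(0.0, v - v_max) ** 2
                       for v in bus_voltages)
    return generation_cost + k_flow * flow_penalty + k_volt * volt_penalty

# Usage: the GA minimizes this value, so infeasible chromosomes are pushed out by the penalties.
print(fitness(12_500.0, [0.9, 1.2], [1.0, 1.0], [1.02, 0.93]))
```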

Keywords: Contingent operation state, Fuzzy rule base, Genetic Algorithms, Optimal Power Flow.

254 Design and Development of Optical Sensor Based Ground Reaction Force Measurement Platform for GAIT and Geriatric Studies

Authors: K. Chethana, A. S. Guru Prasad, S. N. Omkar, B. Vadiraj, S. Asokan

Abstract:

This paper describes the ab initio design, development and calibration results of an Optical Sensor Ground Reaction Force Measurement Platform (OSGRFP) for gait and geriatric studies. The developed system employs an array of FBG sensors to measure the respective ground reaction forces along all three mutually perpendicular axes (X, Y and Z). The novelty of this work is twofold. One aspect is its ability to resolve the triaxial resultant forces during stance into the respective pure axis loads, and the other is the applicability of inherently advantageous FBG sensors, which are most suitable for biomechanical instrumentation. To validate the response of the FBG sensors installed in the OSGRFP and to measure the cross-sensitivity to forces applied in other directions, load sensors with indicators are used. Further, relevant mathematical formulations are presented for extracting the respective ground reaction forces from the wavelength shifts/strains of the FBG sensors on the OSGRFP. The results of this device have implications for understanding foot function, identifying issues in the gait cycle and measuring discrepancies between the left and right foot. The device also provides a method to quantify and compare the relative postural stability of different subjects under test, which has implications in post-surgical rehabilitation, geriatrics and optimizing training protocols for sports personnel.
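
For context, the textbook fibre Bragg grating relations that underlie the wavelength-shift-to-strain extraction mentioned above are given below; the platform's own calibration constants and the exact formulation used by the authors are not reproduced here.

```latex
\lambda_B = 2\,n_{\mathrm{eff}}\,\Lambda, \qquad
\frac{\Delta\lambda_B}{\lambda_B} \;=\; (1 - p_e)\,\varepsilon \;+\; (\alpha_\Lambda + \alpha_n)\,\Delta T
```

Here λ_B is the Bragg wavelength, n_eff the effective refractive index, Λ the grating period, p_e the effective photo-elastic coefficient (about 0.22 for silica fibre), ε the axial strain and ΔT the temperature change.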

Keywords: Balance, stability, Gait analysis, FBG applications, optical sensor ground reaction force platform.

253 Technological Advancement in Fashion Online Retailing: A Comparative Study of Pakistan and UK Fashion E-Commerce

Authors: Sadia Idrees, Gianpaolo Vignali, Simeon Gill

Abstract:

The study aims to establish the virtual size and fit technology features needed to enhance fashion online retailing platforms, utilising digital human measurements to provide customised style and function to consumers. A few firms in the UK have launched advanced interactive fashion shopping domains for personalised shopping globally, aided by the latest internet technology. Virtual size and fit interfaces have great potential to provide a personalised, better-fitting garment and to promote mass customisation globally. Made-to-measure clothing, consuming unstitched fabric, is a common practice offered by fashion brands in Pakistan. This product is regarded by consumers in Pakistan as economical and sustainable. Although a manual sizing system is used to sell garments online, virtual size and fit visualisation and recommendation technologies are uncommon in Pakistani fashion interfaces. A comparative assessment of Pakistani fashion brand websites and UK technology-driven fashion interfaces was conducted to highlight the vast potential of virtual size and fit technology. The results indicated that the web 2.0 technology adopted by Pakistani apparel brands has limited features, whereas companies practicing web 3.0 technology provide an interactive online real-store shopping experience, leading to enhanced customer satisfaction and globalisation of brands.

Keywords: E-commerce, mass customization, virtual size and fit, web 3.0 technology.

252 Impact of Some Experimental Procedures on Behavioral Patterns and Physiological Traits of Rats

Authors: Amira, A. Goma, U. E. Mahrous

Abstract:

Welfare may be considered a subjective experience; it has a biological function that is related to the fitness and survival of the animal. Accordingly, research has suggested that welfare is compromised when the animal's evolutionary fitness is reduced. This study was carried out to explain the effect of some managerial stressors, such as handling and restraint, on the behavioral patterns and biochemical parameters of rats. A total of 24 (12 male & 12 female) Sprague-Dawley rats (12 months & 150-180 g) were allotted into 3 groups: a handled group (4 males & 4 females), a restrained group (4 males & 4 females) and a control group (4 males & 4 females). The obtained results revealed that time spent feeding and drinking and the frequencies of movement and cage exploration increased significantly in handled rats compared with the other groups, while lying time and licking increased significantly in restrained rats compared with handled and control rats. Moreover, social behavior decreased in both stressed groups compared with the control. Triglycerides were significantly increased in handled rats compared with the other groups, while total lipid, total protein and globulin significantly increased in both treated groups compared with the control. Corticosterone increased in restrained and handled rats compared with the control. Moreover, packed cell volume increased significantly in restrained rats compared with the others. It can be deduced that if we want to study the effect of stress on animal welfare, it is necessary to study the effect of such stressors on the animal's behavior and physiological responses.

Keywords: Behavior, handling, restraint, rat, welfare.

251 Microstructure and Corrosion Behavior of Laser Welded Magnesium Alloys with Silver Nanoparticles

Authors: M. Ishak, K. Yamasaki, K. Maekawa

Abstract:

Magnesium alloys have gained increased attention in recent years in the automotive, electronics, and medical industries. This is because magnesium alloys have better properties than aluminum alloys and steels in respect of their low density and high strength-to-weight ratio. However, the main problems in magnesium alloy welding are crack formation and the appearance of porosity during solidification. This paper proposes a unique technique to weld two thin sheets of AZ31B magnesium alloy using a paste containing Ag nanoparticles. The paste, containing Ag nanoparticles of 5 nm average diameter and an organic solvent, was used to coat the surface of an AZ31B thin sheet. The coated sheet was heated at 100 °C for 60 s to evaporate the solvent. The dried sheet was set as the lower AZ31B sheet on the jig, and then lap fillet welding was carried out using a pulsed Nd:YAG laser in a closed box filled with argon gas. The characteristics of the microstructure and the corrosion behavior of the joints were analyzed by optical microscopy (OM), energy dispersive spectrometry (EDS), electron probe micro-analysis (EPMA), scanning electron microscopy (SEM), and an immersion corrosion test. The experimental results show that the wrought AZ31B magnesium alloy can be joined successfully using Ag nanoparticles. The Ag nanoparticle insert promotes grain refinement, a narrower HAZ width and a wider bond width compared to welds without an insert. The corrosion rate of AZ31B welded with Ag nanoparticles was reduced by up to 44% compared to the base metal. The improvement in the corrosion resistance of AZ31B welded with Ag nanoparticles is due to finer grains and a larger grain boundary area with high Al content. The β-phase Mg17Al12 could serve as an effective barrier and suppress further propagation of corrosion. Furthermore, the Ag distribution in the fusion zone provides much finer grains and may stabilize the magnesium solid solution, making it less soluble or less anodic in aqueous environments.

Keywords: Laser welding, magnesium alloys, nanoparticles, mechanical property

250 Rapid Finite-Element Based Airport Pavement Moduli Solutions using Neural Networks

Authors: Kasthurirangan Gopalakrishnan, Marshall R. Thompson, Anshu Manik

Abstract:

This paper describes the use of artificial neural networks (ANN) for predicting the non-linear layer moduli of flexible airfield pavements subjected to new generation aircraft (NGA) loading, based on the deflection profiles obtained from Heavy Weight Deflectometer (HWD) test data. The HWD test is one of the most widely used tests for routinely assessing the structural integrity of airport pavements in a non-destructive manner. The elastic moduli of the individual pavement layers backcalculated from the HWD deflection profiles are effective indicators of layer condition and are used for estimating the pavement's remaining life. HWD tests were periodically conducted at the Federal Aviation Administration's (FAA's) National Airport Pavement Test Facility (NAPTF) to monitor the effect of Boeing 777 (B777) and Boeing 747 (B747) test gear trafficking on the structural condition of flexible pavement sections. In this study, a multi-layer, feed-forward network which uses an error-backpropagation algorithm was trained to approximate the HWD backcalculation function. A synthetic database generated using an advanced non-linear pavement finite-element program was used to train the ANN, in order to overcome the limitations associated with conventional pavement moduli backcalculation. The changes in the ANN-based backcalculated pavement moduli with trafficking were used to compare the relative severity effects of the aircraft landing gears on the NAPTF test pavements.

Keywords: Airfield pavements, ANN, backcalculation, new generation aircraft.

249 Investigation on the Physical Conditions of Façade Systems of Campus Buildings by Infrared Thermography Tests

Authors: N. Türkmenoğlu Bayraktar, E. Kishalı

Abstract:

Campus buildings are educational facilities in which considerable amounts of energy are consumed for lighting, heating, cooling and ventilation. Some of the new universities in Turkey, where this investigation takes place, still carry out their educational activities in existing buildings primarily designed for different architectural programs and converted to campus buildings via changes of function, space organization and structural interventions, but most of the time without consideration of appropriate microclimatic conditions. Reducing energy consumption in these structures not only contributes to the national economy but also mitigates negative effects on the environment. Furthermore, optimum thermal comfort conditions should be provided during the refurbishment of existing campus structures and their building envelopes. Considering this issue, the first step is to investigate the climatic performance of building elements with regard to the refurbishment process. In the context of the study, the Kocaeli University Faculty of Design and Architecture building, constructed in the 1980s on the Anıtpark campus located in the central part of Kocaeli, Turkey, was investigated. Climatic factors influencing thermal conditions, deteriorations of the building envelope, the temperature distribution and heat losses from façade elements observed by thermography were presented in order to improve strategies for the retrofit process of the building envelope. Within the scope of the survey, refurbishment strategies aimed at providing optimum climatic comfort conditions and increasing the energy efficiency of the building envelope were proposed.

Keywords: Building envelope, IRT, refurbishment, non-destructive test.

248 Influence of Infrared Radiation on the Growth Rate of Microalgae Chlorella sorokiniana

Authors: Natalia Politaeva, Iuliia Smiatskaia, Iuliia Bazarnova, Iryna Atamaniuk, Kerstin Kuchta

Abstract:

Nowadays, the progressive decrease of primary natural resources and the ongoing upward trend in energy demand have resulted in the development of new-generation technological processes which are focused on step-wise production and the utilization of residues. Thus, the microalgae-based third-generation bioeconomy is considered one of the most promising approaches, allowing the production of value-added products and sophisticated utilization of residual biomass. In comparison to conventional biomass, microalgae can be cultivated under a wide range of conditions without compromising food and feed production, and thus address issues associated with negative social and environmental impacts. However, one of the most challenging tasks is to cope with seasonal variations and to achieve optimal growing conditions for indoor closed systems that can cover further demand for the material and energetic utilization of microalgae. For instance, outdoor cultivation in St. Petersburg (Russia) is only suitable within a rather narrow time frame (from mid-May to mid-September). At earlier and later periods, insufficient sunlight and heat for the growth of microalgae were detected. On the other hand, without additional physical effects, the biomass increment in summer is 3-5 times per week, depending on the solar radiation and the ambient temperature. In order to increase biomass production, scientists from all over the world have proposed various technical solutions for cultivators and have been studying the influence of various physical factors affecting biomass growth, namely magnetic fields, radiation, electric fields, etc. In this paper, the influence of infrared (IR) radiation and fluorescent light on the growth rate of the microalga Chlorella sorokiniana has been studied. The cultivation of Chlorella sorokiniana was carried out in 500 mL cylindrical glass vessels, which were constantly aerated. To accelerate the cultivation process, the mixture was stirred for 15 minutes at 500 rpm following 120 minutes of rest time. At the same time, the metabolic needs for nutrients were provided by the addition of micro- and macro-nutrients to the microalgae growing medium. Lighting was provided by fluorescent lamps with an intensity of 2500 ± 300 lx. The influence of IR was determined using IR lamps with a voltage of 220 V and a power of 250 W, in order to achieve an intensity of 13,600 ± 500 lx. The obtained results show that under the influence of fluorescent lamps, together with the combined effect of active aeration and variable mixing, the biomass increment on the 2nd day was threefold, and on the 7th day it was eightfold. The growth rate of the microalgae under the influence of IR radiation was lower and reached 22.6·10⁶ cells·mL⁻¹. However, the application of IR lamps for biomass growth allows maintaining the optimal temperature of the microalgae suspension at approximately 25-28 °C, which might be especially beneficial during the cold season in extreme climate zones.

Keywords: Biomass, fluorescent lamp, infrared radiation, microalgae.

247 The “Ecological Approach” to GIS Implementation in Low Income Countries and the Role of Universities: Union of Municipalities of Joumeh Case Study

Authors: A. Iaaly, O. Jadayel, R. Jadayel

Abstract:

This paper explores the effectiveness of approaches used for the implementation of technology, specifically Geographic Information Systems (GIS), within central governments. It examines the extent to which the various strategies for GIS implementation and its roll-out to users within an organization are crucial for its long-term assimilation. Depending on the contextual requirements, various implementation strategies exist, spanning from the most revolutionary to the most evolutionary, which have an influence on the success of GIS projects and the realization of the resulting business benefits within central governments. This research compares two strategies of GIS implementation within the Lebanese municipalities. The first strategy is the “Technological Approach”, which is focused on technology acquisition, overlaid on existing governmental frameworks. This approach gives minimal attention to capability building and the long-term sustainability of the implemented program. The second strategy, referred to as the “Ecological Approach”, is naturally oriented to the function of the organization. This approach stresses fostering the evolution of the program and building human capabilities. The Union of Joumeh Municipalities is presented as a case study under the “Ecological Approach”, and the role of the GIS Center at the University of Balamand is highlighted. Thus, this research contributes to the development of knowledge on technology implementation and the vital role of academia in the specific context of the Lebanese public sector, so that this experience may pave the way for further applications.

Keywords: Ecological Approach, GIS, low income countries, technological approach.

246 A Static Android Malware Detection Based on Actual Used Permissions Combination and API Calls

Authors: Xiaoqing Wang, Junfeng Wang, Xiaolan Zhu

Abstract:

The Android operating system has been recognized by most application developers because of its open-source nature and compatibility, which greatly enrich the categories of applications. However, it has become the target of malware attackers due to the lack of strict security supervision mechanisms, which leads to the rapid growth of malware and thus brings serious safety hazards to users. Therefore, it is critical to detect Android malware effectively. Generally, the permissions declared in AndroidManifest.xml can reflect the function and behavior of an application to a large extent. Since the current Android system places no restrictions on the number of permissions that an application can request, developers tend to request more permissions than are actually needed in order to ensure the successful running of the application, which results in the abuse of permissions. However, some traditional detection methods only consider the requested permissions and ignore whether they are actually used, which leads to incorrect identification of some malware. Therefore, a machine learning detection method based on the actually used permission combinations and API calls is put forward in this paper. Several experiments are conducted to evaluate our methodology. The results show that it can detect unknown malware effectively with a higher true positive rate and accuracy while maintaining a low false positive rate. The AdaboostM1 (J48) classification algorithm based on the information gain feature selection algorithm gives the best detection result, achieving an accuracy of 99.8%, a true positive rate of 99.6% and the lowest false positive rate of 0.
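
A hedged scikit-learn analogue of the reported pipeline: mutual-information (information-gain style) feature selection over binary permission/API-call features, followed by AdaBoost over decision trees in place of WEKA's AdaboostM1 (J48). The feature matrix and labels below are synthetic placeholders, so the score they produce is meaningless; only the pipeline shape mirrors the description above.

```python
import numpy as np
from sklearn.feature_selection import SelectKBest, mutual_info_classif
from sklearn.ensemble import AdaBoostClassifier
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import cross_val_score

# Synthetic binary feature matrix: columns stand for actually used permissions and API calls.
rng = np.random.default_rng(0)
X = rng.integers(0, 2, size=(200, 50)).astype(float)
y = rng.integers(0, 2, size=200)                          # 1 = malware, 0 = benign (placeholder labels)

model = make_pipeline(
    SelectKBest(mutual_info_classif, k=20),               # information-gain-style feature selection
    AdaBoostClassifier(n_estimators=100, random_state=0), # boosted shallow decision trees
)
print(cross_val_score(model, X, y, cv=5).mean())          # cross-validated accuracy on the toy data
```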

Keywords: Android, permissions combination, API calls, machine learning.
