Search results for: Enhanced buffer
74 Designing a Fuzzy Logic Controller to Enhance Directional Stability of Vehicles under Difficult Maneuvers
Authors: Mehrdad N. Khajavi, Golamhassan Paygane, Ali Hakima
Abstract:
Vehicles that are turning or maneuvering at high speed are susceptible to sliding and subsequently deviating from the desired path. In this paper, the dynamics governing the yaw/roll behavior of a vehicle have been simulated. Two different simulations have been used: one for the real vehicle, for which a fuzzy controller is designed to increase its directional stability, and the other for a hypothetical vehicle with much higher tire cornering stiffness, which is capable of developing the lateral forces at the tire-ground contact patch required to attain the lateral acceleration needed for the vehicle to follow the desired path without slippage. This second simulation serves as our reference model. The logic for keeping the vehicle on the desired track while cornering or maneuvering is to apply braking forces to the inner or outer tires according to the direction of the vehicle's deviation from the desired path. The inputs to the vehicle simulation model are the steer angle δ and the vehicle velocity V, and the outputs can be any kinematic parameters such as yaw rate, yaw acceleration, side slip angle, and rate of side slip angle. The proposed fuzzy controller is a feedforward controller with two inputs, the steer angle δ and the vehicle velocity V, and one output, the correcting moment M, which guides the vehicle back to the desired track. To develop the membership functions for the controller inputs and output and the fuzzy rules, the vehicle simulation was run 1000 times and the correcting moment was determined by trial and error. Results of the vehicle simulation with the fuzzy controller are very promising and show that vehicle performance is greatly enhanced over the vehicle without the controller; in fact, performance with the controller is very close to that of the ideal reference model.
Keywords: Vehicle, directional stability, fuzzy logic controller, ANFIS.
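A minimal sketch of the kind of two-input fuzzy controller this abstract describes can be written as a zero-order Sugeno system: fuzzify steer angle and speed, fire a small rule base, and defuzzify by weighted average. The membership shapes, input universes, and moment singletons below are hypothetical illustrations only, not the paper's calibrated values (which were tuned over 1000 simulation runs).

```python
# Hedged sketch of a two-input fuzzy controller: inputs are steer angle
# (deg) and vehicle speed (m/s), output is a correcting moment (N*m).
# All numeric ranges and rule outputs are invented for illustration.

def ramp_up(x, lo, hi):
    """Membership rising linearly from 0 at lo to 1 at hi."""
    return min(1.0, max(0.0, (x - lo) / (hi - lo)))

def correcting_moment(steer_deg, speed_ms):
    # Fuzzify: complementary "small/large" steer and "low/high" speed ramps
    steer_large = ramp_up(steer_deg, 0.0, 10.0)
    steer_small = 1.0 - steer_large
    speed_high = ramp_up(speed_ms, 0.0, 40.0)
    speed_low = 1.0 - speed_high

    # Zero-order Sugeno rules: (firing strength, singleton moment)
    rules = [
        (steer_small * speed_low, 0.0),      # gentle input: no correction
        (steer_small * speed_high, 500.0),
        (steer_large * speed_low, 1000.0),
        (steer_large * speed_high, 3000.0),  # hard maneuver: strong correction
    ]
    num = sum(w * z for w, z in rules)
    den = sum(w for w, _ in rules)
    return num / den if den else 0.0

print(correcting_moment(0.0, 0.0), correcting_moment(10.0, 40.0))  # 0.0 3000.0
```

The output grows monotonically with both steer angle and speed, mirroring the idea that harder maneuvers at higher speeds demand a larger corrective moment.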
73 Training Undergraduate Engineering Students in Robotics and Automation through Model-Based Design Training: A Case Study at Assumption University of Thailand
Authors: Sajed A. Habib
Abstract:
Problem-based learning (PBL) is a student-centered pedagogy that originated in the medical field and has also been used extensively in other knowledge disciplines, with recognized advantages and limitations. PBL has been used in various undergraduate engineering programs with mixed outcomes. The current fourth industrial revolution (the digital era, or Industry 4.0) has made it essential for many science and engineering students to receive effective training in advanced courses such as industrial automation and robotics. This paper presents a case study at Assumption University of Thailand, where a PBL-like approach was used to teach some aspects of automation and robotics to selected groups of undergraduate engineering students. These students were given basic-level training in automation prior to participating in a subsequent training session in which they solved technical problems of increasing complexity. The participating students' evaluation of the training sessions in terms of learning effectiveness, skills enhancement, and incremental knowledge following the problem-solving session was captured through a follow-up survey consisting of 14 questions with a 5-point scoring system. From the most recent training event, overall 70% of the respondents indicated that their skill levels were enhanced to a much greater level than they had before the training, whereas 60.4% of the respondents from the same event indicated that their incremental knowledge following the session was much greater than what they had prior to the training. The instructor-facilitator involved in the training events suggested that this method of learning was more suitable for senior/advanced-level students than for those at the freshman level, as certain skills needed to participate effectively in such problem-solving sessions are acquired over a period of time, not instantly.
Keywords: Automation, industry 4.0, model-based design training, problem-based learning.
72 Biocontrol Effectiveness of Indigenous Trichoderma Species against Meloidogyne javanica and Fusarium oxysporum f. sp. radicis lycopersici on Tomato
Authors: Hajji Lobna, Chattaoui Mayssa, Regaieg Hajer, M'Hamdi-Boughalleb Naima, Rhouma Ali, Horrigue-Raouani Najet
Abstract:
In this study, three local isolates of Trichoderma (Tr1: T. viride, Tr2: T. harzianum, and Tr3: T. asperellum) were isolated and evaluated for their biocontrol effectiveness under in vitro conditions and in the greenhouse. The in vitro bioassay revealed biocontrol potential against Fusarium oxysporum f. sp. radicis lycopersici (FORL) and Meloidogyne javanica (root-knot nematode, RKN) separately. All Trichoderma species exhibited biocontrol performance, and T. viride (Tr1) was the most efficient: the growth-rate inhibition of FORL reached 75.5% with Tr1, and the parasitism rate of the root-knot nematode was 60% for juveniles and 75% for eggs with the same isolate. Pot experiment results showed that Tr1 and Tr2, compared to the chemical treatment, enhanced plant growth and exhibited better antagonism against the root-knot nematode and root-rot fungus, separately or combined. All Trichoderma isolates revealed a bioprotection potential against FORL. When the pathogenic fungus was inoculated alone, the Fusarium wilt index and vascular browning rate were reduced significantly with Tr1 (0.91, 2.38%) and Tr2 (1.5, 5.5%), respectively. In the case of combined infection with Fusarium and the nematode, the same isolates Tr1 and Tr2 decreased the Fusarium wilt index to 1.1 and 0.83 and reduced the vascular browning rate to 6.5% and 6%, respectively. Similarly, the isolates Tr1 and Tr2 caused maximum inhibition of nematode multiplication; the multiplication rate declined to 4% with both isolates, whether tomato was infected by the nematode alone or concomitantly with Fusarium. The chemical treatment was moderate in activity against Meloidogyne javanica and FORL, alone and combined.
Keywords: Trichoderma spp., Meloidogyne javanica, Fusarium oxysporum f. sp. radicis lycopersici, biocontrol.
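A growth-rate inhibition figure such as the 75.5% reported for Tr1 against FORL is conventionally computed from a dual-culture assay as (control growth − growth with antagonist) / control growth × 100. The colony diameters below are made-up example values chosen to reproduce that percentage, not the study's actual measurements.

```python
# Hypothetical illustration of a dual-culture growth-inhibition calculation.
# The diameters are invented example data, not the study's observations.

def inhibition_percent(control_diameter_mm, treated_diameter_mm):
    return (control_diameter_mm - treated_diameter_mm) / control_diameter_mm * 100.0

# Example: pathogen colony reaches 90 mm alone but only 22.05 mm when
# confronted with the Trichoderma isolate.
print(round(inhibition_percent(90.0, 22.05), 1))  # -> 75.5
```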
71 Experimental and Numerical Study on the Effects of Oxygen Methane Flames with Water Dilution for Different Pressures
Authors: J. P. Chica Cano, G. Cabot, S. de Persis, F. Foucher
Abstract:
Among all possibilities to combat global warming, CO2 capture and sequestration (CCS) is presented as a promising alternative for reducing greenhouse gas (GHG) emissions. Several strategies for CCS from industrial and power plants are being considered, and the concept of combined oxy-fuel combustion has been among the most attractive solutions. Nevertheless, due to the high cost of pure O2 production, additional approaches have recently emerged. In this paper, an innovative combustion process for a gas turbine cycle was studied: it was composed of methane combustion with oxygen-enhanced air (OEA), exhaust gas recirculation (EGR), and H2O issuing from STIG (Steam Injection Gas Turbine), with CO2 capture realized by a membrane separator. To characterize this combustion process, the influence of H2O dilution on the combustion parameters had to be studied by experimental and numerical approaches. Consequently, laminar burning velocity measurements were performed in a stainless steel spherical combustion chamber from atmospheric pressure up to 0.5 MPa, at 473 K and an equivalence ratio of 1. These experimental results were satisfactorily compared with calculations using the Chemical Workbench v4.1 package in conjunction with the GRI-Mech 3.0 reaction mechanism. The good agreement obtained between experimental and calculated flame speeds shows the validity of the GRI-Mech 3.0 mechanism in this combustion domain: high H2O dilution, low N2, medium pressure. Finally, good estimations of flame speed and pollutant emissions were determined in other conditions compatible with a real gas turbine. In particular, mixtures of CH4/O2/N2/H2O or CO2 leading to the same adiabatic temperature were investigated, and the influences of oxygen enrichment and of H2O dilution (compared to CO2) were discussed.
Keywords: CO2 capture, oxygen enrichment, water dilution, laminar burning velocity, pollutant emissions.
70 Kinetic Rate Comparison of Methane Catalytic Combustion of Palladium Catalysts Impregnated onto γ-Alumina and Bio-Char
Authors: Noor S. Nasri, Eric C. A. Tatt, Usman D. Hamza, Jibril Mohammed, Husna M. Zain
Abstract:
Catalytic combustion of methane is imperative due to the stability of methane at low temperatures. Methane (CH4) therefore remains unconverted in vehicle exhausts, causing a greenhouse gas (GHG) emission problem. In this study, heterogeneous catalysts of palladium on bio-char (2 wt% Pd/Bc) and on Al2O3 (2 wt% Pd/Al2O3) were prepared by incipient wetness impregnation and subsequently tested for catalytic combustion of CH4. Porous support materials for heterogeneous catalytic combustion (HCC) were selected based on factors such as surface area, porosity, thermal stability, thermal conductivity, reactivity with reactants or products, chemical stability, catalytic activity, and catalyst life. A sustainable and renewable support material, biomass char derived from palm shell waste, was compared with conventional porous support materials. The kinetic rate of reaction was determined for the combustion of methane on palladium (Pd)-based catalysts with Al2O3 and bio-char (Bc) supports. Material characterization was done using TGA, SEM, and BET surface area analysis. The performance test was accomplished using a tubular quartz reactor with a gas mixture of 3% methane and 97% air, with methane conversion monitored by an online gas analyzer connected to the reactor. The BET surface area of the prepared 2 wt% Pd/Bc is smaller than that of the prepared 2 wt% Pd/Al2O3 due to its low interparticle porosity. The order of catalyst activity, based on the kinetic rate of reaction at low temperature, was 2 wt% Pd/Bc > calcined 2 wt% Pd/Al2O3 > 2 wt% Pd/Al2O3 > calcined 2 wt% Pd/Bc. Hence, agro-waste material can successfully be utilized as an inexpensive catalyst support for enhanced CH4 catalytic combustion.
Keywords: Catalytic-combustion, Environmental, Support-bio-char material, Sustainable, Renewable material.
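The kinetic-rate comparison this abstract describes rests on the Arrhenius form k = A·exp(−Ea/RT): a catalyst with a lower apparent activation energy gives a higher rate constant at low temperature. The pre-exponential factor and activation energies below are invented illustrative numbers, not the fitted parameters of this study.

```python
import math

# Hedged sketch of an Arrhenius rate-constant comparison for two catalysts.
# A and Ea values are hypothetical, chosen only to illustrate the trend
# that a lower apparent Ea yields higher low-temperature activity.

R = 8.314  # gas constant, J/(mol*K)

def rate_constant(A, Ea_J_mol, T_K):
    return A * math.exp(-Ea_J_mol / (R * T_K))

# Assume (hypothetically) the bio-char-supported catalyst has the lower Ea.
k_pd_bc = rate_constant(A=1.0e6, Ea_J_mol=80_000, T_K=600.0)
k_pd_al = rate_constant(A=1.0e6, Ea_J_mol=95_000, T_K=600.0)
print(k_pd_bc > k_pd_al)  # True: lower Ea -> higher low-temperature rate
```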
69 Comparative Study of Calcium Content on in vitro Biological and Antibacterial Properties of Silicon-Based Bioglass
Authors: Morteza Elsa, Amirhossein Moghanian
Abstract:
The major aim of this study was to evaluate the effect of CaO content on in vitro hydroxyapatite formation, MC3T3 cell cytotoxicity and proliferation, and the antibacterial efficiency of the sol-gel derived SiO2–CaO–P2O5 ternary system. For this purpose, two grades of bioactive glass (BG), BG-58s (mol%: 60% SiO2–36% CaO–4% P2O5) and BG-68s (mol%: 70% SiO2–26% CaO–4% P2O5), were first synthesized by the sol-gel method. Second, the effect of the CaO content in their composition on in vitro bioactivity was investigated by soaking the BG-58s and BG-68s powders in simulated body fluid (SBF) for periods of up to 14 days, followed by characterization with inductively coupled plasma atomic emission spectrometry (ICP-AES), Fourier transform infrared spectroscopy (FTIR), X-ray diffraction (XRD), and scanning electron microscopy (SEM). Additionally, live/dead staining, 3-(4,5-dimethylthiazol-2-yl)-2,5-diphenyltetrazolium bromide (MTT), and alkaline phosphatase (ALP) activity assays were conducted to qualitatively and quantitatively assess the viability, proliferation, and differentiation of MC3T3 cells in the presence of the 58s and 68s BGs. Results showed that BG-58s, with its higher CaO content, exhibited higher in vitro bioactivity than BG-68s. Moreover, the dissolution rate was inversely proportional to the oxygen density of the BG. The live/dead assay revealed that both 58s and 68s increased the mean number of live cells, in good accordance with the MTT assay. Furthermore, BG-58s showed greater antibacterial activity against methicillin-resistant Staphylococcus aureus (MRSA) bacteria. Taken together, BG-58s, with enhanced MC3T3 cell proliferation and ALP activity, acceptable bioactivity, and a significantly high antibacterial effect against MRSA, is suggested as a suitable candidate for further functionalization for the delivery of therapeutic ions and growth factors in bone tissue engineering.
Keywords: Antibacterial, bioactive glass, hydroxyapatite, proliferation, sol-gel processes.
68 Aqueous Extract of Flacourtia indica Prevents Carbon Tetrachloride Induced Hepatotoxicity in Rat
Authors: Gnanaprakash K, Madhusudhana Chetty C, Ramkanth S, Alagusundaram M, Tiruvengadarajan VS, Angala Parameswari S, Mohamed Saleem TS
Abstract:
Carbon tetrachloride (CCl4) is a well-known hepatotoxin, and exposure to this chemical is known to induce oxidative stress and cause liver injury through the formation of free radicals. Flacourtia indica, commonly known as 'Baichi', has been reported as an effective remedy for the treatment of a variety of diseases. The objective of this study was to investigate the hepatoprotective activity of the aqueous extract of the leaves of Flacourtia indica against CCl4-induced hepatotoxicity. Animals were pretreated with the aqueous extract of Flacourtia indica (250 & 500 mg/kg body weight) for one week and then challenged with CCl4 (1.5 ml/kg bw) in olive oil (1:1, v/v) on the 7th day. Serum markers of hepatic damage (ALP, AST, ALT, total protein, and total bilirubin) and the TBARS level (a marker of oxidative stress) were estimated in all study groups, and alterations in these markers were compared between the CCl4-treated and extract-treated groups. CCl4 elevated AST, ALT, ALP, and lipid peroxides (TBARS) in the liver. Treatment with the aqueous extract of Flacourtia indica leaves (250 & 500 mg/kg) exhibited a significant protective effect by altering the serum levels of AST, ALT, ALP, total protein, and total bilirubin and the liver TBARS. These biochemical observations were supported by histopathological study of liver sections. From this preliminary study, it is concluded that the aqueous extract of the leaves of Flacourtia indica protects the liver against oxidative damage and could be used as an effective protector against CCl4-induced hepatic damage. Our findings suggest that Flacourtia indica possesses good hepatoprotective activity.
Keywords: Carbon tetrachloride, Flacourtia indica, hepatoprotective activity, oxidative stress.
67 Susceptibility of Spodoptera littoralis Field Populations in Egypt to Chlorantraniliprole and the Role of Detoxification Enzymes
Authors: Mohamed H. Khalifa, Fikry I. El-Shahawi, Nabil A. Mansour
Abstract:
The cotton leafworm, Spodoptera littoralis (Boisduval), is a major insect pest of vegetables and cotton crops in Egypt and exhibits different levels of tolerance to certain insecticides. Chlorantraniliprole has recently been registered in Egypt for control of this insect. The susceptibilities of three S. littoralis populations collected from El Behaira governorate, northern Egypt, to chlorantraniliprole were determined by the leaf-dipping technique on 4th-instar larvae. Obvious variation in toxicity was observed among the laboratory susceptible strain and the three field populations, with LC50 values ranging between 1.53 µg/ml and 6.22 µg/ml; all three field populations were less susceptible to chlorantraniliprole than the laboratory susceptible population. The most tolerant population was sampled from El Delengat (ED) Province, where S. littoralis had been frequently challenged by insecticides. Enzyme activity assays were carried out to relate detoxification activity to the observed field-population tolerance. All field populations showed significantly enhanced activities of detoxification enzymes compared with the susceptible strain. Regression analysis between chlorantraniliprole toxicity and enzyme activity revealed that the highest correlation was between α-/β-esterase (α-/β-EST) activity and the susceptibility of the collected field strains, although this correlation was not significant (P > 0.05). Synergism assays showed that the ED and susceptible strains could be synergized by known detoxification inhibitors such as piperonyl butoxide (PBO), triphenyl phosphate (TPP), and diethyl maleate (DEM) at different levels (1.01–8.76-fold and 1.09–2.94-fold, respectively); TPP showed the maximum synergism in both strains. The results show that there is a correlation between enzyme activity and tolerance, and carboxylesterase (Car-EST) is likely the main detoxification mechanism responsible for the tolerance of S. littoralis to chlorantraniliprole.
Keywords: Chlorantraniliprole, detoxification enzymes, Egypt, Spodoptera littoralis.
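An LC50 such as the 1.53–6.22 µg/ml values reported above can be estimated from leaf-dip bioassay data; as a simplified stand-in for the probit regression normally used in such studies, the sketch below interpolates mortality linearly against log concentration. The dose-mortality pairs are hypothetical example data, not the study's observations.

```python
import math

# Hedged sketch: estimate LC50 by linear interpolation of percent mortality
# against log(concentration). Real analyses typically use probit regression;
# the bioassay data below are invented for illustration.

def lc50(points):
    """points: [(concentration_ug_ml, percent_mortality)], sorted by dose."""
    for (c1, m1), (c2, m2) in zip(points, points[1:]):
        if m1 <= 50.0 <= m2:
            t = (50.0 - m1) / (m2 - m1)
            return math.exp(math.log(c1) + t * (math.log(c2) - math.log(c1)))
    raise ValueError("50% mortality not bracketed by the data")

bioassay = [(1.0, 20.0), (2.0, 40.0), (4.0, 60.0), (8.0, 80.0)]
print(round(lc50(bioassay), 2))  # -> 2.83 (ug/ml)
```

A resistance ratio then follows directly as LC50(field) / LC50(susceptible), the same comparison the abstract makes between field and laboratory strains.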
66 Appropriate Technology: Revisiting the Movement in Developing Countries for Sustainability
Authors: Jayshree Patnaik, Bhaskar Bhowmick
Abstract:
The economic growth of any nation is steered by, and dependent on, innovation in technology, and it can plausibly be argued that technology has enhanced the quality of life. Technology is linked to both economic and social structures. But there are some parts of the world, and some communities, which are yet to reap the benefits of technological innovation. Businesses and organizations are now well equipped with cutting-edge innovations that improve firm performance and provide a competitive edge, but rarely do these have a positive impact on communities which are weak and marginalized. In recent times, it has been observed that communities are actively handling social and ecological issues with the help of indigenous technologies. Thus "appropriate technology", which is quite prevalent in the rural third world, enters the discussion. Appropriate technology grew as a movement in the mid-1970s during the energy crisis, but it lost its standing in the following years when people started to describe it as an inferior or dead technology. In reality, there is no technology which is inherently inferior or too sophisticated for a particular region. The relevance of appropriate technology lies in bringing technology to larger and weaker sections of the community, where those at the "bottom of the pyramid" can pay for technology if they find the price affordable. This is a theoretical paper which primarily revolves around how appropriate technology has faded and evolved again in both developed and developing countries. The paper focuses on the various concepts, the history, and the challenges faced by appropriate technology over the years. Appropriate technology follows a documented approach but lags in overall design and diffusion, and the diffusion of technology into the poorer sections of the community remains an open question to the present day.
Appropriate technology is multi-disciplinary in nature; this openness allows varied working models for different problems. It is a friendly technology that seeks to improve the lives of people in constrained environments by providing affordable and sustainable solutions, and it needs to be redefined in the era of modern technological advancement for sustainability.
Keywords: Appropriate technology, community, developing country, sustainability.
65 Carbamazepine Co-crystal Screening with Dicarboxylic Acid Co-Crystal Formers
Authors: Syarifah Abd Rahim, Fatinah Ab Rahman, Engku N. E. M. Nasir, Noor A. Ramle
Abstract:
Co-crystals are believed to improve the solubility and dissolution rate, and thus enhance the bioavailability, of poorly water-soluble drugs, particularly for the oral route of administration. Given the prevalence of poorly soluble drugs in the pharmaceutical industry, the screening of co-crystal formation using carbamazepine (CBZ) as a model drug compound with the dicarboxylic acid co-crystal formers (CCFs) fumaric acid (FA) and succinic acid (SA) in ethanol has been studied. Co-crystal formation was studied at varying mol ratios of CCF to CBZ to assess the effect of CCF concentration on co-crystal formation. Solvent evaporation, slurry, and cooling crystallization, representing solution-based co-crystal screening methods, were used. Based on differential scanning calorimetry (DSC) analysis, the melting points of CBZ-SA at the different ratios were in the range 188–189 °C; for CBZ-FA form A and CBZ-FA form B, the melting points at the different ratios were in the ranges 174–175 °C and 185–186 °C, respectively. The product crystals from the screening were also characterized using X-ray powder diffraction (XRPD). XRPD pattern profile analysis showed that the CBZ co-crystals with FA and SA were successfully formed at all ratios studied. The findings revealed that the CBZ-FA co-crystals formed in two different polymorphs: CBZ-FA form A and form B were obtained from the evaporation and slurry crystallization methods, respectively. In the cooling crystallization method, CBZ-FA form A was formed at a lower mol ratio of CCF to CBZ, and vice versa. This study shows that different methods and mol ratios during co-crystal screening can affect the outcome, such as the polymorphic form of the co-crystal produced.
Careful attention is therefore needed during screening, since co-crystal formation is currently one of the promising approaches considered in pharmaceutical research and development to improve poorly soluble drugs.
Keywords: Carbamazepine, co-crystal, co-crystal former, dicarboxylic acid.
64 Nanomaterial Based Electrochemical Sensors for Endocrine Disrupting Compounds
Authors: Gaurav Bhanjana, Ganga Ram Chaudhary, Sandeep Kumar, Neeraj Dilbaghi
Abstract:
The main sources of endocrine disrupting compounds in the ecosystem are hormones, pesticides, phthalates, flame retardants, dioxins, personal-care products, coplanar polychlorinated biphenyls (PCBs), bisphenol A, and parabens. These endocrine disrupting compounds are responsible for learning disabilities, brain development problems, deformations of the body, cancer, reproductive abnormalities in females, and decreased sperm count in males. Although the discharge of these chemical compounds into the environment cannot be stopped, their levels can be reduced through proper evaluation and detection techniques. The available techniques for determination of these endocrine disrupting compounds mainly include high performance liquid chromatography (HPLC), mass spectrometry (MS), and gas chromatography-mass spectrometry (GC-MS). These techniques are accurate and reliable but have certain limitations, such as the need for skilled personnel, long analysis times, interference, and the requirement of pretreatment steps. Moreover, these techniques are laboratory bound and require large sample amounts for analysis. In view of the above facts, new methods for detection of endocrine disrupting compounds should be devised that promise high specificity, ultra-sensitivity, cost-effectiveness, efficiency, and ease of operation. Nowadays, electrochemical sensors/biosensors modified with nanomaterials are gaining high attention among researchers. The biological element present in such a system makes the developed sensors selective towards the analyte of interest. Nanomaterials provide large surface area, high electron communication features, enhanced catalytic activity, and possibilities for chemical modification. In many cases, nanomaterials also serve as an electron mediator or electrocatalyst for some analytes.
Keywords: Sensors, endocrine disruptors, nanoparticles, electrochemical, microscopy.
63 Biogas Enhancement Using Iron Oxide Nanoparticles and Multi-Wall Carbon Nanotubes
Authors: John Justo Ambuchi, Zhaohan Zhang, Yujie Feng
Abstract:
The rapid development and use of nanotechnology have resulted in the massive application of various nanoparticles, such as iron oxide nanoparticles (IONPs) and multi-wall carbon nanotubes (MWCNTs). Thus, this study investigated the role of IONPs and MWCNTs in enhancing bioenergy recovery. Results show that IONPs at a concentration of 750 mg/L and MWCNTs at a concentration of 1500 mg/L induced faster substrate utilization and biogas production rates than the control. IONPs exhibited higher chemical oxygen demand (COD) removal efficiency than MWCNTs, while, on the contrary, MWCNT performance in biogas generation was more remarkable than that of IONPs. Furthermore, scanning electron microscopy (SEM) investigation revealed that extracellular polymeric substances (EPS) excreted from anaerobic granular sludge (AGS) interacted with the nanoparticles. This interaction created a protective barrier for the microbial consortia, hence reducing the nanoparticles' cytotoxicity. Microbial community analyses revealed a predominance of bacteria of the family Anaerolineaceae and the genus Longilinea; their role in biodegradation of the substrate could have been greatly boosted by the nanoparticles. The predominance of archaea of the genera Methanosaeta and Methanobacterium enhanced the methanation process. The presence of bacteria of the genus Geobacter was also reported; their presence might have significantly contributed to direct interspecies electron transfer in the system. Exposure of AGS to the nanoparticles promoted direct interspecies electron transfer between the anaerobic fermenting bacteria and their counterpart methanogens during the anaerobic digestion process. These results provide useful insight into the response of microorganisms to IONPs and MWCNTs in the complex natural environment.
Keywords: Anaerobic granular sludge, extracellular polymeric substances, iron oxide nanoparticles, multi-wall carbon nanotubes.
62 Influence of Organic Modifier Loading on Particle Dispersion of Biodegradable Polycaprolactone/Montmorillonite Nanocomposites
Authors: O. I. H. Dimitry, N. A. Mansour, A. L. G. Saad
Abstract:
Natural sodium montmorillonite (NaMMT, Cloisite Na+) and two organophilic montmorillonites (OMMTs), Cloisite 20A and Cloisite 15A, were used. PCL/MMT composites containing 1, 3, 5, and 10 wt% of Cloisite Na+ and PCL/OMMT nanocomposites containing 5 and 10 wt% of Cloisite 20A and Cloisite 15A were prepared via the solution intercalation technique to study the influence of organic modifier loading on particle dispersion in PCL/NaMMT composites. The thermal stabilities of the obtained composites were characterized by thermogravimetric analysis (TGA), which showed that under nitrogen flow the incorporation of 5 and 10 wt% of filler brings some decrease in PCL thermal stability in the sequence Cloisite Na+ > Cloisite 15A > Cloisite 20A, while under air flow these fillers scarcely influenced the thermo-oxidative stability of PCL, only slightly accelerating the process. The interaction between PCL and the silicate layers was studied by Fourier transform infrared (FTIR) spectroscopy, which confirmed moderate interactions between the nanometric silicate layers and PCL segments. The electrical conductivity (σ), which describes the ionic mobility of the systems, was studied as a function of temperature; σ of PCL was enhanced by increasing the modifier loading at a filler content of 5 wt%, especially at higher temperatures, in the sequence Cloisite Na+ < Cloisite 20A < Cloisite 15A, and then decreased to some extent with a further increase to 10 wt%. The activation energy Eσ, obtained from the dependence of σ on temperature using the Arrhenius equation, was found to be lowest for the nanocomposite containing 5 wt% of Cloisite 15A.
The dispersion behavior of clay in the PCL matrix was evaluated by X-ray diffraction (XRD) and scanning electron microscopy (SEM) analyses, which revealed partially intercalated structures in the PCL/NaMMT composites and semi-intercalated/semi-exfoliated structures in the PCL/OMMT nanocomposites containing 5 wt% of Cloisite 20A or Cloisite 15A.
Keywords: Polycaprolactone, organoclay, nanocomposite, montmorillonite, electrical conductivity, activation energy, exfoliation, intercalation.
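The Arrhenius extraction of Eσ mentioned in this abstract amounts to fitting ln σ against 1/T: with σ(T) = σ0·exp(−Eσ/(kB·T)), the slope of that line is −Eσ/kB. The sketch below recovers a known activation energy from synthetic data; the σ0 and Eσ values used to generate the data are arbitrary illustrations, not the paper's results.

```python
import math

# Hedged sketch: least-squares fit of ln(sigma) vs 1/T to extract the
# Arrhenius activation energy. Synthetic data with known parameters.

K_B = 8.617e-5  # Boltzmann constant, eV/K

def activation_energy(temps_K, sigmas):
    """Slope of ln(sigma) vs 1/T equals -E_sigma/k_B; return E_sigma in eV."""
    xs = [1.0 / T for T in temps_K]
    ys = [math.log(s) for s in sigmas]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return -slope * K_B

# Synthetic conductivities generated with E_sigma = 0.45 eV, sigma0 = 1e-3 S/cm.
temps = [300.0, 320.0, 340.0, 360.0, 380.0]
sigmas = [1e-3 * math.exp(-0.45 / (K_B * T)) for T in temps]
print(round(activation_energy(temps, sigmas), 3))  # -> 0.45
```

In the study, a lower fitted Eσ (as found for 5 wt% Cloisite 15A) indicates more mobile charge carriers in the clay-polymer system.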
61 The Application of Line Balancing Technique and Simulation Program to Increase Productivity in Hard Disk Drive Components
Authors: Alonggot Limcharoen, Jintana Wannarat, Vorawat Panich
Abstract:
This study aims to balance the number of operators (line balancing technique) in a production line of hard disk drive components in order to increase efficiency. At present, demand for hard disk drives has continuously declined, limiting the company's revenue potential, so it is important to improve and develop the production process in order to build market share and to compete on value and quality. An effective tool is needed to support such efforts; in this research, the Arena simulation program was applied to analyze the process both before and after the improvement, and the simulated scenario was verified before proceeding with the real process. There were 14 work stations with 35 operators altogether in the RA production process where this study was conducted. In the actual process, the average production time was 84.03 seconds per product piece (timed 30 times at each work station), together with a performance rating assessment following the Westinghouse principles, which gave a rating of 123% under an assumption of 5% allowance time. Consequently, the standard time was 108.53 seconds per piece. The takt time, calculated as the working duration in one day divided by customer demand, was 3.66 seconds per piece. From these figures, the proper number of operators was 30 people, meaning that five operators could be removed from the line. After that, a production model was created from the actual process using the Arena program; to confirm model reliability, the simulation outputs were compared with the actual process, and their agreement indicated that the model was reliable. Then, worker numbers and their job responsibilities were remodeled in the Arena program. Lastly, the efficiency of the production process was enhanced from 70.82% to 82.63%, meeting the target.
Keywords: Hard disk drive, line balancing, simulation, Arena program.
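The operator-count arithmetic in this abstract can be reproduced directly from its reported figures: observed time 84.03 s/piece, Westinghouse rating 123%, 5% allowance, and takt time 3.66 s/piece. Small rounding differences from the paper's 108.53 s standard time are expected, since the paper may have carried unrounded intermediate values.

```python
import math

# Sketch of the standard-time / takt-time / operator-count calculation
# described in the abstract, using its reported input figures.

def standard_time(observed_s, rating, allowance):
    # normal time = observed * rating; the standard time adds the allowance
    return observed_s * rating * (1.0 + allowance)

def min_operators(total_work_content_s, takt_s):
    return math.ceil(total_work_content_s / takt_s)

std = standard_time(84.03, 1.23, 0.05)  # ~108.52 s per piece
ops = min_operators(std, 3.66)
print(round(std, 2), ops)  # -> 108.52 30
```

With 35 operators currently on the line, the computed minimum of 30 reproduces the abstract's conclusion that five operators can be removed.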
60 Meta Model Based EA for Complex Optimization
Authors: Maumita Bhattacharya
Abstract:
Evolutionary algorithms are population-based, stochastic search techniques widely used as efficient global optimizers. However, many real-life optimization problems require finding optimal solutions to complex, high-dimensional, multimodal problems involving computationally very expensive fitness function evaluations. Using evolutionary algorithms directly in such problem domains is thus practically prohibitive. An attractive alternative is to build meta models, i.e. approximations of the actual fitness functions, which are orders of magnitude cheaper to evaluate than the actual function. Many regression and interpolation tools are available for building such meta models. This paper briefly discusses the architectures and use of such meta-modeling tools in an evolutionary optimization context. We further present two evolutionary algorithm frameworks that use meta models for fitness function evaluation. The first framework, the Dynamic Approximate Fitness based Hybrid EA (DAFHEA) model [14], reduces computation time through controlled use of meta models (here, approximate models generated by support vector machine regression) to partially replace actual function evaluations with approximate ones. However, the underlying assumption in DAFHEA is that the training samples for the meta model are generated from a single uniform model, which does not account for uncertain scenarios involving noisy fitness functions. The second model, DAFHEA-II, an enhanced version of the original DAFHEA framework, incorporates a multiple-model based learning approach for the support vector machine approximator to handle noisy functions [15]. Empirical results obtained by evaluating the frameworks on several benchmark functions demonstrate their efficiency.
Keywords: Meta model, evolutionary algorithm, stochastic technique, fitness function, optimization, support vector machine.
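The core idea, pre-screening offspring with a cheap surrogate so that expensive evaluations are spent only on promising candidates, can be sketched as follows. This is a generic surrogate-assisted EA, with a least-squares quadratic surrogate standing in for the SVM regression model of DAFHEA; all parameter values are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)
true_evals = 0  # count of expensive (true) fitness evaluations

def expensive_fitness(x):
    # stand-in for a costly simulation: the sphere function
    return float(np.sum(x ** 2))

def fit_surrogate(X, y):
    # cheap quadratic surrogate y ~ sum_i w_i * x_i^2 + b, by least squares
    A = np.hstack([X ** 2, np.ones((len(X), 1))])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return lambda x: float(np.append(x ** 2, 1.0) @ coef)

dim, pop_size = 5, 20
pop = rng.normal(0.0, 2.0, (pop_size, dim))
fitness = np.array([expensive_fitness(ind) for ind in pop])
true_evals += pop_size
initial_best = fitness.min()

for _ in range(30):
    surrogate = fit_surrogate(pop, fitness)
    parents = pop[np.argsort(fitness)[:10]]
    offspring = np.repeat(parents, 4, axis=0) + rng.normal(0.0, 0.3, (40, dim))
    # pre-screen with the surrogate: only the 10 most promising of the
    # 40 offspring receive an expensive (true) fitness evaluation
    scores = np.array([surrogate(c) for c in offspring])
    promising = offspring[np.argsort(scores)[:10]]
    new_fit = np.array([expensive_fitness(c) for c in promising])
    true_evals += 10
    # elitist survivor selection over parents + evaluated offspring
    pop = np.vstack([pop, promising])
    fitness = np.concatenate([fitness, new_fit])
    keep = np.argsort(fitness)[:pop_size]
    pop, fitness = pop[keep], fitness[keep]
```

Only 320 true evaluations are spent here, where a plain EA evaluating every offspring would need 1220; the surrogate ranking absorbs the rest.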
59 Human Factors Considerations in New Generation Fighter Planes to Enhance Combat Effectiveness
Authors: Chitra Rajagopal, Indra Deo Kumar, Ruchi Joshi, Binoy Bhargavan
Abstract:
The role of fighter planes in modern network-centric military warfare has changed significantly in the recent past. New-generation fighter planes have the multirole capability of engaging both air and ground targets with high precision. A multirole aircraft undertakes missions such as air-to-air combat, air defense, air-to-surface roles (including air interdiction, close air support, maritime attack, and suppression and destruction of enemy air defenses), reconnaissance, and electronic warfare. Designers have primarily focused on developing technologies that enhance the combat performance of fighter planes, and very little attention has been given to the human factor aspects of these technologies. Unique physical and psychological challenges are imposed on pilots in meeting the operational requirements of these missions. Newly evolved technologies have enhanced aircraft performance in terms of speed, firepower, stealth, electronic warfare, situational awareness, and vulnerability reduction. This paper highlights the impact of emerging technologies on human factors for various military operations and missions. Technologies such as cooperative knowledge-based systems that aid the pilot's decision making in military conflict scenarios, as well as simulation technologies that enhance human performance, are also studied as part of this work. Current and emerging pilot protection technologies and systems, which form part of the integrated life support systems in new-generation fighter planes, are discussed. The application of system safety analysis to quantify human reliability in military operations is also studied.
Keywords: Combat effectiveness, emerging technologies, human factors, systems safety analysis.
58 DNA of Hibiscus sabdariffa Damaged by Radiation from 900 MHz GSM Antenna
Authors: A. O. Oluwajobi, O. A. Falusi, N. A. Zubbair, T. Owoeye, F. Ladejobi, M. C. Dangana, A. Abubakar
Abstract:
Mobile telephony has positively enhanced human life, but reports on the biosafety of the radiation from its antennae have been contradictory, leading to serious litigation and violent protests by residents in several parts of the world. The need for more information, as requested by the WHO in order to resolve this issue, formed the basis for this study of the effect of radiation from a 900 MHz GSM antenna on the DNA of Hibiscus sabdariffa. Seeds of H. sabdariffa were raised in pots placed in three replicates at 100, 200, 300 and 400 metres from the GSM antennae in three selected test locations and in a control location with no GSM signal. The temperature (˚C) and relative humidity (%) of the study sites were measured for the period of study (24 weeks). Fresh young leaves were harvested from each plant at two, eight and twenty-four weeks after sowing, and the DNA extracts were subjected to RAPD-PCR analyses. There were no significant differences between the weather conditions (temperature and relative humidity) of the study locations. However, significant differences were observed in radiation intensity between the control (less than 0.02 V/m) and the test (0.40-1.01 V/m) locations. The data showed that the DNA of samples exposed to rays from the GSM antenna had various levels of distortion, estimated at 91.67%. Distortions occurred in 58.33% of the samples between 2-8 weeks of exposure, while 33.33% of the samples were distorted between 8-24 weeks of exposure. Approximately 8.33% of the samples showed no DNA distortion, while 33.33% of the samples had their DNA damaged twice, both at 8 and at 24 weeks of exposure. The study showed that radiation from the 900 MHz GSM antenna is potent enough to cause distortion of the DNA of H. sabdariffa even within 2-8 weeks of exposure. DNA damage was also independent of the distance from the antenna.
These observations would qualify emissions from GSM masts as an environmental hazard to plant biodiversity and to life forms in general. The results should encourage efforts to prevent further erosion of plant genetic resources, which threatens food security, and to reduce the risks posed to living organisms, so that the environment remains safe while we continue to enjoy the benefits of GSM technology.
Keywords: Damage, DNA, GSM antenna, radiation.
57 FEM Models of Glued Laminated Timber Beams Enhanced by Bayesian Updating of Elastic Moduli
Authors: L. Melzerová, T. Janda, M. Šejnoha, J. Šejnoha
Abstract:
Two finite element (FEM) models are presented in this paper to address the random nature of the response of glued timber structures made of wood segments whose variable elastic moduli were evaluated from 3600 indentation measurements. This database served to create as many ensembles as there were segments in the tested beam. Statistics of these ensembles were then assigned to the corresponding segments of the beams, and the Latin Hypercube Sampling (LHS) method was used to perform 100 simulations, resulting in an ensemble of 100 deflections subjected to statistical evaluation. A detailed geometrical arrangement of the individual segments in the laminated beam was considered in the construction of a two-dimensional FEM model subjected to four-point bending, to comply with the laboratory tests. Since laboratory measurements of local elastic moduli may in general suffer from significant experimental error, it appears advantageous to exploit full-scale measurements of the timber beams, i.e. deflections, to improve the prior distributions with the help of the Bayesian statistical method. This, however, requires an efficient computational model for simulating the laboratory tests numerically. To this end, a simplified model based on Mindlin's beam theory was established. The improved posterior distributions show that the most significant change in the Young's modulus distribution takes place in the laminae in the most strained zones, i.e. in the top and bottom layers within the beam center region. The posterior distributions of the moduli of elasticity were subsequently utilized in the 2D FEM model and compared with the original simulations.
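The Bayesian updating step can be illustrated with the simplest conjugate case: a normal prior on the Young's modulus updated by normally distributed indirect observations. All numbers below are illustrative, not the paper's data:

```python
# Conjugate normal-normal Bayesian update of a Young's modulus prior.
mu0, sd0 = 12.0, 2.0          # prior: E ~ N(12 GPa, 2 GPa)   (illustrative)
obs = [13.5, 14.0, 13.2]      # moduli back-calculated from measured deflections
sd_noise = 1.0                # assumed measurement noise (GPa)

n = len(obs)
ybar = sum(obs) / n

# posterior precision is the sum of the prior and data precisions
post_prec = 1.0 / sd0**2 + n / sd_noise**2
post_mean = (mu0 / sd0**2 + n * ybar / sd_noise**2) / post_prec
post_sd = post_prec ** -0.5
# the posterior mean lies between the prior mean and the data mean,
# and the posterior spread is narrower than the prior's
```

The paper's actual update runs through a beam model rather than direct observations, but the pull of the data on the prior works the same way.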
Keywords: Bayesian inference, FEM, four point bending test, laminated timber, parameter estimation, prior and posterior distribution, Young’s modulus.
56 Buckling Optimization of Radially-Graded, Thin-Walled, Long Cylinders under External Pressure
Authors: Karam Y. Maalawi
Abstract:
This paper presents a generalized formulation of the buckling optimization problem for anisotropic, radially graded, thin-walled, long cylinders subject to external hydrostatic pressure. The structure to be analyzed is built of multi-angle fibrous laminated composite lay-ups with different volume fractions of the constituent materials within the individual plies. This yields a piecewise grading of the material in the radial direction; that is, the physical and mechanical properties of the composite material are allowed to vary radially. The objective is to maximize the critical buckling pressure while preserving the total structural mass at a constant value equal to that of a baseline reference design. In the selection of the significant optimization variables, the fiber volume fractions join the standard design variables, namely fiber orientation angles and ply thicknesses. The mathematical formulation employs classical lamination theory, and an analytical solution is presented that accounts for the effective axial and flexural stiffness separately, as well as including the coupling stiffness terms. The proposed model deals with dimensionless quantities so as to be valid for thin shells with arbitrary thickness-to-radius ratios. The critical buckling pressure level curves, augmented with the mass equality constraint, are given for several types of cylinders, showing the functional dependence of the constrained objective function on the selected design variables. It was shown that material grading can contribute significantly to the optimization process in achieving structural designs with enhanced stability limits.
Keywords: Buckling instability, structural optimization, functionally graded material, laminated cylindrical shells, external hydrostatic pressure.
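For orientation, the classical isotropic counterpart of the objective is the long-cylinder buckling pressure p_cr = E t³ / (4 (1 − ν²) R³); the paper's anisotropic, radially graded formulation generalizes this. A quick check of the formula and its cubic dependence on wall thickness (material values illustrative):

```python
def critical_pressure(E, nu, t_over_R, R=1.0):
    """Classical buckling pressure of a long, thin, isotropic cylinder
    under external pressure: p_cr = E t^3 / (4 (1 - nu^2) R^3)."""
    t = t_over_R * R
    return E * t**3 / (4.0 * (1.0 - nu**2) * R**3)

p = critical_pressure(E=200e9, nu=0.3, t_over_R=0.01)  # steel-like, t/R = 1%
# doubling the thickness multiplies p_cr by 8, which is why the
# dimensionless t/R ratio is such a dominant design variable
```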
55 Treatment of Low-Grade Iron Ore Using Two Stage Wet High-Intensity Magnetic Separation Technique
Authors: Moses C. Siame, Kazutoshi Haga, Atsushi Shibayama
Abstract:
This study investigates the removal of silica, alumina and phosphorus as impurities from Sanje iron ore using wet high-intensity magnetic separation (WHIMS). Sanje iron ore is a low-grade hematite ore found in the Nampundwe area of Zambia, from which iron is to be used as feed in the steelmaking process. Chemical composition analysis using an X-ray fluorescence spectrometer showed that the Sanje low-grade ore contains 48.90 mass% hematite (Fe2O3), corresponding to an iron grade of 34.18 mass%. The ore also contains silica (SiO2) and alumina (Al2O3) at 31.10 mass% and 7.65 mass%, respectively. Mineralogical analysis using an X-ray diffraction spectrometer showed hematite and silica as the major mineral components of the ore, while magnetite and alumina exist as minor components. Mineral particle distribution analysis was done using a scanning electron microscope with X-ray energy dispersion spectrometry (SEM-EDS); the images showed that the average size of the alumina-silicate gangue particles is on the order of 100 μm and that they exist as iron-bearing interlocked particles. Magnetic separation was done using a series L model 4 magnetic separator. The effects of various magnetic separation parameters, such as magnetic flux density, particle size, and pulp density of the feed, were studied during the magnetic separation experiments. Ore with an average particle size of 25 µm and a pulp density of 2.5% was concentrated using a pulp flow of 7 L/min. The results showed that 10 T was the optimal magnetic flux density, which enhanced the recovery of iron to 93.08% at a grade of 53.22 mass%. Gangue mineral particles containing 12 mass% silica and 3.94 mass% alumina remained in the concentrate, so the concentrate was further treated in a second WHIMS stage using the same parameters as the first stage. The second stage recovered 83.41% of the iron at a grade of 67.07 mass%. Silica was reduced to 2.14 mass% and alumina to 1.30 mass%; phosphorus was also reduced to 0.02 mass%. The two-stage magnetic separation process was thus established on the basis of these results.
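Recovery across cascaded separation stages is multiplicative, assuming the reported second-stage recovery is taken with respect to the first-stage concentrate:

```python
stage_recoveries = [0.9308, 0.8341]   # Fe recovery of stage 1 and stage 2

overall = 1.0
for r in stage_recoveries:
    overall *= r   # each stage keeps only a fraction of the iron it receives

# roughly 77.6% of the iron in the original feed reaches the final
# concentrate: the price paid for lifting the grade from 34 to 67 mass%
```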
Keywords: Sanje iron ore, magnetic separation, silica, alumina, recovery.
54 Web Data Scraping Technology Using Term Frequency Inverse Document Frequency to Enhance the Big Data Quality on Sentiment Analysis
Authors: Sangita Pokhrel, Nalinda Somasiri, Rebecca Jeyavadhanam, Swathi Ganesan
Abstract:
Tourism is a booming industry with huge future potential for global wealth and employment. Countless data are generated over social media sites every day, creating numerous opportunities to bring more insight to decision-makers. Integrating big data technology into the tourism industry allows companies to learn where their customers have been and what they like; this information can then be used by businesses such as those managing visitor centres or hotels, and tourists can get a clear idea of places before visiting. We process natural language by analysing the sentiment features of online reviews from tourists, and we supply an enhanced long short-term memory (LSTM) framework for sentiment feature extraction from travel reviews. For experimental validation, we constructed a web review database using a crawler and web scraping techniques to evaluate the effectiveness of our methodology. The sentences were first classified through the VADER and RoBERTa models to obtain the polarity of the reviews. In this paper, we study feature extraction methods, namely count vectorization and term frequency-inverse document frequency (TF-IDF) vectorization, and implement a convolutional neural network (CNN) classifier for the sentiment analysis, deciding whether a tourist's attitude toward a destination is positive, negative, or neutral based on the review text posted online. The results demonstrate that, after pre-processing and cleaning the dataset, the CNN algorithm achieved an accuracy of 96.12% for positive and negative sentiment analysis.
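The TF-IDF weighting itself is simple enough to sketch in a few lines. This is a minimal bag-of-words version; a real pipeline like the one described would use a library implementation with proper tokenization, smoothing and normalization:

```python
import math
from collections import Counter

def tfidf(docs):
    """Minimal TF-IDF: weight(t, d) = tf(t, d) * log(N / df(t))."""
    tokenized = [d.lower().split() for d in docs]
    df = Counter()
    for toks in tokenized:
        df.update(set(toks))          # document frequency of each term
    N = len(docs)
    vectors = []
    for toks in tokenized:
        tf = Counter(toks)
        vectors.append({t: (tf[t] / len(toks)) * math.log(N / df[t])
                        for t in tf})
    return vectors

reviews = ["the view was great", "the room was terrible", "the beach was great"]
vecs = tfidf(reviews)
# "the" and "was" occur in every review, so their weight is exactly 0;
# rarer, sentiment-bearing words like "terrible" get the highest weights
```

This down-weighting of ubiquitous words is precisely what makes TF-IDF features cleaner input for the downstream CNN classifier than raw counts.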
Keywords: Counter vectorization, Convolutional Neural Network, Crawler, data technology, Long Short-Term Memory, LSTM, Web Scraping, sentiment analysis.
53 Toward Indoor and Outdoor Surveillance Using an Improved Fast Background Subtraction Algorithm
Authors: A. El Harraj, N. Raissouni
Abstract:
The detection of moving objects in video image sequences is very important for object tracking, activity recognition, and behavior understanding in video surveillance. The most widely used approach to moving-object detection and tracking is background subtraction. Many background subtraction approaches have been suggested, but they are sensitive to illumination changes, and the solutions proposed to bypass this problem are time consuming. In this paper, we propose a robust yet computationally efficient background subtraction approach and focus mainly on the ability to detect moving objects in dynamic scenes, for possible applications in the monitoring of complex and restricted-access areas, where moving and motionless persons must be reliably detected. It consists of three main phases: establishing invariance to illumination changes, background/foreground modeling, and morphological analysis for noise removal. We handle illumination changes using Contrast Limited Adaptive Histogram Equalization (CLAHE), which clips each local histogram at a user-determined maximum before equalization. This mitigates the degradation due to scene illumination changes and improves the visibility of the video signal. Initially, the background and foreground images are extracted from the video sequence; then each is separately enhanced by applying CLAHE. To form multi-modal backgrounds, we model each channel of a pixel as a mixture of K Gaussians (K=5) using a Gaussian Mixture Model (GMM). Finally, we post-process the resulting binary foreground mask using morphological erosion and dilation to remove possible noise. For experimental tests, we used a standard dataset to challenge the efficiency and accuracy of the proposed method on a diverse set of dynamic scenes.
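The per-pixel modeling step can be sketched with a single running Gaussian per pixel, a deliberate simplification of the per-channel K-Gaussian mixture described above; all thresholds and learning rates are illustrative:

```python
import numpy as np

class RunningGaussianBG:
    """Single-Gaussian per-pixel background model (a simplification of
    the K=5 Gaussian mixture used in the paper)."""

    def __init__(self, first_frame, alpha=0.05, k=2.5, var_floor=4.0):
        self.mean = first_frame.astype(float)
        self.var = np.full(first_frame.shape, 25.0)
        self.alpha, self.k, self.var_floor = alpha, k, var_floor

    def apply(self, frame):
        frame = frame.astype(float)
        d = frame - self.mean
        fg = d**2 > (self.k**2) * self.var   # pixels far from the model
        bg = ~fg
        # update mean and variance only where the pixel matches the background
        self.mean[bg] += self.alpha * d[bg]
        self.var[bg] = np.maximum(
            (1 - self.alpha) * self.var[bg] + self.alpha * d[bg]**2,
            self.var_floor)
        return fg

bg = RunningGaussianBG(np.full((8, 8), 50, dtype=np.uint8))
for _ in range(20):                       # learn a static scene
    bg.apply(np.full((8, 8), 50, dtype=np.uint8))
frame = np.full((8, 8), 50, dtype=np.uint8)
frame[2:4, 2:4] = 200                     # a bright object enters
mask = bg.apply(frame)                    # object pixels flagged as foreground
```

A mixture of K Gaussians replaces the single mean/variance pair per pixel with K of them, which is what lets the full method absorb multi-modal backgrounds such as swaying foliage.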
Keywords: Video surveillance, background subtraction, Contrast Limited Adaptive Histogram Equalization, illumination invariance, object tracking, object detection, behavior understanding, dynamic scenes.
52 GridNtru: High Performance PKCS
Authors: Narasimham Challa, Jayaram Pradhan
Abstract:
Cryptographic algorithms play a crucial role in the information society by protecting sensitive data from unauthorized access. As information technology becomes increasingly pervasive, we can expect the emergence of ubiquitous computing and ambient intelligence. These new environments and applications will present new security challenges, and there is no doubt that cryptographic algorithms and protocols will form part of the solution. The efficiency of a public key cryptosystem is mainly measured in computational overhead, key size and bandwidth. The RSA algorithm, in particular, is used in many applications to provide security. Although the security of RSA is beyond doubt, the evolution of computing power has caused a growth in the necessary key length; the fact that most smart card chips cannot process keys exceeding 1024 bits shows that an alternative is needed. NTRU is such an alternative: a collection of mathematical algorithms based on manipulating lists of very small integers and polynomials, which allows NTRU to achieve high speeds with minimal computing power. NTRU (Nth degree Truncated polynomial Ring Unit) is the first secure public key cryptosystem not based on the factorization or discrete logarithm problems, meaning that even with substantial computational resources and time, an adversary should not be able to break the key. Multi-party communication and the requirement of optimal resource utilization drive the present-day demand for applications that enforce security and can be enhanced with high-end computing. This prompted us to develop high-performance NTRU schemes using high-end computing hardware. Peer-to-peer (P2P) and enterprise grids are proven approaches to building high-end computing systems, and by utilizing them one can improve the performance of NTRU through parallel execution.
In this paper we propose and develop an application for NTRU using the enterprise grid middleware Alchemi. An analysis and comparison of its performance for various text files is presented.
Keywords: Alchemi, GridNtru, Ntru, PKCS.
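The truncated polynomial ring arithmetic at the heart of NTRU is ordinary cyclic convolution: multiplication in Z_q[x]/(x^N − 1). A toy sketch with illustrative parameters (real NTRU uses large N and structured small-coefficient polynomials, and this is in no way a secure implementation):

```python
def ring_mul(a, b, N, q):
    """Multiply two polynomials (coefficient lists, lowest degree first)
    in the ring Z_q[x]/(x^N - 1): cyclic convolution with coefficients mod q."""
    c = [0] * N
    for i, ai in enumerate(a):
        for j, bj in enumerate(b):
            # exponents wrap around because x^N = 1 in this ring
            c[(i + j) % N] = (c[(i + j) % N] + ai * bj) % q
    return c

# (1 + x) * x = x + x^2, and x^2 * x = x^3 = 1 in Z_7[x]/(x^3 - 1)
p1 = ring_mul([1, 1, 0], [0, 1, 0], N=3, q=7)
p2 = ring_mul([0, 0, 1], [0, 1, 0], N=3, q=7)
```

It is exactly this cheap convolution, versus RSA's big-integer exponentiation, that makes NTRU attractive for parallel, grid-based execution.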
51 A Preliminary X-Ray Study on Human-Hair Microstructures for a Health-State Indicator
Authors: Phannee Saengkaew, Weerasak Ussawawongaraya, Sasiphan Khaweerat, Supagorn Rugmai, Sirisart Ouajai, Jiraporn Luengviriya, Sakuntam Sanorpim, Manop Tirarattanasompot, Somboon Rhianphumikarakit
Abstract:
We present a preliminary x-ray study of human-hair microstructures as a health-state indicator, in particular for cancer cases. Using x-ray techniques as an uncomplicated and low-cost method, the human-hair microstructure was analyzed by wide-angle x-ray diffraction (XRD) and small-angle x-ray scattering (SAXS). The XRD measurements exhibited reflections at d-spacings of 28 Å, 9.4 Å and 4.4 Å, corresponding to the periodic distance of the protein matrix of the hair macrofibrils and to the diameter and repeat spacing of the polypeptide alpha helices of the protofibrils of the hair microfibrils, respectively. Compared with the normal cases, the unhealthy cases, including the breast- and ovarian-cancer cases, showed higher normalized ratios of the 9.4 Å and 4.4 Å diffraction peaks, likely resulting from altered distributions of microstructures caused by molecular alteration. For elemental analysis by x-ray fluorescence (XRF), the normalized quantitative ratios of zinc (Zn)/calcium (Ca) and iron (Fe)/calcium (Ca) were determined. Analogously, both the Zn/Ca and Fe/Ca ratios of the unhealthy cases were higher than those of the normal cases. Combining the structural analysis by XRD with the elemental analysis by XRF showed that the modified fibrous microstructures of the hair samples are related to their altered elemental compositions. These microstructural and elemental analyses of hair samples may therefore be useful in the diagnosis of cancer and genetic diseases, and early diagnosis of this kind could lower the risk such diseases pose.
However, a high-intensity x-ray source, a high-resolution x-ray detector, and more hair samples are needed to develop this x-ray technique further, and its efficiency could be enhanced by including skin and fingernail samples alongside the hair analysis.
Keywords: Human-hair analysis, XRD, SAXS, breast cancer, health-state indicator.
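The quoted d-spacings map to diffraction angles through Bragg's law, λ = 2d sin θ. A quick conversion, assuming a Cu Kα laboratory source (λ = 1.5406 Å; the abstract does not state the wavelength used):

```python
import math

WAVELENGTH = 1.5406  # angstrom, Cu K-alpha (an assumed lab source)

def two_theta_deg(d_spacing):
    """Bragg's law with n = 1: lambda = 2 d sin(theta), returned as 2-theta."""
    return 2.0 * math.degrees(math.asin(WAVELENGTH / (2.0 * d_spacing)))

angles = {d: two_theta_deg(d) for d in (28.0, 9.4, 4.4)}
# larger d-spacings diffract at smaller angles, which is why the 28 A
# macrofibril periodicity sits in the small-angle (SAXS) regime while
# the 9.4 A and 4.4 A helix reflections appear at wide angles (XRD)
```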
50 A Numerical Model for Simulation of Blood Flow in Vascular Networks
Authors: Houman Tamaddon, Mehrdad Behnia, Masud Behnia
Abstract:
An accurate study of blood flow requires an accurate vascular pattern and the geometrical properties of the organ of interest. Due to the complexity of vascular networks and poor accessibility in vivo, it is challenging to reconstruct the entire vasculature of any organ experimentally. The objective of this study is to introduce an innovative approach to reconstructing a full vascular tree from available morphometric data. Our method implements morphometric data on those parts of the vascular tree that are smaller than the resolution of medical imaging, reconstructing the entire arterial tree down to the capillaries. Vessels greater than 2 mm are obtained from direct volume and surface analysis using contrast-enhanced computed tomography (CT). Vessels smaller than 2 mm are reconstructed from available morphometric and distensibility data and arranged by applying Murray's law. Implementing morphometric data to reconstruct the branching pattern while simultaneously applying Murray's law at every vessel bifurcation leads to an accurate vascular tree reconstruction. The reconstruction algorithm generates the full arterial tree topography down to the first capillary bifurcation. The geometry of each order of the vascular tree is generated separately to minimize construction and simulation time. The node-to-node connectivity, along with the diameter and length of every vessel segment, is established, and order numbers are assigned according to the diameter-defined Strahler system. During the simulation, we use the averaged flow rate of each order to predict the pressure drop; once the pressure drop is predicted, the flow rate is corrected to match the computed pressure drop for each vessel. The final results for three cardiac cycles are presented and compared to clinical data.
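Murray's law states that at a bifurcation the cube of the parent radius equals the sum of the cubes of the daughter radii, r₀³ = r₁³ + r₂³. A small sketch of how daughter radii can be assigned at each generated bifurcation (the helper and its values are illustrative, not the paper's code):

```python
def murray_daughters(r_parent, ratio=1.0):
    """Radii of two daughter vessels obeying Murray's law
    r0^3 = r1^3 + r2^3, where ratio = r2 / r1
    (ratio = 1.0 gives a symmetric bifurcation)."""
    r1 = r_parent / (1.0 + ratio**3) ** (1.0 / 3.0)
    return r1, ratio * r1

r1, r2 = murray_daughters(2.0)        # symmetric split of a 2 mm-radius vessel
a, b = murray_daughters(2.0, 0.5)     # asymmetric split, r2 half of r1
# in the symmetric case each daughter shrinks by a factor of 2^(1/3),
# so total cross-sectional area grows at every generation, slowing the flow
```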
Keywords: Blood flow, Morphometric data, Vascular tree, Strahler ordering system.
49 Collaborative Stylistic Group Project: A Drama Practical Analysis Application
Authors: Omnia F. Elkommos
Abstract:
In the course of teaching stylistics to undergraduate students of the Department of English Language and Literature, Faculty of Arts and Humanities, the linguistic toolkit of theories comes in handy for a better understanding of the different literary genres: poetry, drama, and short stories. In the present paper, a model for teaching stylistics is compiled and suggested: a collaborative group project technique for use in undergraduate classes of diverse specialisms (literature, linguistics and translation tracks). Students are initially introduced to the different linguistic tools and theories suitable for each literary genre. The second step is to apply these linguistic tools to texts. Students are required to watch videos of the poems or play being performed, for example, and to search the net for interpretations of the texts by other authorities, using a template (prepared by the researcher) with guided questions leading them through their analysis. Finally, a practical analysis is written up using a practical analysis essay template (also prepared by the researcher). In line with collaborative learning, all steps include student-centered activities that address differentiation and take the three different specialisms into account. In selecting the proper tools, and in the actual application and analysis discussion, students are given tasks that require their collaboration; they also work in small groups, and the groups collaborate in seminars and group discussions. At the end of the course/module, students present their work collaboratively and reflect and comment on their learning experience. The module/course uses a play that lends itself to the task: 'The Bond' by Amy Lowell and Robert Frost. The project results in an interpretation of its theme, characterization and plot. The linguistic tools are drawn from pragmatics and discourse analysis, among others.
Keywords: Applied linguistic theories, collaborative learning, cooperative principle, discourse analysis, drama analysis, group project, online acting performance, pragmatics, speech act theory, stylistics, technology enhanced learning.
48 Using the Monte Carlo Simulation to Predict the Assembly Yield
Authors: C. Chahin, M. C. Hsu, Y. H. Lin, C. Y. Huang
Abstract:
Electronics products that achieve high levels of integrated communications, computing, entertainment and multimedia features in small, stylish and robust new form factors are winning in the marketplace. Because of the high costs an industry may incur, and because high yield is directly proportional to high profit, IC (integrated circuit) manufacturers struggle to maximize yield; yet today's customers demand miniaturization, low cost, high performance and excellent reliability, making yield maximization a never-ending search for an enhanced assembly process. With factors such as minimum tolerances and tighter parameter variations, a systematic approach is needed to predict the assembly process. To evaluate the quality of upcoming circuits, yield models are used that not only predict manufacturing costs but also provide vital information to ease the process of correction when yields fall below expectations. For an IC manufacturer to obtain higher assembly yields, all factors must be taken into consideration: boards, placement, components, the materials from which the components are made, and processes. Effective placement yield depends heavily on machine accuracy and on the vision system, which must recognize the features on the board and component in order to place the device accurately on the pads and bumps of the PCB. There are currently two methods for accurate positioning: using the edge of the package, and using solder ball locations, also called footprints. The only assumption a yield model makes is that all boards and devices are completely functional. This paper focuses on the Monte Carlo method, a class of computational algorithms that rely on repeated random sampling to compute their results. The method is utilized here to recreate, by simulation, the placement and assembly processes of a production line.
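A Monte Carlo placement-yield estimate in miniature: draw random (x, y) placement offsets for each component, count the boards on which every component lands within tolerance. All numbers (components per board, offset spread, tolerance) are illustrative, not taken from the paper:

```python
import random

def placement_yield(n_boards, sigma_um, tol_um, comps_per_board=10, seed=42):
    """Monte Carlo yield estimate: a board passes only if the (x, y)
    placement offset of every component stays within tolerance."""
    rng = random.Random(seed)
    passed = 0
    for _ in range(n_boards):
        ok = True
        for _component in range(comps_per_board):
            dx = rng.gauss(0.0, sigma_um)   # random machine placement error
            dy = rng.gauss(0.0, sigma_um)
            if abs(dx) > tol_um or abs(dy) > tol_um:
                ok = False                   # one bad placement kills the board
                break
        if ok:
            passed += 1
    return passed / n_boards

est = placement_yield(5000, sigma_um=15, tol_um=45)   # 3-sigma tolerance
# with a 3-sigma tolerance per axis, the per-board yield is roughly
# 0.9973^20, i.e. about 95%, and the estimate should land near that
```

Tightening the tolerance or loosening the machine accuracy in this model immediately shows the compounding effect that makes high component counts so punishing for yield.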
Keywords: Monte Carlo simulation, placement yield, PCB characterization, electronics assembly.
47 Corporate Governance and Corporate Social Responsibility: Research on the Interconnection of Both Concepts and Its Impact on Non-Profit Organizations
Authors: Helene Eller
Abstract:
The aim of non-profit organizations (NPOs) is to provide services and goods for their clientele, with profit being a minor objective. With this as the basic purpose of doing business, it is obvious that the goal of such an organisation is to serve several bottom lines, not only the financial one. This approach is underpinned by the non-distribution constraint, which means that NPOs are allowed to make profits to a certain extent, but not to distribute them. The advantage is that there are no individual shareholders with an interest in the prosperity of the organisation: there is no pie to divide. Profits remain within the organisation and are reinvested in purposeful projects. Good governance is essential to support the aims of NPOs. Looking for a measure of good governance, the principles of corporate governance (CG) come to mind. The purpose of CG is direction and control, and in the field of NPOs, CG is enlarged to cover the relationships with all important stakeholders who have an impact on the organisation. The recognition of relevant parties beyond the shareholder is the link to corporate social responsibility (CSR), which supports a broader view of the bottom line: it is no longer enough to know how profits are used, but rather how they are made. CSR also addresses the responsibility of organisations for their impact on society. When the concept of CSR is transferred to the non-profit area, it becomes obvious that CSR, with its distinctive features, matches the aims of NPOs; as a consequence, NPOs that apply CG also apply CSR to a certain extent. The research is designed as a comprehensive theoretical and empirical analysis. First, the investigation focuses on the theoretical basis of both concepts. Second, the similarities and differences are outlined, revealing the interconnection of the two concepts.
The contribution of this research is manifold. The interconnection of the two concepts as applied to NPOs has not yet received attention in the literature. CSR and governance as an integrated concept provide many advantages for NPOs compared with for-profit organisations, which must constantly justify the impact they have on society, whereas NPOs integrate economic and social aspects from the outset. For NPOs, CG is not a mere compliance concept but an enhanced concept integrating many aspects of CSR. There is no 'either-or' between the concepts for NPOs.
Keywords: Business ethics, corporate governance, corporate social responsibility, non-profit organisations, stakeholder theory.
Relation of Optimal Pilot Offsets in the Shifted Constellation-Based Method for the Detection of Pilot Contamination Attacks
Authors: Dimitriya A. Mihaylova, Zlatka V. Valkova-Jarvis, Georgi L. Iliev
Abstract:
One possible approach to maintaining the security of communication systems relies on physical layer security mechanisms. However, in wireless time division duplex systems, where the uplink and downlink channels are reciprocal, the channel estimation procedure is exposed to attacks known as pilot contamination, whose aim is to have an enhanced data signal sent to the malicious user. The Shifted 2-N-PSK method involves two random legitimate pilots in the training phase, each belonging to a constellation shifted from the original N-PSK symbols by a certain angle. In this paper, the legitimate pilots’ offset values and their influence on the detection capabilities of the Shifted 2-N-PSK method are investigated. Since the implementation of the technique depends on the relation between the shift angles rather than on their specific values, the optimal interconnection between the two legitimate constellations is studied. The results show that no regularity exists in the relation between the pilot contamination attack (PCA) detection probability and the choice of offset values. Therefore, an adversary who aims to obtain the exact offset values can only employ a brute-force attack, but the large number of possible combinations of shifted constellations makes such an attack difficult to mount successfully. For this reason, the number of optimal shift value pairs is also studied for detection probabilities of both 100% and 98%. Although the Shifted 2-N-PSK method has been studied extensively under different signal-to-noise ratio scenarios, in multi-cell systems the interference from signals in other cells must also be taken into account. Therefore, the impact of inter-cell interference on the performance of the method is investigated by means of a large number of simulations. The results show that the detection probability of Shifted 2-N-PSK decreases as the signal-to-interference-plus-noise ratio decreases.
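The shifted pilot constellations described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the constellation size N = 8 and the offsets of 10° and 35° are hypothetical values chosen only for the example.

```python
import numpy as np

def shifted_npsk(n, offset_deg):
    """Return the N-PSK constellation rotated by offset_deg degrees."""
    angles = 2 * np.pi * np.arange(n) / n + np.deg2rad(offset_deg)
    return np.exp(1j * angles)

# Two legitimate constellations with hypothetical offsets of 10 and 35 degrees.
n = 8
c1 = shifted_npsk(n, 10.0)
c2 = shifted_npsk(n, 35.0)

# In the training phase, each legitimate pilot is drawn at random
# from its own shifted constellation.
rng = np.random.default_rng(0)
pilot1 = rng.choice(c1)
pilot2 = rng.choice(c2)

# The shift changes only the phase: every point keeps unit magnitude,
# so what matters for detection is the relative angle between c1 and c2.
relative_deg = np.rad2deg(np.angle(c2[0]) - np.angle(c1[0]))
```

Because only the relative angle between the two constellations enters the detection rule, an eavesdropper must brute-force the offset pair rather than a single value, which is what makes the search space large.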
Keywords: Channel estimation, inter-cell interference, pilot contamination attacks, wireless communications.
In vivo Antidiabetic and Antioxidant Potential of Pseuduvaria macrophylla Extract
Authors: Aditya Arya, Hairin Taha, Ataul Karim Khan, Nayiar Shahid, Hapipah Mohd Ali, Mustafa Ali Mohd
Abstract:
This study investigated the antidiabetic and antioxidant potential of Pseuduvaria macrophylla bark extract in streptozotocin–nicotinamide-induced type 2 diabetic rats. LC-MS Q-TOF and NMR experiments were performed to determine the chemical composition of the methanolic bark extract. For the in vivo experiments, the STZ-induced diabetic rats (60 mg/kg b.w., i.p., 15 min after 120 mg/kg nicotinamide) were treated with the methanolic extract of Pseuduvaria macrophylla (200 and 400 mg/kg b.w.) or with glibenclamide (2.5 mg/kg) as the positive control. Biochemical parameters were assayed in the blood samples of all groups of rats. Pro-inflammatory cytokines, antioxidant status and plasma transforming growth factor beta-1 (TGF-β1) were evaluated. The pancreas was examined histologically, and its insulin expression was observed by immunohistochemistry. In addition, the expression of the glucose transporters GLUT-1, GLUT-2 and GLUT-4 was assessed in pancreatic tissue by western blot analysis. The results show that the methanolic bark extract of Pseuduvaria macrophylla normalized the elevated blood glucose levels and improved serum insulin and C-peptide levels, with a significant increase in the antioxidant enzyme reduced glutathione (GSH) and a decrease in the level of lipid peroxidation (LPO). Additionally, the extract markedly decreased the levels of serum pro-inflammatory cytokines and TGF-β1. Histopathological analysis demonstrated that Pseuduvaria macrophylla has the potential to protect the pancreas of diabetic rats against peroxidation damage by reducing oxidative stress and hyperglycaemia. Furthermore, the expression of insulin, GLUT-1, GLUT-2 and GLUT-4 in pancreatic cells was enhanced. The findings of this study support the antidiabetic claims for Pseuduvaria macrophylla bark.
Keywords: Diabetes mellitus, Pseuduvaria macrophylla, alkaloids, caffeic acid.