Search results for: thermal processing

1540 Novel Hole-Bar Standard Design and Inter-Comparison for Geometric Errors Identification on Machine-Tool

Authors: F. Viprey, H. Nouira, S. Lavernhe, C. Tournier

Abstract:

Manufacturing of freeform parts is commonly achieved on 5-axis machine tools. The geometrical quality of the freeform parts depends on the accuracy of the multi-axis structural loop, which is composed of several component assemblies maintaining the relative positioning between the tool and the workpiece. Therefore, to reach a high geometric quality of the freeform parts, the geometric errors of the 5-axis machine should be evaluated and compensated, which requires mastering the deviations between the tool and the workpiece (volumetric accuracy). In this study, a novel hole-bar design was developed and used for the characterization of the geometric errors of an RRTTT 5-axis machine tool. The hole-bar standard is made of Invar, selected for its low sensitivity to thermal drift. The proposed design allows one to extract 3 intrinsic parameters: one linear positioning error and two straightness errors. These parameters can be obtained by measuring the cylindricity of 12 holes (bores) and 11 cylinders located on a perpendicular plane. By mathematical analysis, twelve 3D point coordinates can be identified, each corresponding to the intersection of a hole axis with the least-squares plane passing through the axes of two perpendicular neighbouring cylinders. The hole-bar was calibrated using a precision CMM at LNE, traceable to the SI metre definition. The reversal technique was applied in order to separate the form errors of the hole-bar from the motion errors of the mechanical guiding systems. An inter-comparison was additionally conducted between four NMIs (National Metrology Institutes) within the EMRP IND62: JRP-TIM project. Afterwards, the hole-bar was integrated into the RRTTT 5-axis machine tool to identify its volumetric errors. Measurements were carried out in real time, combining raw data acquired by the Renishaw RMP600 touch probe with the linear and rotary encoders. The geometric errors of the 5-axis machine were also evaluated by an accurate laser tracer interferometer system, and the results were compared to those obtained with the hole-bar.
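
As an illustration of the geometric analysis described above, the sketch below fits a least-squares plane to a set of cylinder-axis points and intersects a hole axis with it. The point sets, axis definitions, and units are hypothetical placeholders, not the authors' calibration data.

```python
import numpy as np

def fit_plane(points):
    """Least-squares plane through 3D points: returns (centroid, unit normal)."""
    centroid = points.mean(axis=0)
    # The plane normal is the singular vector of the smallest singular value.
    _, _, vt = np.linalg.svd(points - centroid)
    return centroid, vt[-1]

def intersect_line_plane(p0, d, centroid, normal):
    """Intersection of the line p0 + t*d with the plane (centroid, normal)."""
    t = np.dot(centroid - p0, normal) / np.dot(d, normal)
    return p0 + t * d

# Hypothetical cylinder-axis sample points and one hole axis (units: mm).
cyl_points = np.array([[0, 0, 0], [50, 0.01, 0.02], [100, -0.01, 0.01],
                       [0, 50, 0.02], [0, 100, -0.01]], dtype=float)
hole_origin = np.array([25.0, 25.0, 60.0])
hole_dir = np.array([0.0, 0.0, -1.0])      # nominally normal to the plane

c, n = fit_plane(cyl_points)
p = intersect_line_plane(hole_origin, hole_dir, c, n)
print("3D intersection point:", p)
```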

Keywords: volumetric errors, CMM, 3D hole-bar, inter-comparison

Procedia PDF Downloads 362
1539 Perceiving Casual Speech: A Gating Experiment with French Listeners of L2 English

Authors: Naouel Zoghlami

Abstract:

Spoken-word recognition involves the simultaneous activation of potential word candidates, which compete with each other for final correct recognition. In continuous speech, the activation-competition process becomes more complicated due to speech reductions occurring at word boundaries. Lexical processing is more difficult in L2 than in L1 because L2 listeners often lack phonetic, lexico-semantic, syntactic, and prosodic knowledge in the target language. In this study, we investigate the on-line lexical segmentation hypotheses that French listeners of L2 English form and then revise as subsequent perceptual evidence is revealed. Our purpose is to shed further light on the processes of L2 spoken-word recognition in context and to better understand L2 listening difficulties through a comparison of skilled and unskilled reactions at the point where the listeners' working hypothesis is rejected. We use a variant of the gating experiment in which subjects transcribe an English sentence presented in increments of progressively greater duration. The spoken sentence was “And this amazing athlete has just broken another world record”, chosen mainly because it includes common reductions and phonetic features of English, such as elision and assimilation. Our preliminary results show that there is an important difference in the manner in which proficient and less-proficient L2 listeners handle connected speech. Less-proficient listeners delay recognition of words as they wait for lexical and syntactic evidence to appear in the gates. Further statistical analyses are currently underway.

Keywords: gating paradigm, spoken word recognition, online lexical segmentation, L2 listening

Procedia PDF Downloads 443
1538 Denoising Transient Electromagnetic Data

Authors: Lingerew Nebere Kassie, Ping-Yu Chang, Hsin-Hua Huang, Chaw-Son Chen

Abstract:

Transient electromagnetic (TEM) data play a crucial role in hydrogeological and environmental applications, providing valuable insights into geological structures and resistivity variations. However, the presence of noise often hinders the interpretation and reliability of these data. Our study addresses this issue by utilizing a FASTSNAP system for the TEM survey, which operates in different modes (low, medium, and high) with continuous adjustments to discretization, gain, and current. We employ a denoising approach that processes the raw data obtained from each acquisition mode to improve signal quality and enhance data reliability. We use a signal-averaging technique for each mode, increasing the signal-to-noise ratio. Additionally, we utilize the wavelet transform to suppress noise further while preserving the integrity of the underlying signals. This approach significantly improves the data quality, notably suppressing severe noise at late times. The resulting denoised data exhibit a substantially improved signal-to-noise ratio, leading to increased accuracy in parameter estimation. By effectively denoising TEM data, our study contributes to a more reliable interpretation and analysis of underground structures. Moreover, the proposed denoising approach can be seamlessly integrated into existing ground-based TEM data processing workflows, facilitating the extraction of meaningful information from noisy measurements and enhancing the overall quality and reliability of the acquired data.
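
A minimal sketch of the two-stage denoising idea (stacking repeated transients, then wavelet thresholding) is given below, assuming the PyWavelets library; the decay curve, wavelet choice, and threshold rule are illustrative placeholders rather than the FASTSNAP processing chain.

```python
import numpy as np
import pywt

def denoise_tem(stacks, wavelet="db4", level=5):
    """Average repeated TEM transients, then soft-threshold wavelet details."""
    averaged = stacks.mean(axis=0)                 # signal averaging raises SNR
    coeffs = pywt.wavedec(averaged, wavelet, level=level)
    # Universal threshold estimated from the finest detail coefficients.
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745
    thresh = sigma * np.sqrt(2 * np.log(len(averaged)))
    coeffs[1:] = [pywt.threshold(c, thresh, mode="soft") for c in coeffs[1:]]
    return pywt.waverec(coeffs, wavelet)[: len(averaged)]

# Hypothetical example: 64 noisy repetitions of a decaying transient.
t = np.linspace(1e-5, 1e-2, 1024)
clean = 1e-3 * t ** -1.5
stacks = clean + 5e-2 * np.random.randn(64, t.size)
denoised = denoise_tem(stacks)
```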

Keywords: data quality, signal averaging, transient electromagnetic, wavelet transform

Procedia PDF Downloads 60
1537 The Internet of Things: A Survey of Authentication Mechanisms and Protocols for the Shifting Paradigm of Communicating Entities

Authors: Nazli Hardy

Abstract:

As a multidisciplinary application of computer science and interactive database-driven web applications, the Internet of Things (IoT) represents a digital ecosystem that has a pervasive technological, social, and economic impact on the human population. It is a long-term technology, and its development is built around the connection of everyday objects to the Internet. It is estimated that by 2020, with billions of people connected to the Internet, the number of connected devices will exceed 50 billion; thus IoT represents a paradigm shift in our current interconnected ecosystem, a communication shift that will unavoidably affect people, businesses, consumers, clients, and employees. By nature, in order to provide a cohesive and integrated service, connected devices need to collect, aggregate, store, mine, and process personal and personalized data on individuals and corporations in a variety of contexts and environments. A significant factor in this paradigm shift is the necessity for secure and appropriate transmission, processing, and storage of the data. Thus, while the benefits of the applications appear to be boundless, these same opportunities are bounded by concerns such as trust, privacy, security, loss of control, and related issues. This poster and presentation look at multi-factor authentication (MFA) mechanisms that need to evolve from the login-password tuple to an Identity and Access Management (IAM) model, and further to the more cohesive Identity Relationship Management (IRM) standard. It also compares and contrasts messaging protocols that are appropriate for the IoT ecosystem.

Keywords: Internet of Things (IoT), authentication, protocols, survey

Procedia PDF Downloads 276
1536 Fermentation of Pretreated Herbaceous Cellulosic Wastes to Ethanol by Anaerobic Cellulolytic and Saccharolytic Thermophilic Clostridia

Authors: Lali Kutateladze, Tamar Urushadze, Tamar Dudauri, Besarion Metreveli, Nino Zakariashvili, Izolda Khokhashvili, Maya Jobava

Abstract:

Lignocellulosic waste streams from agriculture and the paper and wood industries are renewable, plentiful and low-cost raw materials that can be used for large-scale production of liquid and gaseous biofuels. As opposed to the prevailing multi-stage biotechnological processes developed for bioconversion of cellulosic substrates to ethanol, in which high-cost cellulase preparations are used, Consolidated Bioprocessing (CBP) offers to accomplish cellulose and xylan hydrolysis followed by fermentation of both C6 and C5 sugars to ethanol in a single-stage process. A syntrophic microbial consortium comprising anaerobic, thermophilic, cellulolytic, and saccharolytic bacteria in the genus Clostridia, with improved ethanol productivity and high tolerance to fermentation end-products, has been proposed for achieving CBP. Sixty-five new strains of anaerobic thermophilic cellulolytic and saccharolytic Clostridia were isolated from different wetlands and hot springs in Georgia. Using the new isolates, fermentation of mechanically pretreated wheat straw and corn stalks was carried out under an oxygen-free nitrogen atmosphere in thermophilic conditions (T = 55°C) at pH 7.1. Process duration was 120 hours. Liquid and gaseous products of fermentation were analyzed on a daily basis using Perkin-Elmer gas chromatographs with flame ionization and thermal detectors. Residual cellulose, xylan, xylose, and glucose were determined using standard methods. The cellulolytic and saccharolytic bacterial strains degraded the mechanically pretreated herbaceous cellulosic wastes and fermented glucose and xylose to ethanol, acetic acid and gaseous products such as hydrogen and CO2. Specifically, the maximum yield of ethanol was reached at 96 h of fermentation and varied between 2.9 and 3.2 g per 10 g of substrate. The content of acetic acid did not exceed 0.35 g/l. Other volatile fatty acids were detected in trace quantities.

Keywords: anaerobic bacteria, cellulosic wastes, Clostridia sp, ethanol

Procedia PDF Downloads 252
1535 Cupric Oxide Thin Films for Optoelectronic Application

Authors: Sanjay Kumar, Dinesh Pathak, Sudhir Saralch

Abstract:

Copper oxide is a semiconductor that has been studied for several reasons, such as the natural abundance of the starting material copper (Cu), the ease of production by Cu oxidation, its non-toxic nature, and its reasonably good electrical and optical properties. Copper oxide is also well known in its cuprite form, a p-type semiconductor with a band gap energy of 1.21 to 1.51 eV. As a p-type semiconductor, its conduction arises from the presence of holes in the valence band (VB) due to doping/annealing. CuO is attractive as a selective solar absorber since it has high solar absorbency and a low thermal emittance. CuO is a very promising candidate for solar cell applications as it is a suitable material for photovoltaic energy conversion. It has been demonstrated that the dip technique can be used to deposit CuO films in a simple manner using metallic chlorides (CuCl₂.2H₂O) as a starting material. Copper oxide films were prepared using a methanolic solution of cupric chloride (CuCl₂.2H₂O) at three baking temperatures. We made three samples, which turned black after heating. XRD data confirm that the films are of the CuO phase at a particular temperature. The optical band gap of the CuO films calculated from optical absorption measurements is 1.90 eV, which is quite comparable to the reported value. The dip technique is a very simple and low-cost method, which requires no sophisticated specialized setup. Coating of a substrate with a large surface area can be easily obtained by this technique compared to physical evaporation techniques and spray pyrolysis. Another advantage of the dip technique is that it is very easy to coat both sides of the substrate instead of only one and to deposit on otherwise inaccessible surfaces. This method is well suited for applying coatings on the inner and outer surfaces of tubes of various diameters and shapes. The main advantage of the dip coating method lies in the fact that it is possible to deposit a variety of layers having good homogeneity and mechanical and chemical stability with a very simple setup. In this paper, the preparation of CuO thin films by the dip coating method and their characterization will be presented.
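
To illustrate how an optical band gap such as the 1.90 eV value above is typically extracted from absorption data, the sketch below performs a Tauc analysis for a direct-gap material; the absorbance array, film thickness, and fitting window are hypothetical, and the authors' exact procedure may differ.

```python
import numpy as np

def tauc_band_gap(energy_eV, absorbance, thickness_cm, fit_window):
    """Estimate a direct optical band gap from (alpha*h*nu)^2 vs h*nu."""
    alpha = 2.303 * absorbance / thickness_cm        # absorption coefficient
    tauc = (alpha * energy_eV) ** 2                  # direct allowed transitions
    lo, hi = fit_window
    mask = (energy_eV >= lo) & (energy_eV <= hi)     # linear region near the edge
    slope, intercept = np.polyfit(energy_eV[mask], tauc[mask], 1)
    return -intercept / slope                        # extrapolation to tauc = 0

# Hypothetical spectrum: absorption edge near ~1.9 eV, 200 nm thick film.
E = np.linspace(1.2, 3.0, 200)
A = np.sqrt(np.clip(E - 1.9, 0, None)) / E + 0.01
print("Eg =", round(tauc_band_gap(E, A, 200e-7, (2.0, 2.6)), 2), "eV")
```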

Keywords: absorber material, cupric oxide, dip coating, thin film

Procedia PDF Downloads 291
1534 Dust Particle Removal from Air in a Self-Priming Submerged Venturi Scrubber

Authors: Manisha Bal, Remya Chinnamma Jose, B.C. Meikap

Abstract:

Dust particles suspended in air are a major source of air pollution. A self-priming submerged venturi scrubber, proven very effective in handling nuclear power plant accidents, is an efficient device for removing dust particles from the air and thus aids in pollution control. Venturi scrubbers are compact, have a simple mode of operation and no moving parts, are easy to install and maintain compared to other pollution control devices, and can handle high temperatures as well as corrosive and flammable gases and dust particles. In the present paper, fly ash, recognized as a major air pollutant emitted mostly from thermal power plants, is considered as the dust particle. Exposure to it through skin contact, inhalation and ingestion can lead to health risks and in severe cases can even lead to lung cancer. The main focus of this study is on the removal of fly ash particles from polluted air using a self-priming venturi scrubber in submerged conditions with water as the scrubbing liquid. The venturi scrubber, comprising three sections (converging section, throat and diverging section), is submerged inside a water tank. The liquid enters the throat due to the pressure difference composed of the hydrostatic pressure of the liquid and the static pressure of the gas. The high-velocity dust-laden air atomizes the liquid droplets at the throat, and this interaction leads to the absorption of fly ash into the water and thus its removal from the air. A detailed investigation of the scrubbing of fly ash has been carried out in this work. Experiments were conducted at different throat gas velocities, water levels and fly ash inlet concentrations to study the fly ash removal efficiency. From the experimental results, the highest fly ash removal efficiency of 99.78% is achieved at a throat gas velocity of 58 m/s and a water level of height 0.77 m, with a fly ash inlet concentration of 0.3 x 10⁻³ kg/Nm³ in the submerged condition. The effects of throat gas velocity, water level and fly ash inlet concentration on the removal efficiency have also been evaluated. Furthermore, the experimental results for removal efficiency are validated against the developed empirical model.

Keywords: dust particles, fly ash, pollution control, self-priming venturi scrubber

Procedia PDF Downloads 135
1533 The Time-Frequency Domain Reflection Method for Aircraft Cable Defects Localization

Authors: Reza Rezaeipour Honarmandzad

Abstract:

This paper introduces an aircraft cable fault detection and location method based on time-frequency domain reflectometry (TFDR), with the goal of recognizing intermittent faults effectively and of handling serial and after-connector faults that are hard to distinguish with time-domain reflectometry. In this strategy, the correlation function of the reflected and reference signals is used to detect and locate the cable fault according to the characteristics of the reflected and reference signals in the time-frequency domain, so the hit rate of detecting and locating intermittent faults can be improved considerably. In practice, the reflected signal is corrupted by noise and false alarms occur frequently, so a threshold de-noising technique based on wavelet decomposition is used to reduce the noise interference and lower the false alarm rate. The time-frequency cross-correlation function of the reference and reflected signals, based on the Wigner-Ville distribution, is then computed in order to locate the fault position. Finally, LabVIEW is used to implement the operation and control interface, whose main function is to connect and control MATLAB and LABSQL. Using the strong computing capability and the extensive function library of MATLAB, the signal processing is easily realized; in addition, LabVIEW helps make the system more reliable and easily upgraded.
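
The core idea of locating a fault from the delay between the reference and reflected signals can be sketched as below with a plain time-domain cross-correlation; the pulse shape, sampling rate, and propagation velocity are illustrative assumptions, and the paper's actual method uses a Wigner-Ville-based time-frequency cross-correlation rather than this simplified version.

```python
import numpy as np

def fault_distance(reference, reflected, fs, v_prop):
    """Estimate fault distance from the lag that maximizes cross-correlation."""
    corr = np.correlate(reflected, reference, mode="full")
    lag = np.argmax(corr) - (len(reference) - 1)     # samples of round-trip delay
    delay = lag / fs
    return v_prop * delay / 2.0                       # divide by 2: out and back

# Hypothetical Gaussian-windowed reference pulse and a reflection delayed by 0.4 us.
fs = 1e9                                              # 1 GS/s sampling (assumed)
t = np.arange(0, 2e-6, 1 / fs)
ref = np.exp(-((t - 0.2e-6) ** 2) / (2 * (20e-9) ** 2)) * np.cos(2 * np.pi * 50e6 * t)
refl = np.roll(ref, 400) * 0.3 + 0.02 * np.random.randn(t.size)
print("Estimated fault distance:",
      round(fault_distance(ref, refl, fs, v_prop=2e8), 2), "m")
```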

Keywords: aircraft cable, fault location, TFDR, LabVIEW

Procedia PDF Downloads 447
1532 Efficacy of Carvacrol as an Antimicrobial Wash Treatment for Reducing Both Campylobacter jejuni and Aerobic Bacterial Counts on Chicken Skin

Authors: Sandip Shrestha, Ann M. Donoghue, Komala Arsi, Basanta R. Wagle, Abhinav Upadhyay, Dan J. Donoghue

Abstract:

Campylobacter, one of the major causes of foodborne illness worldwide, is commonly present in the intestinal tract of poultry. Many strategies are currently being investigated to reduce Campylobacter counts on commercial poultry during processing, with limited success. This study investigated the efficacy of the generally recognized as safe compound carvacrol (CR), a component of wild oregano oil, as a wash treatment for reducing C. jejuni and aerobic bacteria on chicken skin. A total of two trials were conducted, and in each trial, a total of 75 skin samples (4 cm × 4 cm each) were randomly allocated into 5 treatment groups (0%, 0.25%, 0.5%, 1% and 2% CR). Skin samples were inoculated with a cocktail of four wild strains of C. jejuni (~8 log10 CFU/skin). After 30 min of attachment, inoculated skin samples were dipped in the respective treatment solution for 1 min, allowed to drip dry for 2 min and processed at 0, 8, and 24 h post treatment for enumeration of C. jejuni and aerobic bacterial counts (n=5/treatment/time point). The data were analyzed by ANOVA using the PROC GLM procedure of SAS 9.3. All the tested doses of CR suspension consistently reduced C. jejuni counts across all time points. The 2% CR wash was the most effective treatment and reduced C. jejuni counts by ~4 log₁₀ CFU/sample (P < 0.05). Aerobic counts were reduced by the 0.5% CR dose at 0 and 24 h in Trial 1 and at 0, 8 and 24 h in Trial 2. The 1% and 2% CR doses consistently reduced aerobic counts in both trials by up to 2 log₁₀ CFU/skin.

Keywords: Campylobacter jejuni, carvacrol, chicken skin, postharvest

Procedia PDF Downloads 150
1531 Localized Analysis of Cellulosic Fibrous Insulation Materials

Authors: Chady El Hachem, Pan Ye, Kamilia Abahri, Rachid Bennacer

Abstract:

Considered as a building construction material, and regarding its environmental benefits, wood fiber insulation is the material of interest in this work. The definition of an adequate representative elementary volume that guarantees a reliable understanding of the macroscopic hygrothermal phenomena is very critical. At the microscopic scale, when subjected to hygric solicitations, fibers undergo local dimensional variations. It is therefore necessary to master this behavior, which affects the global response of the material. This study consists of an experimental procedure using a non-destructive method, X-ray tomography, followed by morphological post-processing analysis using the ImageJ software. A refined investigation was carried out in order to identify the representative elementary volume and the resolution sufficient for accurate structural analysis. The second part of this work was to evaluate the microscopic hygric behavior of the studied material. Many parameters were taken into consideration, such as the evolution of the fiber diameters and their distribution along the sorption cycle, the porosity, and the evolution of the water content. In addition, heat transfer simulations based on the resolution of the energy equation were performed on the real structure. Further, the problem of the representative elementary volume was elaborated for such a heterogeneous material. Moreover, the material's porosity and its fibers' thicknesses show a very strong correlation with the water content. These results provide the literature with a very good understanding of the behavior of wood fiber insulation.
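
As a sketch of the kind of energy-equation-based simulation mentioned above, the snippet below advances a 2D transient heat-conduction problem with an explicit finite-difference scheme; the grid, material properties, and boundary temperatures are hypothetical and not the parameters of the tomography-derived structure.

```python
import numpy as np

def heat_2d(T, alpha, dx, dt, steps):
    """Explicit FTCS update of dT/dt = alpha * laplacian(T) with fixed edges."""
    for _ in range(steps):
        lap = (np.roll(T, 1, 0) + np.roll(T, -1, 0) +
               np.roll(T, 1, 1) + np.roll(T, -1, 1) - 4 * T) / dx ** 2
        T[1:-1, 1:-1] += alpha * dt * lap[1:-1, 1:-1]   # interior nodes only
    return T

# Hypothetical wood-fibre slab: 2 cm x 2 cm, hot left wall, cold right wall.
n, dx = 64, 0.02 / 64
alpha = 2.5e-7                         # m^2/s, assumed effective diffusivity
dt = 0.2 * dx ** 2 / alpha             # below the explicit stability limit
T = np.full((n, n), 20.0)
T[:, 0], T[:, -1] = 40.0, 10.0         # Dirichlet boundaries in deg C
T = heat_2d(T, alpha, dx, dt, steps=2000)
```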

Keywords: hygric behavior, morphological characterization, wood fiber insulation material, x-ray tomography

Procedia PDF Downloads 239
1530 Policy Recommendations for Reducing CO2 Emissions in Kenya's Electricity Generation, 2015-2030

Authors: Paul Kipchumba

Abstract:

Kenya is an East African country lying at the Equator. It had a population of 46 million in 2015 with an annual growth rate of 2.7%, implying a population of at least 65 million in 2030. Kenya's GDP in 2015 was about 63 billion USD, with a per capita GDP of about 1400 USD. The rural population is 74%, whereas the urban population is 26%. Kenya grapples not only with access to energy but also with energy security. There is a direct correlation between economic growth, population growth, and energy consumption. Kenya's energy composition is at least 74.5% renewable energy, with hydro power and geothermal forming the bulk of it; 68% of energy use comes from wood fuel, 22% from petroleum, 9% from electricity, and 1% from coal and other sources. Wood fuel is used by the majority of the rural and poor urban population. Electricity is mostly used for lighting. As of March 2015, Kenya had an installed electricity capacity of 2295 MW, making a per capita installed capacity of 0.0499 kW. The overall retail cost of electricity in 2015 was 0.009915 USD/kWh (KES 19.85/kWh) for installed capacity over 10 MW. The actual demand for electricity in 2015 was 3400 MW, and the projected demand in 2030 is 18000 MW. Kenya is working on Vision 2030, which aims at making it a prosperous middle-income economy and targets 23 GW of generated electricity. However, cost and non-cost factors affect the generation and consumption of electricity in Kenya. Kenya does not care more about CO2 emissions than about economic growth. Carbon emissions are most likely to be paid for through future carbon costs and penalties imposed on local generating companies for disregard of international law on CO2 emissions and climate change. The study methodology was a simulated application of a carbon tax on all carbon-emitting sources of electricity generation. A carbon tax of only USD 30/tCO2 on all emitting sources would be enough to make solar the only source of electricity generation in Kenya. The country has the best evenly distributed global horizontal irradiation. The solar potential, after accounting for technology efficiencies such as 14-16% for solar PV and 15-22% for solar thermal, is 143.94 GW. Therefore, the paper recommends the adoption of solar power for generating all electricity in Kenya in order to attain zero-carbon electricity generation in the country.

Keywords: CO2 emissions, cost factors, electricity generation, non-cost factors

Procedia PDF Downloads 339
1529 Wireless Information Transfer Management and Case Study of a Fire Alarm System in a Residential Building

Authors: Mohsen Azarmjoo, Mehdi Mehdizadeh Koupaei, Maryam Mehdizadeh Koupaei, Asghar Mahdlouei Azar

Abstract:

The increasing prevalence of wireless networks in our daily lives has made them indispensable. The aim of this research is to investigate the management of information transfer in wireless networks and the integration of renewable solar energy resources in a residential building. The focus is on the transmission of electricity and information through wireless networks, as well as the utilization of sensors and wireless fire alarm systems. The research employs a descriptive approach to examine the transmission of electricity and information on a wireless network with electric and optical telephone lines. It also investigates the transmission of signals from sensors and wireless fire alarm systems via radio waves. The methodology includes a detailed analysis of security, comfort conditions, and costs related to the utilization of wireless networks and renewable solar energy resources. The study reveals that it is feasible to transmit electricity over a network cable using two of its wire pairs, without the need for separate power cabling. Additionally, the integration of renewable solar energy systems in residential buildings can reduce dependence on traditional energy carriers. The use of sensors and wireless remote information processing can enhance the safety and efficiency of energy usage in buildings and the surrounding spaces.

Keywords: renewable energy, intelligentization, wireless sensors, fire alarm system

Procedia PDF Downloads 30
1528 Perceiving Interpersonal Conflict and the Big Five Personality Traits

Authors: Emily Rivera, Toni DiDona

Abstract:

The Big Five model is a hierarchical classification of personality traits that applies factor analysis to personality survey data in order to describe human personality using five broad dimensions: extraversion, agreeableness, conscientiousness, neuroticism, and openness (Fetvadjiev & Van de Vijer, 2015). Research shows that personality constructs underlie individual differences in processing conflict and interpersonal relations (Graziano et al., 1996). This research explores the understudied correlation between the Big Five personality traits and perceived interpersonal conflict in the workplace. It reviews social psychological literature on the Big Five personality traits within a social context and discusses organizational development journal articles on the perceived efficacy of conflict tactics and approaches to interpersonal relationships. The study also presents research undertaken on a survey group of 867 subjects over the age of 18 who were recruited by means of convenience sampling through social media, email, and text messaging. The central finding of this study is that only two of the Big Five personality traits had a significant correlation with perceiving interpersonal conflict in the workplace. Individuals who score higher on agreeableness and neuroticism perceive more interpersonal conflict in the workplace compared to those who score lower on each dimension. The relationship between both constructs is worthy of research due to its everyday frequency and unique individual psycho-social consequences. This multimethod research associated the Big Five personality dimensions with interpersonal conflict; its findings can be utilized to further understand social cognition, person perception, complex social behavior, and social relationships in the work environment.
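
A minimal sketch of the kind of trait-conflict correlation analysis described above is shown below, assuming the pandas and SciPy packages; the column names and the simulated responses are hypothetical stand-ins for the survey data.

```python
import numpy as np
import pandas as pd
from scipy import stats

rng = np.random.default_rng(0)
n = 867  # sample size reported in the abstract

# Hypothetical survey scores on 1-5 Likert-style scales.
df = pd.DataFrame({
    "agreeableness": rng.normal(3.5, 0.6, n),
    "neuroticism": rng.normal(2.8, 0.7, n),
    "perceived_conflict": rng.normal(2.5, 0.8, n),
})

for trait in ["agreeableness", "neuroticism"]:
    r, p = stats.pearsonr(df[trait], df["perceived_conflict"])
    print(f"{trait}: r = {r:.3f}, p = {p:.4f}")
```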

Keywords: five-factor model, interpersonal conflict, personality, The Big Five personality traits

Procedia PDF Downloads 131
1527 Effects of Fermentation Techniques on the Quality of Cocoa Beans

Authors: Monday O. Ale, Adebukola A. Akintade, Olasunbo O. Orungbemi

Abstract:

Fermentation, an important operation in the processing of cocoa beans, is now affected by the recent climate change across the globe. The major requirement for effective fermentation is the ability of the material used to retain sufficient heat for the required microbial activities. Apart from the effects of climate on the rate of heat retention, the material used for fermentation plays an important role. Most farmers still restrict fermentation activities to the use of traditional methods. Improving cocoa fermentation in this era of climate change makes it necessary to investigate other materials that can be suitable for cocoa fermentation. Therefore, the objective of this study was to determine the effects of fermentation techniques on the quality of cocoa beans. The materials used in this fermentation research were heap-leaves (traditional), stainless steel, plastic tin, plastic basket and wooden box. The period of fermentation varied from zero to 10 days. Physical and chemical tests were carried out on the samples to determine quality variables. The weight per bean varied from 1.0-1.2 g after drying across the samples, and the predominant color of the dry beans was brown, except for the samples fermented in stainless steel. The moisture content varied from 5.5-7%. The mineral content and the heavy metals decreased with increase in the fermentation period. A wooden box can conclusively be used as an alternative to heap-leaves, as there was no significant difference in the physical features of the samples fermented with the two methods. The use of a wooden box as an alternative for cocoa fermentation is therefore recommended for cocoa farmers.

Keywords: fermentation, effects, fermentation materials, period, quality

Procedia PDF Downloads 175
1526 Experimental Study on Mechanical Properties of Commercially Pure Copper Processed by Severe Plastic Deformation Technique-Equal Channel Angular Extrusion

Authors: Krishnaiah Arkanti, Ramulu Malothu

Abstract:

Experiments have been conducted to study the mechanical properties of commercially pure copper processed at room temperature by severe plastic deformation using equal channel angular extrusion (ECAE) through a die with a 90° channel angle, up to 3 passes by route Bc, i.e., rotating the sample in the same direction by 90° after each pass. ECAE is used to refine an existing coarse-grained structure into an ultra-fine, equiaxed grain structure with high-angle grain boundaries at the submicron level by introducing a large amount of shear strain, in the presence of hydrostatic pressure, into the material without changing the billet shape or dimensions. Mechanical testing plays an important role in evaluating fundamental properties of engineering materials as well as in developing new materials and in controlling the quality of materials for use in design and construction. Yield stress, ultimate tensile stress and ductility are structure-sensitive properties and vary with the structure of the material. Microhardness and tensile tests were carried out to evaluate the hardness, strength and ductility of the ECAE-processed materials. The results reveal that the strength and hardness of the commercially pure copper samples improved significantly without losing much ductility after each pass.

Keywords: equal channel angular extrusion, severe plastic deformation, copper, mechanical properties

Procedia PDF Downloads 156
1525 A Sectional Control Method to Decrease the Accumulated Survey Error of Tunnel Installation Control Network

Authors: Yinggang Guo, Zongchun Li

Abstract:

In order to decrease the accumulated survey error of the tunnel installation control network of a particle accelerator, a sectional control method is proposed. Firstly, the accumulation rule of positional error with the length of the control network is obtained by simulation calculation according to the shape of the tunnel installation control network. Then, the RMS of the horizontal positional precision of the tunnel backbone control network is taken as the threshold. When the accumulated error is bigger than the threshold, the tunnel installation control network should be divided into subsections reasonably. On each segment, the middle survey station is taken as the datum for an independent adjustment calculation. Finally, by taking the backbone control points as faint datums, a weighted partial parameter adjustment is performed with the adjustment results of each segment and the coordinates of the backbone control points. The subsections are jointed and unified into the global coordinate system in the adjustment process. An installation control network of a linac with a length of 1.6 km is simulated. The RMS of the positional deviation of the proposed method is 2.583 mm, and the RMS of the difference of positional deviation between adjacent points reaches 0.035 mm. Experimental results show that the proposed sectional control method can not only effectively decrease the accumulated survey error but also guarantee the relative positional precision of the installation control network, so it can be applied in the data processing of tunnel installation control networks, especially for large particle accelerators.
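
The weighted parameter adjustment at the heart of the jointing step can be sketched as a weighted least-squares solve, as below; the design matrix, observations, and weights are hypothetical and far smaller than a real 1.6 km network.

```python
import numpy as np

def weighted_adjustment(A, l, w):
    """Weighted least-squares estimate x = (A^T P A)^-1 A^T P l."""
    P = np.diag(w)                                  # weight matrix
    N = A.T @ P @ A                                 # normal matrix
    x = np.linalg.solve(N, A.T @ P @ l)
    residuals = l - A @ x
    return x, residuals

# Hypothetical 1D levelling-style example: 4 observations, 2 unknown heights.
A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [-1.0, 1.0],
              [1.0, -1.0]])
l = np.array([10.02, 12.51, 2.48, -2.50])           # observed values
w = np.array([4.0, 4.0, 1.0, 1.0])                  # backbone points weighted higher
x, v = weighted_adjustment(A, l, w)
print("Adjusted parameters:", x, "residuals:", v)
```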

Keywords: alignment, tunnel installation control network, accumulated survey error, sectional control method, datum

Procedia PDF Downloads 165
1524 Methylglyoxal Induced Glycoxidation of Human Low Density Lipoprotein: A Biophysical Perspective and Its Role in Diabetes and Periodontitis

Authors: Minhal Abidi, Moinuddin

Abstract:

Metabolic abnormalities induced by diabetes mellitus (DM) cause oxidative stress, which leads to the pathogenesis of complications associated with diabetes such as retinopathy, nephropathy, and periodontitis. The combination of glycation and oxidation, 'glycoxidation', occurs when oxidative reactions affect the early-state glycation products. Low-density lipoprotein (LDL) is prone to glycoxidative attack by sugars, and methylglyoxal (MGO), being a strong glycating agent, may have a severe impact on its structure and a consequent role in diabetes. Pro-inflammatory cytokines like IL1β and TNFα, produced by the action of gram-negative bacteria in periodontitis (PD), can in turn lead to insulin resistance. This work discusses modifications to LDL as a result of glycoxidation. The changes in the protein molecule have been characterized by various physicochemical techniques, and the immunogenicity of the modified molecules was also evaluated, as they presented neo-epitopes. The binding of antibodies present in diabetes patients to the native and glycated LDL has been evaluated. The role of modified epitopes in the generation of antibodies in diabetes and periodontitis has been discussed. The structural perturbations induced in LDL were analyzed by UV-Vis, fluorescence, circular dichroism and FTIR spectroscopy, molecular docking studies, thermal denaturation studies, the Thioflavin T assay, isothermal titration calorimetry, and the comet assay. MALDI-TOF analysis was performed, and ketoamine moieties, carbonyl content and HMF content were also quantitated in native and glycated LDL. IL1β and TNFα levels were also measured in type 2 DM and PD patients. We report increased carbonyl content, ketoamine moieties and HMF content in glycated LDL as compared to the native analogue. The results substantiate that, in the hyperglycemic state, MGO modification of LDL causes structural perturbations making the protein antigenic, which could obstruct normal physiological functions and might contribute to the development of secondary complications such as periodontitis in diabetic patients.

Keywords: advanced glycation end products, diabetes mellitus, glycation, glycoxidation, low density lipoprotein, periodontitis

Procedia PDF Downloads 171
1523 Artificial Intelligence-Based Approaches for Task Offloading, Resource Allocation and Service Placement of Internet of Things Applications: State of the Art

Authors: Fatima Z. Cherhabil, Mammar Sedrati, Sonia-Sabrina Bendib

Abstract:

In order to support the continued growth and the critical latency requirements of IoT applications, and to overcome various obstacles of traditional data centers, mobile edge computing (MEC) has emerged as a promising solution that extends cloud data processing and decision-making to edge devices. By adopting a MEC structure, IoT applications can be executed locally, on an edge server, on different fog nodes, or in distant cloud data centers. However, we are often faced with optimizing conflicting criteria, such as minimizing the energy consumption of the limited local capabilities (in terms of CPU, RAM, storage, and bandwidth) of mobile edge devices while keeping performance high (reducing response time, increasing throughput and service availability) at the same time. Achieving one goal may affect the other, making task offloading (TO), resource allocation (RA), and service placement (SP) complex processes, and studying the trade-off between conflicting criteria is a nontrivial multi-objective optimization problem. The paper provides a survey of recent multi-objective optimization (MOO) approaches to TO, SP, and RA used in edge computing environments, particularly artificial intelligence (AI) based ones, to satisfy various objectives, constraints, and dynamic conditions related to IoT applications.
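
As a toy illustration of the energy-latency trade-off behind these MOO formulations, the sketch below scores local execution against offloading with a weighted sum of normalized objectives; all task parameters, rates, and power figures are hypothetical, and the AI-based approaches surveyed in the paper are far more elaborate.

```python
def offload_decision(cycles, data_bits, w_energy=0.5, w_latency=0.5):
    """Pick local vs. edge execution by a weighted sum of energy and latency."""
    f_local, f_edge = 1e9, 10e9          # CPU cycles/s (assumed)
    p_local, p_tx = 0.9, 0.3             # W: local computing vs. radio transmit
    rate = 20e6                          # uplink bits/s (assumed)

    t_local = cycles / f_local
    e_local = p_local * t_local
    t_edge = data_bits / rate + cycles / f_edge
    e_edge = p_tx * data_bits / rate     # device only pays for transmission

    def score(t, e):
        return (w_latency * t / max(t_local, t_edge) +
                w_energy * e / max(e_local, e_edge))

    return "offload" if score(t_edge, e_edge) < score(t_local, e_local) else "local"

print(offload_decision(cycles=5e8, data_bits=2e6))
```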

Keywords: mobile edge computing, multi-objective optimization, artificial intelligence approaches, task offloading, resource allocation, service placement

Procedia PDF Downloads 86
1522 3D Object Retrieval Based on Similarity Calculation in 3D Computer Aided Design Systems

Authors: Ahmed Fradi

Abstract:

Recent technological advances in the acquisition, modeling, and processing of three-dimensional (3D) object data have led to the creation of models stored in huge databases, which are used in various domains such as computer vision, augmented reality, the game industry, medicine, CAD (computer-aided design), 3D printing, etc. On the other hand, industry currently benefits from powerful modeling tools enabling designers to easily and quickly produce 3D models. The great ease of acquisition and modeling of 3D objects makes it possible to create large 3D model databases, which then become difficult to navigate. Therefore, the indexing of 3D objects appears as a necessary and promising solution to manage this type of data, to extract model information, to retrieve an existing model, or to calculate the similarity between 3D objects. The objective of the proposed research is to develop a framework allowing easy and fast access to 3D objects in a CAD model database, with a specific indexing algorithm to find objects similar to a reference model. Our main objectives are to study existing methods of similarity calculation for 3D objects (essentially shape-based methods), specifying the characteristics of each method as well as the differences between them, and then to propose a new approach for indexing and comparing 3D models that is suitable for our case study and is based on some of the previously studied methods. Our proposed approach is finally illustrated by an implementation and evaluated in a professional context.
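
A compact example of shape-based similarity is sketched below using the classical D2 shape-distribution descriptor (a histogram of pairwise distances between surface samples) compared with an L1 distance; the random point clouds stand in for sampled CAD models, and this is only one of the shape-based methods such a survey would cover.

```python
import numpy as np

def d2_descriptor(points, bins=32, n_pairs=5000, rng=None):
    """D2 shape distribution: histogram of distances between random point pairs."""
    rng = rng or np.random.default_rng(0)
    i = rng.integers(0, len(points), n_pairs)
    j = rng.integers(0, len(points), n_pairs)
    d = np.linalg.norm(points[i] - points[j], axis=1)
    d /= d.mean()                                    # scale-normalize the model
    hist, _ = np.histogram(d, bins=bins, range=(0, 3), density=True)
    return hist / hist.sum()

def similarity(desc_a, desc_b):
    """Similarity in [0, 1] from the L1 distance between two descriptors."""
    return 1.0 - 0.5 * np.abs(desc_a - desc_b).sum()

# Hypothetical "models": points on a unit sphere and points filling a cube.
rng = np.random.default_rng(1)
sphere = rng.normal(size=(2000, 3))
sphere /= np.linalg.norm(sphere, axis=1, keepdims=True)
cube = rng.uniform(-1, 1, size=(2000, 3))
print("sphere vs cube:", round(similarity(d2_descriptor(sphere), d2_descriptor(cube)), 3))
```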

Keywords: CAD, 3D object retrieval, shape based retrieval, similarity calculation

Procedia PDF Downloads 237
1521 The Effect of Alkaline Treatment on Tensile Strength and Morphological Properties of Kenaf Fibres for Yarn Production

Authors: A. Khalina, K. Shaharuddin, M. S. Wahab, M. P. Saiman, H. A. Aisyah

Abstract:

This paper investigates the effect of alkali treatment on the mechanical properties of kenaf (Hibiscus cannabinus) fibre for the development of yarn. Two different fibre sources were used for the yarn production. Kenaf fibres were treated with sodium hydroxide (NaOH) at concentrations of 3, 6, 9, and 12% prior to the fibre opening process and tested for their tensile strength and Young's modulus. Then, the selected fibres were introduced to a fibre opener at three different opening processing parameters, namely the speeds of the roller feeder, small drum, and big drum. The diameter, surface morphology, and durability of the fibres towards the machine were characterized. The results show that the concentration of NaOH used has a considerable effect on the fibre mechanical properties. In this study, the tensile and modulus properties of the treated fibres of both types improved significantly compared to the untreated fibres, especially at the optimum level of 6% NaOH. It is also interesting to highlight that 6% NaOH is the optimum concentration for the alkaline treatment. The untreated fibres and the fibres treated at 6% NaOH were then introduced to the fibre opener, and it was found that the treated fibre produced a larger fibre diameter with better surface morphology compared to the untreated fibre. A higher speed during opening was found to produce a higher yield of opened kenaf fibres.

Keywords: alkaline treatment, kenaf fibre, tensile strength, yarn production

Procedia PDF Downloads 225
1520 Mobile Traffic Management in Congested Cells Using Fuzzy Logic

Authors: A. A. Balkhi, G. M. Mir, Javid A. Sheikh

Abstract:

To cater to the demands of increasing traffic and new applications, cellular mobile networks face changes in infrastructure deployment that make them heterogeneous. To reduce processing overhead, the densely deployed cells require smart behavior, with self-organizing capabilities and high adaptation to their neighborhood. We propose the self-organized sharing of unused resources, usually the excess unused channels of neighbouring cells, with densely populated cells to reduce handover failure rates. The neighboring cells share unused channels after fulfilling a conditional candidature criterion based on threshold values, so that they do not themselves suffer channel starvation in case of any abrupt change in traffic pattern. The cells are classified as 'red', 'yellow', or 'green' as per the available channels in the cell, which is governed by the traffic pattern and thresholds. To combat the deficiency of channels in a red cell, the migration of unused channels from under-loaded cells, hierarchically from the qualified candidate neighboring cells, is explored. The resources are returned when the congested cell is again capable of self-contained traffic management. In either case, conditional sharing of resources is executed for enhanced traffic management, so that User Equipment (UE) is provided uninterrupted services with high Quality of Service (QoS). The fuzzy logic-based simulation results show that the proposed algorithm efficiently improves the rate of successful handoffs.
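
The red/yellow/green classification can be illustrated with simple crisp thresholds, as in the sketch below; the occupancy thresholds and channel counts are hypothetical, and the paper's actual scheme uses fuzzy membership functions rather than hard cut-offs.

```python
def classify_cell(used, capacity, yellow=0.7, red=0.9):
    """Label a cell by channel occupancy: green, yellow, or red."""
    load = used / capacity
    if load >= red:
        return "red"
    return "yellow" if load >= yellow else "green"

def donors_for(neighbours, reserve=0.2):
    """Under-loaded neighbours that can lend channels while keeping a reserve."""
    lenders = []
    for name, used, cap in neighbours:
        spare = cap - used - int(reserve * cap)      # keep a safety margin
        if classify_cell(used, cap) == "green" and spare > 0:
            lenders.append((name, spare))
    return sorted(lenders, key=lambda x: -x[1])      # most spare channels first

neighbours = [("A", 20, 60), ("B", 50, 60), ("C", 10, 40)]
print(classify_cell(57, 60))            # 'red' -> needs borrowed channels
print(donors_for(neighbours))           # candidate lenders, best first
```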

Keywords: candidate cell, channel sharing, fuzzy logic, handover, small cells

Procedia PDF Downloads 99
1519 Client Hacked Server

Authors: Bagul Abhijeet

Abstract:

Background: The client-server model is the backbone of today's internet communication, in which a normal user cannot have control over a particular website or server. By exploiting the same processing model, one can gain unauthorized access to a particular server. In this paper, we discuss an application scenario of hacking a simple website or server through unauthorized access to the server database. This application autonomously takes direct access to a simple website or server and retrieves the essential information maintained by the administrator. In this system, the IP address of the server is given as input to retrieve the user-id and password of the server. This breaks the administrative security of the server and acquires control of the server database, whereas a virus helps to bypass server security by crashing the whole server. Objective: To control malicious attacks, protect government websites, and uncover illegal hacker activity. Results: After implementing different hacking as well as non-hacking techniques, this system hacks simple websites with normal security credentials. It provides access to the server database and allows the attacker to perform database operations from a client machine. Experiments with this application on different servers provided satisfactory results, as required. Conclusion: In this paper, we have presented an approach to hacking a server that includes both hacking and non-hacking methods. These algorithms and methods provide an efficient way to access a server database. Breaking network security in this way allows the introduction of a new and better security framework. The term 'hacking' should not only be considered for its illegal connotations but should also be used to strengthen our global network.

Keywords: hacking, vulnerabilities, dummy request, virus, server monitoring

Procedia PDF Downloads 228
1518 A Systematic Approach to Mitigate the Impact of Increased Temperature and Air Pollution in Urban Settings

Authors: Samain Sabrin, Joshua Pratt, Joshua Bryk, Maryam Karimi

Abstract:

Globally, extreme heat events have led to a surge in the number of heat-related mortalities. These incidents are further exacerbated in high-density population centers due to the Urban Heat Island (UHI) effect. A variety of anthropogenic activities, such as unsupervised land surface modification, expansion of impervious areas, and lack of vegetation, contribute to an increase in the amount of heat flux trapped by the urban canopy, which intensifies the UHI effect. This project aims to propose a systematic approach to measure the impact of air quality and increased temperature based on urban morphology in selected metropolitan cities. This project will measure the impact of the built environment for urban and regional planning using human-biometeorological evaluations (mean radiant temperature, Tmrt). We utilized the RayMan model (capable of calculating the short- and long-wave radiation fluxes affecting the human body) to estimate the Tmrt in an urban environment, incorporating the location and height of buildings and trees, as a supplemental tool in urban planning and street design. Our current results suggest a strong correlation between building height and increased surface temperature in megacities. This model will help to: 1. quantify the impacts of the built environment and surface properties on surrounding temperature; 2. identify priority urban neighborhoods by analyzing Tmrt and air quality data at pedestrian level; 3. characterize the need for urban green infrastructure or better urban planning, maximizing the cooling benefit from existing Urban Green Infrastructure (UGI); and 4. develop a hierarchy of streets for new UGI integration and propose new UGI based on site characteristics and cooling potential.

Keywords: air quality, heat mitigation, human-biometeorological indices, increased temperature, mean radiant temperature, radiation flux, sustainable development, thermal comfort, urban canopy, urban planning

Procedia PDF Downloads 112
1517 Characterization of Ethanol-Air Combustion in a Constant Volume Combustion Bomb Under Cellularity Conditions

Authors: M. Reyes, R. Sastre, P. Gabana, F. V. Tinaut

Abstract:

In this work, an optical characterization of laminar ethanol-air combustion is presented in order to investigate the origin of the instabilities developed during combustion, the onset of the cellular structure, and the laminar burning velocity. Experimental tests of ethanol-air mixtures have been performed in an optical cylindrical constant volume combustion bomb equipped with a Schlieren technique to record the flame development and the wrinkling of the flame front surface. With this procedure, it is possible to obtain the flame radius and to characterize the time at which the instabilities become visible through the apparition of cells and the development of the cellular structure. Ethanol is an aliphatic alcohol with interesting characteristics for use as a fuel in internal combustion engines and can be biologically synthesized from biomass. The laminar burning velocity is an important parameter used in simulations to obtain the turbulent flame speed, whereas the flame front structure and the instabilities developed during combustion are important to understand the transition to turbulent combustion and to characterize the increase of the flame propagation speed in premixed flames. The cellular structure is spontaneously generated by volume forces and by diffusional-thermal and hydrodynamic instabilities. Many authors have studied the combustion of ethanol-air and of mixtures of ethanol with other fuels. However, there is a lack of works that investigate the instabilities and the development of a cellular structure in ethanol flames; only a few works have characterized the ethanol-air combustion instabilities in spherical flames. In the present work, a parametric study is made by varying the fuel/air equivalence ratio (0.8-1.4), initial pressure (0.15-0.3 MPa) and initial temperature (343-373 K), using an I-optimal design of experiments. In rich mixtures, it is possible to distinguish the cellular structure formed by the hydrodynamic effect from that formed by the thermo-diffusive effect. Results show that ethanol-air flames tend to stabilize as the equivalence ratio decreases in lean mixtures and develop a cellular structure with increasing initial pressure and temperature.

Keywords: ethanol, instabilities, premixed combustion, schlieren technique, cellularity

Procedia PDF Downloads 45
1516 Music Training as an Innovative Approach to the Treatment of Language Disabilities

Authors: Jonathan Bolduc

Abstract:

Studies have demonstrated the effectiveness of music training approaches in helping children with language disabilities. Because music is closely associated with a number of cognitive functions, including language, it has been hypothesized that musical skills transfer to other domains. Research suggests that music training strengthens basic auditory processing skills in dyslexic children and may ameliorate phonological deficits. Furthermore, music instruction has the particular advantage of being non-literacy-based, thus removing the frustrations that can be associated with reading and writing activities among children with specific learning disabilities. In this study, we assessed the effect of implementing an intensive music program on the development of language skills (phonological and reading) in 4- to 9-year-old children. Seventeen children (N=17) participated in the study. The experiment took place over 6 weeks in a controlled environment. Eighteen lessons of 40 minutes were offered during this period by two music specialists. The Dalcroze, Orff, and Kodaly approaches were used. A series of qualitative measures were implemented to document the contribution of music training to this population. The data are currently being analyzed. The first results show that learning music seems to significantly improve verbal memory. We already know that language disabilities are considered one of the main causes of school dropout as well as of later professional and social failure. We aim to corroborate that an integrated music education program can provide children with language disabilities with the same opportunities to develop and succeed in school as their classmates. Scientifically, the results will contribute to advancing knowledge by identifying the most effective music education strategies to improve the overall development of children worldwide.

Keywords: music education, music, art education, language disabilities

Procedia PDF Downloads 198
1515 Study of Error Analysis and Sources of Uncertainty in the Measurement of Residual Stresses by X-Ray Diffraction

Authors: E. T. Carvalho Filho, J. T. N. Medeiros, L. G. Martinez

Abstract:

Residual stresses are self-equilibrating stresses in a rigid body that act on the microstructure of the material without the application of an external load. They are elastic stresses and can be induced by mechanical, thermal and chemical processes, causing a deformation gradient in the crystal lattice and favoring premature failure in mechanical components. The search for measurements with good reliability has been of great importance for the manufacturing industries. Several methods are able to quantify these stresses according to physical principles and the response of the mechanical behavior of the material. The X-ray diffraction technique is one of the most sensitive techniques to small variations of the crystalline lattice, since the X-ray beam interacts with the interplanar distance. Being very sensitive, the technique is also susceptible to variations in the measurements, requiring a study of the factors that influence the final result. Instrumental and operational factors, form deviations of the samples, and the analysis geometry are some variables that need to be considered and analyzed in order to obtain the true measurement. The aim of this work is to analyze the sources of error inherent to the residual stress measurement process by the X-ray diffraction technique, making an interlaboratory comparison to verify the reproducibility of the measurements. In this work, two specimens were machined, differing from each other by the surface finishing: grinding and polishing. Additionally, iron powder with a particle size of less than 45 µm was selected to serve as a reference for the tests (as recommended by the ASTM E915 standard). To verify the deviations caused by the equipment, these specimens were positioned and, under the same analysis conditions, seven measurements were carried out at 11 Ψ tilts. To verify sample positioning errors, seven measurements were performed, repositioning the sample for each measurement. To check geometry errors, the measurements were repeated for the Bragg-Brentano and parallel-beam geometries. In order to verify the reproducibility of the method, the measurements were performed in two different laboratories with different equipment. The results were statistically analyzed and the errors quantified.
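
For context, the sketch below shows the classical sin²ψ evaluation used in X-ray residual stress analysis, fitting the measured lattice spacing against sin²ψ and converting the slope to a stress; the elastic constants and d-spacing values are hypothetical and not the measurements of this study.

```python
import numpy as np

def sin2psi_stress(psi_deg, d_spacing, d0, E=210e9, nu=0.29):
    """Residual stress from the slope of d(psi) vs sin^2(psi) (sin^2 psi method)."""
    x = np.sin(np.radians(psi_deg)) ** 2
    slope, _ = np.polyfit(x, d_spacing, 1)
    return (E / (1.0 + nu)) * slope / d0             # Pa, biaxial stress assumption

# Hypothetical measurements at 11 psi tilts (angstrom), ferrite {211} reflection.
psi = np.linspace(0, 45, 11)
d0 = 1.1702
d = d0 * (1 + 6e-4 * np.sin(np.radians(psi)) ** 2)   # synthetic tensile trend
print("sigma =", round(sin2psi_stress(psi, d, d0) / 1e6, 1), "MPa")
```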

Keywords: residual stress, x-ray diffraction, repeatability, reproducibility, error analysis

Procedia PDF Downloads 146
1514 MIMIC: A Multi Input Micro-Influencers Classifier

Authors: Simone Leonardi, Luca Ardito

Abstract:

Micro-influencers are effective elements in the marketing strategies of companies and institutions because of their capability to create a hyper-engaged audience around a specific topic of interest. In recent years, many scientific approaches and commercial tools have handled the task of detecting this type of social media user. These strategies adopt solutions ranging from rule-based machine learning models to deep neural networks and graph analysis on text, images, and account information. This work compares the existing solutions and proposes an ensemble method to generalize them across different input data and social media platforms. The deployed solution combines deep learning models on unstructured data with statistical machine learning models on structured data. We retrieve both social media account information and multimedia posts from Twitter and Instagram. These data are mapped into feature vectors for an eXtreme Gradient Boosting (XGBoost) classifier. Sixty different topics have been analyzed to build a rule-based gold standard dataset and to compare the performance of our approach against baseline classifiers. We prove the effectiveness of our work by comparing the accuracy, precision, recall, and F1 score of our model across different configurations and architectures. We obtained an accuracy of 0.91 with our best-performing model.
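
A minimal sketch of the structured-data half of such an ensemble is shown below, assuming the xgboost and scikit-learn packages; the three account-level features and the synthetic labels are hypothetical placeholders for the real Twitter/Instagram feature vectors.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score, f1_score
from xgboost import XGBClassifier

rng = np.random.default_rng(42)
n = 2000
# Hypothetical account features: followers, engagement rate, topic-post ratio.
X = np.column_stack([
    rng.lognormal(8, 1.5, n),          # follower count
    rng.beta(2, 20, n),                # engagement rate
    rng.uniform(0, 1, n),              # share of posts on the target topic
])
# Synthetic label: micro-influencer if engaged, topic-focused, mid-sized audience.
y = ((X[:, 1] > 0.08) & (X[:, 2] > 0.5) & (X[:, 0] < 1e5)).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
clf = XGBClassifier(n_estimators=200, max_depth=4, learning_rate=0.1,
                    eval_metric="logloss")
clf.fit(X_tr, y_tr)
pred = clf.predict(X_te)
print("accuracy:", round(accuracy_score(y_te, pred), 3),
      "f1:", round(f1_score(y_te, pred), 3))
```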

Keywords: deep learning, gradient boosting, image processing, micro-influencers, NLP, social media

Procedia PDF Downloads 146
1513 Cognitive and Behavioral Disorders in Patients with Precuneal Infarcts

Authors: F. Ece Cetin, H. Nezih Ozdemir, Emre Kumral

Abstract:

Ischemic stroke of the precuneal cortex (PC) alone is extremely rare. This study aims to evaluate the clinical, neurocognitive, and behavioural characteristics of isolated PC infarcts. We assessed neuropsychological and behavioral findings in 12 patients with an isolated PC infarct among 3800 patients with ischemic stroke. To determine the most frequently affected brain locus, we first overlapped the ischemic areas of patients with specific cognitive disorders and of patients without specific cognitive disorders. Secondly, we compared both overlap maps using the 'subtraction plot' function of MRIcroGL. Patients showed various types of cognitive disorders. All patients experienced more than one category of cognitive disorder, except for two patients with only one cognitive disorder. Lesion topographical analysis showed that damage within the anterior precuneal region might lead to consciousness disorders (25%), self-processing impairment (42%), and visuospatial disorders (58%), whereas lesions in the posterior precuneal region caused episodic and semantic memory impairment (33%). The whole precuneus is involved in at least one body awareness disorder. The cause of the stroke was cardioembolism in 5 patients (42%), large artery disease in 3 (25%), and unknown in 4 (33%). This study showed a wide variety of neuropsychological and behavioural disorders in patients with precuneal infarct. Future studies are needed to achieve a proper definition of the function of the precuneus in relation to the extended cortical areas. Precuneal cortex infarcts were found to suggest a source of embolism from the large arteries or the heart.

Keywords: cognition, pericallosal artery, precuneal cortex, ischemic stroke

Procedia PDF Downloads 105
1512 High-Speed Imaging and Acoustic Measurements of Dual-frequency Ultrasonic Processing of Graphite in Water

Authors: Justin Morton, Mohammad Khavari, Abhinav Priyadarshi, Nicole Grobert, Dmitry G. Eskin, Jiawei Mi, Kriakos Porfyrakis, Paul Prentice

Abstract:

Ultrasonic cavitation is used for various processes and applications. Recently, ultrasonic-assisted liquid-phase exfoliation has been implemented to produce two-dimensional nanomaterials. Depending on parameters such as the input transducer power and the operational frequency used to induce the cavitation, bubble dynamics can be controlled and optimised. Using ultra-high-speed imaging and acoustic pressure measurements, a dual-frequency system and its effect on bubble dynamics were investigated. A high-frequency transducer (1.174 MHz) showed that bubble fragments and satellite bubbles induced by a low-frequency transducer (24 kHz) were able to extend their lifecycle. In addition, this combination of ultrasonic frequencies generated higher acoustic emissions (∼24%) than the sum of the individual transducers. The dual-frequency system also produced an increase in cavitation zone size of ∼3 times compared to the low-frequency sonotrode. Furthermore, the cavitation bubbles induced by the high frequency were shown to oscillate rapidly, although they remained stable and did not collapse transiently, even in the presence of a low-pressure field. Finally, the spatial distribution of satellite and fragment bubbles from the sonotrode was shown to increase, extending the active cavitation zone. These observations elucidate the benefits of using a dual-frequency system for generating nanomaterials with the aid of ultrasound in deionised water.

Keywords: dual-frequency, cavitation, bubble dynamics, graphene

Procedia PDF Downloads 169
1511 Neural Changes Associated with Successful Antidepressant Treatment in Adolescents with Major Depressive Disorder

Authors: Dung V. H. Pham, Kathryn Cullen

Abstract:

Introduction: 40% of adolescents with major depressive disorder (MDD) are unresponsive to first-line antidepressant treatment. The neural mechanisms underlying treatment-responsive and treatment-resistant depression in adolescents are unclear. The amygdala is important for emotion processing and has been implicated in mood disorders. Past research has shown abnormal amygdala connectivity in adolescents with MDD. This research studies changes in amygdala resting-state functional connectivity (RSFC) to find neural correlates of successful antidepressant treatment. Methods: Thirteen adolescents aged 12-19 underwent resting-state fMRI before and after 8 weeks of antidepressant treatment and completed the BDI-II at each scan. A whole-brain approach using anatomically defined amygdala ROIs (1) identified brain regions that are highly synchronous with the amygdala and (2) correlated neural changes with changes in overall depression and in specific symptom clusters within depression. Results: Some neural correlates were common across domains: (1) decreased amygdala RSFC with the default mode network (posterior cingulate, precuneus) is associated with improvement in overall depression and many symptom clusters, and (2) increased amygdala RSFC with the fusiform gyrus is associated with symptom improvement across many symptom clusters. We also found unique neural changes associated with symptom improvement in each symptom cluster. Conclusion: This is the first preliminary study that looks at neural correlates of antidepressant treatment response for overall depression as well as for different clusters of depressive symptoms. The findings suggest both overlapping and distinct neural mechanisms underlying improvement in each symptom cluster within depression. Some of the brain regions found have also been implicated in MDD among adults in the previous literature.

Keywords: depression, adolescents, fMRI, antidepressants

Procedia PDF Downloads 235