Search results for: primal-dual interior point method
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 22387

12907 A Novel Approach to Secret Communication Using the Douglas-Peucker Algorithm

Authors: R. Kiruthika, A. Kannan

Abstract:

Steganography is the problem of hiding secret messages in 'innocent-looking' public communication so that the presence of the secret message cannot be detected. This paper introduces steganographic security in terms of computational indistinguishability from a channel of probability distributions on cover messages. The method first splits the cover image into two separate blocks using the Douglas-Peucker algorithm. The text message and the image are then hidden in the Least Significant Bit (LSB) of the cover image.
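As an editorial illustration of the LSB step the abstract relies on (the Douglas-Peucker block split is specific to the paper and is not reproduced here), a minimal Python sketch of generic LSB embedding and extraction, with all names invented:

```python
import numpy as np

def lsb_embed(cover, bits):
    """Embed a bit stream into the least significant bit of each pixel."""
    flat = cover.flatten()                       # copy; cover is untouched
    flat[: bits.size] = (flat[: bits.size] & 0xFE) | bits
    return flat.reshape(cover.shape)

def lsb_extract(stego, n_bits):
    """Recover the first n_bits from the pixel LSBs."""
    return stego.flatten()[:n_bits] & 1

# Hide the ASCII message "Hi" in a toy 4x4 grayscale cover image.
cover = np.arange(16, dtype=np.uint8).reshape(4, 4)
message_bits = np.unpackbits(np.frombuffer(b"Hi", dtype=np.uint8))
stego = lsb_embed(cover, message_bits)
recovered = np.packbits(lsb_extract(stego, message_bits.size)).tobytes()
print(recovered)  # b'Hi'
```

Each pixel changes by at most one grey level, which is what makes LSB embedding visually 'innocent-looking'.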

Keywords: steganography, LSB, embedding, Douglas-Peucker algorithm

Procedia PDF Downloads 333
12906 An Approach to Wind Turbine Modeling for Increasing Its Efficiency

Authors: Rishikesh Dingari, Sai Kiran Dornala

Abstract:

In this paper, a simple method of achieving maximum power with a mechanical energy transmission device (METD) integrated with an induction generator is proposed. The functioning of the METD is explained, and the dynamic response of the system to a step input is plotted. The induction generator is operated in self-excited mode with an excitation capacitor at the stator. Voltage and current are observed when it is linked to the METD.

Keywords: mechanical energy transmitting device (METD), self-excited induction generator, wind turbine, hydraulic actuators

Procedia PDF Downloads 318
12905 Preferred Service Delivery Options for Female Sex Workers in the Riverine Area of Lome, Togo

Authors: Gbone Akou Sophie

Abstract:

Lome State in Togo is considered to have the highest HIV prevalence in Togo according to NAIIS 2023, with a prevalence of 5.5%. Female Sex Workers (FSW) are one of the most vulnerable populations, and they are vital in HIV programming. They have the highest HIV prevalence compared to other groups such as HRM, PWID and transgender people in Lome State, Togo. Evidence from the Integrated Biological and Behavioral Surveillance Survey shows an increasing burden of HIV infection among FSW, from 13.7% in 2018 to 17.2% in 2020 and now 22.9% in 2021, showing that their HIV prevalence has been rising over time. The vulnerability of FSW in the riverine areas of Lome is heightened by cultural and economic issues, where sex is exchanged for commodities with cross-border traders, as well as by limited access to HIV prevention information. Methods: A cross-sectional study recruited 120 FSW from the two riverine LGAs of Agoe and Kpehenou in Lome State, using both snowballing and simple random sampling techniques. A semi-structured questionnaire was used as the instrument for data collection among the 120 FSW respondents. Additional information was elicited from 10 FSW key opinion leaders and community members through in-depth interviews (IDI). Results: 44 (36%) of respondents were willing to receive regular HIV care and services, as well as STI check-up visits, at any service point. However, 47 (40%) were willing to receive services at private facilities alone, 10 (8%) at public facilities, and 6 (5%) in their homes rather than at a health facility. 13 (11%) were also willing to have peers assist in getting HIV testing services. Conclusion: An integrated differentiated model of care for HIV services helps improve HIV service uptake in the FSW community, especially in hard-to-reach riverine areas, which will further contribute to epidemic control.
Targeted HIV information should also be designed to suit the learning needs of hard-to-reach communities like the riverine areas, and more peer educators should be engaged to ensure that information and other HIV services reach the riverine communities.

Keywords: female sex workers (FSW), human immunodeficiency virus (HIV), prevalence, service delivery

Procedia PDF Downloads 35
12904 A Novel Probabilistic Strategy for Modeling Photovoltaic Based Distributed Generators

Authors: Engy A. Mohamed, Y. G. Hegazy

Abstract:

This paper presents a novel algorithm for modeling photovoltaic-based distributed generators for the purpose of optimal planning of distribution networks. The proposed algorithm utilizes the sequential Monte Carlo method in order to accurately account for the stochastic nature of photovoltaic-based distributed generators. The proposed algorithm is implemented in the MATLAB environment, and the results obtained are presented and discussed.
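The abstract gives no formulas, so the following is only a hedged sketch of one common Monte Carlo treatment of stochastic PV output: per-period irradiance is drawn from a Beta distribution (parameters invented here) and mapped to power, clipped at an assumed rating:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical PV unit parameters (not from the paper).
P_RATED = 100.0       # kW, rated output
G_STD = 1000.0        # W/m^2, standard test irradiance

def pv_power(irradiance):
    """Map irradiance to output power, clipped at the rated value."""
    return np.minimum(P_RATED * irradiance / G_STD, P_RATED)

# A common choice is to model per-period irradiance with a Beta
# distribution fitted to historical data; alpha and beta are invented.
alpha, beta = 2.0, 3.0
irradiance = rng.beta(alpha, beta, size=10_000) * G_STD

powers = pv_power(irradiance)
print(f"expected output: {powers.mean():.1f} kW")
```

Repeating such draws period by period gives the sequential variant; the resulting power distribution then feeds the planning model.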

Keywords: cumulative distribution function, distributed generation, Monte Carlo

Procedia PDF Downloads 556
12903 Lineup Optimization Model of Basketball Players Based on the Prediction of Recurrent Neural Networks

Authors: Wang Yichen, Haruka Yamashita

Abstract:

In recent years, decision making in sports, such as the selection of members for a game and the strategy of the game, based on analysis of accumulated sports data has been widely attempted. In the NBA basketball league, where the world's highest-level players gather, teams analyze data using various statistical techniques in order to win games. However, it is difficult to analyze per-play game data, such as ball tracking or the motion of players, because the situation of the game changes rapidly and the structure of the data is complicated. An analysis method for real-time game play data is therefore needed. In this research, we propose an analytical model for determining the optimal lineup composition using real-time play data, a task which is difficult for any coach. Because replacing the entire lineup is too complicated, the practical questions for player replacement are whether the lineup should be changed and whether a Small Ball lineup should be adopted. We therefore propose an analytical model for the optimal player selection problem based on Small Ball lineups. In basketball, scoring data accumulated for each play indicates a player's contribution to the game, and these data can be treated as a time series. In order to compare the importance of players in different situations and lineups, we combine an RNN (Recurrent Neural Network) model, which can analyze time-series data, with an NN (Neural Network) model, which can analyze the situation on the court, to build a model that predicts the score. This model is capable of identifying the current optimal lineup for different situations. We collected accumulated NBA data from 2019-2020 and applied the method to actual basketball play data to verify the reliability of the proposed model.
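The RNN-plus-NN fusion described above can be sketched purely at the level of shapes and data flow, with untrained random weights standing in for the paper's fitted model; all dimensions are invented:

```python
import numpy as np

rng = np.random.default_rng(0)

# All dimensions are invented; random weights stand in for the trained model.
SEQ_LEN, SCORE_DIM, HIDDEN, SITUATION_DIM = 12, 4, 8, 5
W_xh = rng.normal(size=(SCORE_DIM, HIDDEN)) * 0.1
W_hh = rng.normal(size=(HIDDEN, HIDDEN)) * 0.1
w_out = rng.normal(size=HIDDEN + SITUATION_DIM) * 0.1

def predict_score(score_seq, situation):
    """Encode the play-by-play scoring sequence with a vanilla RNN, then
    fuse the final hidden state with situation features (the NN branch)
    through a linear output layer to predict the score."""
    h = np.zeros(HIDDEN)
    for x_t in score_seq:                      # one recurrence per play
        h = np.tanh(x_t @ W_xh + h @ W_hh)
    return float(np.concatenate([h, situation]) @ w_out)

score_seq = rng.normal(size=(SEQ_LEN, SCORE_DIM))   # scoring data per play
situation = rng.normal(size=SITUATION_DIM)          # on-court situation
print(predict_score(score_seq, situation))
```

Comparing this prediction across candidate lineups (encoded in the situation features) is what would drive the lineup decision.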

Keywords: recurrent neural network, player lineup, basketball data, decision-making model

Procedia PDF Downloads 107
12902 Highway Waste Management in Zambia, Policy Preparedness and Remedies: The Case of the Great East Road

Authors: Floyd Misheck Mwanza, Paul Boniface Majura

Abstract:

The paper looked at highway/roadside waste generation, disposal, and the consequent environmental impacts. The dramatic increase in vehicles and paved roads in Zambia in the recent past has given rise to the indiscriminate disposal of litter that now poses a threat to health and the environment. Primary data were generated by carrying out oral interviews and field observations for a holistic and in-depth assessment of the environment, while secondary data were obtained through a desk review; information on the effects of roadside wastes on the environment was obtained from the relevant literature. The interviews were semi-structured, a purposive sampling method was adopted, and the data were analyzed descriptively. The findings showed that population growth and unplanned road expansion have exceeded expected limits in recent times, with a resultant poor system of roadside waste disposal. Roadside wastes, both biodegradable and non-biodegradable, are disposed of at the shoulders of major highways in temporary dumpsites and are never collected by the Road Development Agency (RDA). There is no organized highway-to-highway or street-to-street collection of wastes in Zambia by the RDA, the key organization responsible. The study revealed that roadside disposal of wastes has serious impacts on the environment. These include the physical nuisance of the wastes; the waste dumps also serve as hideouts for rodents and snakes, which are dangerous. Wastes are blown around by wind, making the environment filthy, and many are washed away by overland flow during heavy downpours, blocking drainage channels and subsequently leading to flooding. Many non-biodegradable wastes contain toxic chemicals which have serious implications for environmental sustainability and human health.
The paper therefore recommends that the Government and RDA provide proper orientation, put environmental laws in place for the general public, provide the necessary facilities, and arrange better methods of waste collection.

Keywords: biodegradable, disposal, environment, impacts

Procedia PDF Downloads 315
12901 Gene Expression in Left Ventricular Heart Tissue of Rats after 150 MeV Proton Irradiation

Authors: R. Fardid, R. Coppes

Abstract:

Introduction: In mediastinal radiotherapy, and to a lesser extent in total-body irradiation (TBI), radiation exposure may lead to the development of cardiac diseases. Radiation-induced heart disease is dose-dependent and is characterized by a loss of cardiac function associated with progressive degeneration of heart cells. We aimed to determine the in-vivo radiation effects on fibronectin, ColaA1, ColaA2, galectin and TGFb1 gene expression levels in the left ventricular heart tissue of rats after irradiation. Material and method: Four untreated adult Wistar rats served as the control group (group A). In group B, 4 adult Wistar rats were irradiated locally in the heart only with a single dose of 20 Gy from a 150 MeV proton beam. In the heart-plus-lung irradiation group (group C), 4 adult rats received the same heart irradiation plus lateral irradiation of 50% of the lung. At 8 weeks after irradiation, the animals were sacrificed and the left ventricle was dropped in liquid nitrogen for RNA extraction with the Absolutely RNA® Miniprep Kit (Stratagene, Cat. no. 400800). cDNA was synthesized using M-MLV reverse transcriptase (Life Technologies, Cat. no. 28025-013). Quantitative PCR was performed on a Bio-Rad iQ5 Real-Time PCR machine using the relative standard curve method. Results: Fibronectin gene expression in group C significantly increased compared to the control group, but showed no significant change in group B compared to group A. The mRNA expression levels of Cola1 and Cola2 did not show any significant changes between the normal and irradiated groups. Galectin expression significantly increased only in group C compared to group A. TGFb1 expression in group C showed a significantly greater enhancement than in group B, compared to group A. Conclusion: In summary, 20 Gy of proton exposure of heart tissue may lead to detectable damage to heart cells and may disturb their function as a component of the heart tissue structure at the molecular level.

Keywords: gene expression, heart damage, proton irradiation, radiotherapy

Procedia PDF Downloads 462
12900 Otherness of Roma in Inclusive Education of Roma Pupils in Slovakia

Authors: Bibiana Hlebova

Abstract:

The Slovak Republic is a democratic and plural society consisting of people differing in language and culture, and its citizens should be well prepared for the coexistence of multiple nations, nationalities and ethnic groups. Reflection on the culture, art and literature of the Roma minority has taken on a new dimension in Slovakia in the past two decades when it comes to the social, cultural and artistic integration of this ethnic group with the plural society. Non-Roma view Roma as a specific ethnic group with their own culture, language, customs and traditions, and social norms of coexistence that have retained the archetypal qualities of Roma identity (romipen) in real life as well as in the literary world. Roma characters in works of art are specific and distinguishable from other literary characters simply by being Roma; that is, being of a different origin and social status, they represent a different way of life and a distinctive hierarchy of values. The portrayal of Roma and the life of the Roma ethnic group in the most dominant genre of Roma literature for children and youth, the Roma fairy tale (paramisi), can work as a suitable means to learn about, accept and tolerate the otherness of Roma in the conditions of school inclusion of students from the Roma ethnic group, and to support their identification with their own ethnic group and its cultural traditions. The paper aims to point out not only the specific nature of Roma identity (romipen) through the selected Roma fairy tale (paramisa), Children of the Sun, but also the diversity of its uses in the educational process within the primary education of pupils at elementary schools advocating the philosophy of inclusive education.
Through suggestions for the multicultural, emotional, and language and communication education of pupils through work with the selected Roma fairy tale (paramisa), the author explores ways to overcome, in the education process, the issues stemming from the coexistence of Roma and non-Roma pupils, which are burdened with prejudice, intolerance, aggression and racism on both sides.

Keywords: inclusive education, otherness, Roma, Roma fairy tale, Roma identity

Procedia PDF Downloads 275
12899 A Simple Chemical Precipitation Method for Titanium Dioxide Nanoparticles Using Polyvinyl Pyrrolidone as a Capping Agent, and Their Characterization

Authors: V. P. Muhamed Shajudheen, K. Viswanathan, K. Anitha Rani, A. Uma Maheswari, S. Saravana Kumar

Abstract:

In this paper, a simple chemical precipitation route for the preparation of titanium dioxide nanoparticles, synthesized using titanium tetraisopropoxide as a precursor and polyvinyl pyrrolidone (PVP) as a capping agent, is reported. Differential Scanning Calorimetry (DSC) and Thermogravimetric Analysis (TGA) curves of the samples were recorded, and the phase transformation temperature from titanium hydroxide, Ti(OH)4, to titanium dioxide, TiO2, was investigated. The as-prepared Ti(OH)4 precipitate was annealed at 800°C to obtain TiO2 nanoparticles. The thermal, structural, morphological and textural characterization of the TiO2 nanoparticle samples was carried out by techniques including DSC-TGA, X-Ray Diffraction (XRD), Fourier Transform Infrared spectroscopy (FTIR), micro-Raman spectroscopy, UV-Visible absorption spectroscopy (UV-Vis), Photoluminescence spectroscopy (PL) and Field Emission Scanning Electron Microscopy (FESEM). DSC-TGA of the as-prepared precipitate confirmed a mass loss of around 30%. XRD results exhibited no diffraction peaks attributable to the anatase phase for the reaction products after solvent removal, indicating that the product is purely rutile. The vibrational frequencies of the two main absorption bands of the prepared samples are discussed from the results of the FTIR analysis. The formation of nanospheres with diameters of the order of 10 nm was confirmed by FESEM. The optical band gap was found using the UV-Visible spectrum, and a strong emission was observed in the photoluminescence spectra. The obtained results suggest that this method provides a simple, efficient and versatile technique for preparing TiO2 nanoparticles, with the potential to be applied to other systems for photocatalytic activity.
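The band-gap extraction from a UV-Vis spectrum mentioned above is commonly done with a Tauc plot; the following sketch uses synthetic data with an invented 3.0 eV gap to show the linear extrapolation for a direct allowed transition, and is not the authors' exact procedure:

```python
import numpy as np

rng = np.random.default_rng(5)

# Synthetic Tauc data for a direct allowed transition: (alpha*h*nu)^2 is
# proportional to (E - Eg) above the gap. The 3.0 eV gap is invented
# (rutile TiO2 is commonly quoted near 3.0 eV).
E_GAP_TRUE = 3.0
energy = np.linspace(2.5, 4.0, 200)                # photon energy, eV
tauc = np.clip(energy - E_GAP_TRUE, 0.0, None)     # (alpha*h*nu)^2, a.u.
tauc += rng.normal(0.0, 0.005, energy.size)        # measurement noise

# Fit the linear region well above the gap and extrapolate to zero:
# the intercept with the energy axis estimates the optical band gap.
mask = energy > 3.3
slope, intercept = np.polyfit(energy[mask], tauc[mask], 1)
e_gap_est = -intercept / slope
print(f"estimated band gap: {e_gap_est:.2f} eV")
```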

Keywords: TiO2 nanoparticles, chemical precipitation route, phase transition, Fourier Transform Infrared spectroscopy (FTIR), micro-Raman spectroscopy, UV-Visible absorption spectroscopy (UV-Vis), photoluminescence spectroscopy (PL), Field Emission Scanning Electron Microscopy (FESEM)

Procedia PDF Downloads 297
12898 Diagnosis of Choledocholithiasis with Endosonography

Authors: A. Kachmazova, A. Shadiev, Y. Teterin, P. Yartcev

Abstract:

Introduction: Biliary calculous disease still occupies the leading position among urgent diseases of the abdominal cavity, manifesting anywhere from an asymptomatic course to life-threatening states. The arsenal of diagnostic methods for choledocholithiasis is now quite wide: ultrasound, hepatobiliary scintigraphy (HBSG), magnetic resonance imaging (MRI), and endoscopic retrograde cholangiopancreatography (ERCP). Among them, transabdominal ultrasound (TA ultrasound) is the most accessible and routine diagnostic method. ERCP is currently the "gold" standard for diagnosis and single-stage treatment of biliary tract obstruction. However, transpapillary techniques are accompanied by serious postoperative complications (post-manipulation pancreatitis (3-5%), bleeding after endoscopic papillosphincterotomy (2%), cholangitis (1%)), with a lethality of 0.4%. HBSG and MRI are also quite informative in the diagnosis of choledocholithiasis. However, the small size of concrements and their localization in the intrapancreatic and retroduodenal parts of the common bile duct significantly reduce the informativeness of all the diagnostic methods described above, which demands additional study of this problem. Materials and Methods: 890 patients with a diagnosis of cholelithiasis (calculous cholecystitis) were admitted to the Sklifosovsky Scientific Research Institute of Hospital Medicine in the period from August 2020 to June 2021. Of these, 115 had mechanical jaundice caused by concrements in the bile ducts. Results: A final EUS diagnosis was made in all patients (100.0%). In all patients in whom the diagnosis of choledocholithiasis was revealed or confirmed by EUS, ERCP was performed urgently (within two days of detection) as the X-ray operating room became available, and it confirmed the presence of concrements. All stones were removed by lithoextraction using a Dormia basket. The postoperative period in these patients had no complications.
Conclusions: EUS is the most informative and safe diagnostic method, allowing choledocholithiasis to be detected in the shortest time in patients with discrepancies between clinical-laboratory and instrumental diagnostic findings, which in turn helps in promptly deciding on further treatment tactics. We consider it reasonable to include EUS in the diagnostic algorithm for choledocholithiasis. Disclosure: Nothing to disclose.

Keywords: endoscopic ultrasonography, choledocholithiasis, common bile duct, concrement, ERCP

Procedia PDF Downloads 64
12897 Application of Electrochromic Glazing for Reducing Peak Cooling Loads

Authors: Ranojoy Dutta

Abstract:

HVAC equipment capacity has a direct impact on occupant comfort and the energy consumption of a building. Glazing gains, especially in buildings with a high window area, can be a significant contributor to the total peak load on the HVAC system, leading to over-sized systems that mostly operate at poor part-load efficiency. In addition, radiant temperature, which largely drives occupant comfort in glazed perimeter zones, is often not effectively controlled even though the HVAC is designed to meet the air temperature set-point. This is due to short-wave solar radiation transmitted through windows, which is not sensed by the thermostat until much later, when the thermal mass in the room releases the absorbed solar heat to the indoor air. The implication of this phenomenon is increased cooling energy despite poor occupant comfort. EC glazing can largely eliminate direct solar transmission through windows, reducing the space cooling loads for the building and improving comfort for occupants near glazing. This paper reviews the exact mechanism by which EC glazing reduces the peak load under design-day conditions, leading to reduced cooling capacity versus regular high-performance glazing. Since glazing heat transfer only affects the sensible load, system sizing is evaluated both with and without the availability of a DOAS, to isolate the downsizing potential of the primary cooling equipment when outdoor air is conditioned separately. Given the dynamic nature of glazing gains due to the sun's movement, effective peak load mitigation with EC requires an automated control system that can predict solar movement and radiation levels so that the right tint state, with the appropriate SHGC, is utilized at any given time for a given facade orientation. Such an automated EC product is evaluated for a prototype commercial office model situated in four distinct climate zones.
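A rule-based version of such a tint controller can be sketched as follows; the tint states, SHGC values, and the 50 W/m² threshold are all invented for illustration and are not taken from the paper:

```python
import math

# Hypothetical EC tint states as (name, SHGC); values invented.
TINT_STATES = [("clear", 0.40), ("mid", 0.20), ("dark", 0.09), ("full", 0.03)]

def incident_direct(dni, sun_elev_deg, sun_azi_deg, facade_azi_deg):
    """Direct irradiance (W/m^2) on a vertical facade; 0 when the sun
    is behind the facade."""
    elev = math.radians(sun_elev_deg)
    rel_azi = math.radians(sun_azi_deg - facade_azi_deg)
    return max(dni * math.cos(elev) * math.cos(rel_azi), 0.0)

def select_tint(dni, sun_elev_deg, sun_azi_deg, facade_azi_deg):
    """Pick the lightest state that keeps transmitted direct gain below
    an invented 50 W/m^2 comfort threshold."""
    g = incident_direct(dni, sun_elev_deg, sun_azi_deg, facade_azi_deg)
    for name, shgc in TINT_STATES:
        if g * shgc <= 50.0:
            return name
    return TINT_STATES[-1][0]

# West facade (azimuth 270 deg) in mid-afternoon sun.
print(select_tint(dni=800, sun_elev_deg=35, sun_azi_deg=250, facade_azi_deg=270))
```

A production controller would replace the sun-angle inputs with predicted solar position and irradiance, as the abstract describes.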

Keywords: electrochromic glazing, peak sizing, thermal comfort, glazing load

Procedia PDF Downloads 109
12896 The Effect of MOOC-Based Distance Education on Academic Engagement and Its Components among Kerman University Students

Authors: Fariba Dortaj, Reza Asadinejad, Akram Dortaj, Atena Baziyar

Abstract:

The aim of this study was to determine the effect of distance education (based on MOOCs) on the components of academic engagement of Kerman PNU students. The research used a quasi-experimental design with single-stage cluster sampling (one class in the experimental group and one class in the control group). The statistical population consisted of students of Kerman Payam Noor University, from whom 40 were selected as the sample (20 students in the control group and 20 in the experimental group). To test the hypotheses, univariate analysis of covariance was used to offset the initial (pre-test) difference between the experimental group and the control group. The instrument used in this study was the academic engagement questionnaire of Zerang (2012), which contains cognitive, behavioral and motivational engagement components. After adjusting for the pre-test, the mean scores of the academic engagement components in the experimental group were higher than those in the control group on the post-test. The use of technology-based education in distance education was effective in increasing cognitive, motivational and behavioral engagement among students. The experimental variable, with an effect size of 0.26, predicted 26% of the variance in the cognitive engagement component; with an effect size of 0.47, it predicted 47% of the variance in the motivational engagement component; and with an effect size of 0.40, it predicted 40% of the variance in the behavioral engagement component. Teaching with technology (MOOCs) therefore has a positive impact on academic engagement and the academic performance of students in educational technology.
The results suggest that MOOC technology be used to enrich the teaching of other PNU courses.
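The pre-test adjustment described above (analysis of covariance) can be sketched on synthetic data; the group sizes match the study, but the scores and the +5 treatment effect are invented:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic pre-/post-test engagement scores, 20 students per group as in
# the study; the +5 treatment effect and score scales are invented.
n = 20
pre_c = rng.normal(50, 10, n)
post_c = pre_c + rng.normal(0, 5, n)               # control group
pre_e = rng.normal(50, 10, n)
post_e = pre_e + 5 + rng.normal(0, 5, n)           # MOOC group

# ANCOVA as a linear model: post = b0 + b1 * pre + b2 * group,
# so b2 is the group difference adjusted for the pre-test covariate.
pre = np.concatenate([pre_c, pre_e])
post = np.concatenate([post_c, post_e])
group = np.concatenate([np.zeros(n), np.ones(n)])  # 0 = control, 1 = MOOC
X = np.column_stack([np.ones(2 * n), pre, group])
b0, b1, b2 = np.linalg.lstsq(X, post, rcond=None)[0]

print(f"pre-test-adjusted group difference: {b2:.2f}")
```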

Keywords: educational technology, distance education, components of academic engagement, mooc technology

Procedia PDF Downloads 119
12895 Derivation of Bathymetry Data Using Worldview-2 Multispectral Images in Shallow, Turbid and Saline Lake Acıgöl

Authors: Muhittin Karaman, Murat Budakoglu

Abstract:

In this study, derivation of lake bathymetry was evaluated using high-resolution Worldview-2 multispectral images of the very shallow, hypersaline Lake Acıgöl, which does not have a stable water table due to wet-dry seasonal changes and industrial usage. Every year, a great part of the lake's water budget is consumed for industrial salt production in evaporation ponds, generally located on the south and north shores of Lake Acıgöl. Determination of water level changes through remote-sensing-based bathymetry studies is therefore of great importance for the sustainable management of the lake. While the water table varies by around 1 meter between the dry and wet seasons, dissolved ion concentration, salinity and turbidity also show clear differences between these two distinct seasonal periods. Simultaneously with the satellite data acquisition (June 9, 2013), a field study was conducted to collect salinity values, Secchi disk depths and turbidity levels. Maximum depth, Secchi disk depth and salinity were determined as 1.7 m, 0.9 m and 43.11 ppt, respectively. The eight-band Worldview-2 image was corrected for atmospheric effects with the ATCOR technique. For each sampling point in the image, the mean reflectance values in 1x1, 3x3, 5x5, 7x7, 9x9, 11x11, 13x13, 15x15, 17x17, 19x19, 21x21 and 51x51 pixel neighborhoods were calculated separately, and a separate image was derived for each matrix resolution. The relation between spectral values and depth was evaluated for each of these images. For the 1x1 matrix, correlation coefficients were 0.98, 0.96, 0.95 and 0.90 for 724 nm, 831 nm, 908 nm and 659 nm, respectively. The 15x15 matrix shows correlation values of 0.98, 0.97 and 0.97 for 724 nm, 908 nm and 831 nm, respectively, while the 51x51 matrix shows 0.98, 0.97 and 0.96 for 724 nm, 831 nm and 659 nm, respectively.
Comparison of all matrix resolutions indicates that the RedEdge band (724 nm) of the Worldview-2 satellite image has the best correlation with in-situ depth in the shallow, saline Lake Acıgöl.
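The per-point neighborhood averaging and depth correlation described above can be sketched as follows; the reflectance-depth model and all numbers are synthetic, so only the mechanics match the paper:

```python
import numpy as np

rng = np.random.default_rng(7)

# Synthetic single-band scene: reflectance decays with depth, plus sensor
# noise. The exponential model and all constants are invented.
depth = rng.uniform(0.2, 1.7, size=(60, 60))                # metres
band = np.exp(-1.5 * depth) + rng.normal(0.0, 0.01, (60, 60))

def neighborhood_mean(img, row, col, n):
    """Mean reflectance in an n x n pixel window centred on (row, col)."""
    h = n // 2
    return img[row - h : row + h + 1, col - h : col + h + 1].mean()

# In-situ sampling points; correlate each matrix size with measured depth.
points = [(r, c) for r in range(10, 50, 5) for c in range(10, 50, 5)]
depths = [depth[r, c] for r, c in points]
for n_win in (1, 3, 5):
    means = [neighborhood_mean(band, r, c, n_win) for r, c in points]
    print(f"{n_win}x{n_win}: r = {np.corrcoef(means, depths)[0, 1]:.2f}")
```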

Keywords: bathymetry, Worldview-2 satellite image, ATCOR technique, Lake Acıgöl, Denizli, Turkey

Procedia PDF Downloads 409
12894 Wireless FPGA-Based Motion Controller Design by Implementing 3-Axis Linear Trajectory

Authors: Kiana Zeighami, Morteza Ozlati Moghadam

Abstract:

Designing a high-accuracy, high-precision motion controller is one of the important issues in today's industry. There are effective solutions available in the industry, but the real-time performance, smoothness, and accuracy of the movement can be further improved. This paper discusses a complete solution for carrying out the movement of three stepper motors in three dimensions. The objective is to provide a method to design a fully integrated System-on-Chip (SoC)-based motion controller that reduces the cost and complexity of production by incorporating a Field Programmable Gate Array (FPGA) into the design. In the proposed method, the FPGA receives its commands from a host computer via wireless internet communication and calculates the motion trajectory for three axes. A profile generator module is designed to realize the interpolation algorithm by translating position data into real-time pulses. This paper discusses an approach to implementing the linear interpolation algorithm, since it is one of the fundamentals of robot movement and is highly applicable in motion control industries. Along with the full-profile trajectory, a triangular drive is implemented to eliminate error over small distances. To integrate the parallelism and real-time performance of the FPGA with the power of a Central Processing Unit (CPU) in executing complex sequential algorithms, the NIOS II soft-core processor was added to the design. The controller presents different operating modes, such as absolute and relative positioning, reset, and velocity modes, to fulfill user requirements. The proposed approach was evaluated by designing a custom-made FPGA board along with a mechanical structure. As a result, a precise and smooth movement of the stepper motors was observed, which proved the effectiveness of this approach.
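The linear interpolation a profile generator performs is typically an integer DDA (Bresenham-style); this Python sketch of a 3-axis version illustrates the idea in software and is not the authors' FPGA implementation:

```python
def linear_interpolate_3d(target):
    """Generate per-axis step pulses along a straight line from the
    origin to `target` using an integer DDA: every iteration is one
    pulse interval, and an axis steps whenever its error term wraps."""
    dx, dy, dz = (abs(t) for t in target)
    signs = tuple(1 if t >= 0 else -1 for t in target)
    n = max(dx, dy, dz)                 # pulse intervals on the major axis
    err = [n // 2] * 3
    pos = [0, 0, 0]
    pulses = []
    for _ in range(n):
        step = [0, 0, 0]
        for axis, d in enumerate((dx, dy, dz)):
            err[axis] -= d
            if err[axis] < 0:           # error wrapped: emit a pulse
                err[axis] += n
                pos[axis] += signs[axis]
                step[axis] = signs[axis]
        pulses.append(tuple(step))
    return pulses, tuple(pos)

pulses, end = linear_interpolate_3d((5, 3, -2))
print(end)          # (5, 3, -2): the target is reached exactly
print(len(pulses))  # 5: one pulse interval per major-axis step
```

Each tuple in `pulses` is the set of step/direction signals for one pulse interval; in hardware these would drive the stepper outputs in real time.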

Keywords: 3-axis linear interpolation, FPGA, motion controller, micro-stepping

Procedia PDF Downloads 188
12893 Effect of Al Addition on Microstructure and Properties of NbTiZrCrAl Refractory High Entropy Alloys

Authors: Xiping Guo, Fanglin Ge, Ping Guan

Abstract:

Refractory high-entropy alloys are candidate materials expected to be employed at high temperatures. The comprehensive changes in the microstructure and properties of NbTiZrCrAl refractory high-entropy alloys were systematically studied by adjusting the Al content. Five button alloy ingots with different Al contents, NbTiZrCrAlx (x = 0, 0.2, 0.5, 0.75, 1.0), were prepared by vacuum non-consumable arc melting. Microstructure analysis shows that the five alloys are composed of a BCC solid solution phase rich in Nb and Ti and a Laves phase rich in Cr, Zr, and Al. The addition of Al changes the structure from hypoeutectic to hypereutectic, increases the proportion of the Laves phase, and changes its structure from cubic C15 to hexagonal C14. The hardness and fracture toughness of the five alloys were tested at room temperature, and the compressive mechanical properties were tested at 1000℃. The results showed that the addition of Al increased the proportion of the Laves phase and decreased that of the BCC phase, thus increasing the hardness and decreasing the fracture toughness at room temperature. However, at 1000℃, the strength of the 0.5Al and 0.75Al alloys, whose compositions are close to the eutectic point, is the best, indicating that the eutectic structure is of great significance for improving the high-temperature strength of NbTiZrCrAl refractory high-entropy alloys. The five alloys were oxidized for 1 h and 20 h in static air at 1000℃. Only the oxide film of the 0Al alloy fell off after oxidation for 1 h; after 20 h, the oxide films of all the alloys fell off, but those of the Al-containing alloys were more dense and complete.
By producing the protective oxide Al₂O₃, inhibiting the preferential oxidation of Zr, promoting the preferential oxidation of Ti, and combining Cr₂O₃ and Nb₂O₅ to form CrNbO₄, Al significantly improves the high-temperature oxidation resistance of NbTiZrCrAl refractory high-entropy alloys.

Keywords: NbTiZrCrAl, refractory high entropy alloy, Al content, microstructural evolution, room temperature mechanical properties, high temperature compressive strength, oxidation resistance

Procedia PDF Downloads 63
12892 Regional Anesthesia: A Vantage Point for Management of Normal Pressure Hydrocephalus

Authors: Kunal K. S., Shwetashri K. R., Keerthan G., Ajinkya R.

Abstract:

Background: Normal pressure hydrocephalus is a condition caused by abnormal accumulation of cerebrospinal fluid (CSF) within the brain, resulting in enlarged cerebral ventricles due to a disruption of CSF formation, absorption, or flow. Over the course of time, ventriculoperitoneal shunting under general anesthesia has become the standard of care, yet only a finite number of centers have started to include regional anesthesia techniques for this patient cohort. Case: We report a case of a 75-year-old male with underlying aortic sclerosis and cardiomyopathy who presented with complaints of confusion, forgetfulness, and difficulty walking. Neuroimaging studies revealed disproportionately enlarged subarachnoid space hydrocephalus (DESH). The baseline blood pressure was 116/67 mmHg, with a heart rate of 106 beats/min and SpO2 of 96% on room air. The patient underwent smooth induction, followed by sonographically guided superficial cervical plexus block and transversus abdominis plane block. Intraoperative pain indices were monitored with the Analgesia Nociception Index (ANI, Mdoloris™) and the Surgical Plethysmographic Index (SPI, GE Healthcare™, Helsinki, Finland). These remained stable during application of the blocks and throughout the surgical duration. No significant hemodynamic response was observed during tunneling of the skin by the surgeon. The patient had a smooth emergence and recovery. Conclusion: Our decision to incorporate peripheral nerve blockade in conjunction with general anesthesia resulted in opioid-sparing anesthesia and decreased the patient's post-operative analgesic requirement. The blockade was successful in suppressing intraoperative stress responses. Our patient recovered adequately and had an uncomplicated post-operative stay.

Keywords: desh, NPH, VP shunt, cervical plexus block, transversus abdominis plane block

Procedia PDF Downloads 53
12891 Using the Bootstrap for Problems in Statistics

Authors: Brahim Boukabcha, Amar Rebbouh

Abstract:

The bootstrap method, based on the idea of exploiting all the information provided by the initial sample, allows us to study the properties of estimators. In this article we present a theoretical study of the different bootstrap methods and the use of re-sampling techniques in statistical inference to calculate the standard error of an estimator of the mean and to determine a confidence interval for an estimated parameter. We apply these methods to regression models and the Pareto model, obtaining good approximations.
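As a rough sketch of the resampling idea described in the abstract, the standard error of the sample mean can be estimated by repeatedly drawing samples with replacement; the data, resample count, and seed below are arbitrary illustrative choices, not values from the paper.

```python
import random
import statistics

def bootstrap_se_mean(sample, n_resamples=2000, seed=0):
    """Estimate the standard error of the sample mean by resampling
    the observed data with replacement and taking the spread of the
    resampled means."""
    rng = random.Random(seed)
    n = len(sample)
    means = [
        statistics.mean(rng.choices(sample, k=n))  # one bootstrap replicate
        for _ in range(n_resamples)
    ]
    return statistics.stdev(means)

data = [2.1, 3.4, 1.8, 4.0, 2.9, 3.3, 2.5, 3.8]
se = bootstrap_se_mean(data)
# For the mean, the bootstrap estimate can be cross-checked against the
# analytic formula s / sqrt(n)
analytic = statistics.stdev(data) / len(data) ** 0.5
```

The same resampling loop applies unchanged to estimators, such as the median, for which no simple analytic standard-error formula exists, which is the main appeal of the method.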

Keywords: bootstrap, standard error, bias, jackknife, mean, median, variance, confidence interval, regression models

Procedia PDF Downloads 359
12890 Electrochemical Inactivation of Toxic Cyanobacteria and Degradation of Cyanotoxins

Authors: Belal Bakheet, John Beardall, Xiwang Zhang, David McCarthy

Abstract:

The potential risks associated with toxic cyanobacteria have raised growing environmental and public health concerns, leading to an increasing research effort into ways to remove them from water, together with destroying their associated cyanotoxins. A variety of toxins are synthesized by cyanobacteria, including hepatotoxins, neurotoxins, and cytotoxins, which can cause a range of symptoms in humans from skin irritation to serious liver and nerve damage. Therefore, drinking water treatment processes should ensure the consumers' safety by removing both cyanobacterial cells and cyanotoxins from the water. Cyanobacterial cells and cyanotoxins present challenges to conventional water treatment systems; their accumulation within drinking water treatment plants has been reported, leading to plant shutdowns. Thus, innovative and effective water purification systems to tackle cyanobacterial pollution are required. In recent years there has been increasing attention to the electrochemical oxidation process as a feasible alternative disinfection method, able to generate in situ a variety of oxidants that achieve synergistic effects in water disinfection and toxin degradation. Using only electric current, the electrochemical process can produce, through electrolysis, reactive oxygen species such as hydroxyl radicals from the water, or other oxidants such as chlorine from chloride ions present in the water. Extensive physiological and morphological investigation of cyanobacterial cells during electrolysis shows that these oxidants have a significant impact on cell inactivation, simultaneously with cyanotoxin removal, without the need for chemical addition. Our research aimed to optimize existing electrochemical oxidation systems and develop new systems to treat water containing toxic cyanobacteria and cyanotoxins. The research covers a detailed mechanistic study of oxidant production and cell inactivation during treatment under environmental conditions. Overall, our study suggests that the electrochemical treatment process is an effective method for the removal of toxic cyanobacteria and cyanotoxins.

Keywords: toxic cyanobacteria, cyanotoxins, electrochemical process, oxidants

Procedia PDF Downloads 207
12889 Analyzing the Emergence of Conscious Phenomena by the Process-Based Metaphysics

Authors: Chia-Lin Tu

Abstract:

Towards the end of the 20th century, a reductive picture came to dominate the philosophy of science and philosophy of mind. Reductive physicalism claims that all entities and properties in this world can eventually be reduced to the physical level; it means that all phenomena in the world can be explained by the laws of physics. However, quantum physics provides another picture. It says that the world is undergoing change, and that the energy of change is, in fact, the most important constituent of world phenomena. Quantum physics thus gives us another point of view from which to reconsider the reality of the world. Throughout the history of philosophy of mind, reductive physicalism has tried to reduce conscious phenomena to physical particles as well, meaning that the reality of consciousness is composed of physical particles. However, reductive physicalism is unable to explain conscious phenomena and mind-body causation. Conscious phenomena, e.g., qualia, are not composed of physical particles. The currently popular theory of consciousness is emergentism. Emergentism is an ambiguous concept which lacks a clear account of how conscious phenomena are emergent from physical particles. In order to understand the emergence of conscious phenomena, quantum physics seems an appropriate analogy. Quantum physics claims that physical particles and processes together construct the most fundamental field of world phenomena, within which all natural processes, i.e., wave functions, occur. The traditional space-time description of classical physics is overtaken by the wave-function story. If this methodology of quantum physics works well to explain world phenomena, then it is not necessary to describe the world in terms of physical particles as classical physics did. Conscious phenomena are one kind of world phenomenon. Scientists and philosophers have tried to explain their reality, but no conclusion has been reached. Quantum physics tells us that the fundamental field of the natural world is a process metaphysics. The emergence of conscious phenomena is only possible within this process metaphysics, and it has clearly occurred there. Within the framework of quantum physics, we can take emergence more seriously and thus account for such emergent phenomena as consciousness. By questioning the particle-mechanistic concept of the world, the new metaphysics offers an opportunity to reconsider the reality of conscious phenomena.

Keywords: quantum physics, reduction, emergence, qualia

Procedia PDF Downloads 135
12888 An Approach for Ensuring Data Flow in Freight Delivery and Management Systems

Authors: Aurelija Burinskienė, Dalė Dzemydienė, Arūnas Miliauskas

Abstract:

This research aims at developing an approach for more effective freight delivery and transportation process management. Road congestion and the identification of its causes are important, as are the recognition and management of context information. Measuring many parameters during the transportation period and properly controlling driver work have become problematic. The number of vehicles passing a given point per unit of time can be evaluated in some situations. The collected data are mainly used to establish new trips. The flow of data is more complex in urban areas; here, the movement of freight is reported in detail, including information at street level. When traffic density is extremely high, as in congestion, and the traffic speed is very low, data transmission reaches its peak. Different data sets are generated depending on the type of freight delivery network. There are three types of networks: long-distance delivery networks, last-mile delivery networks, and mode-based delivery networks; the last includes different modes, in particular railways. When freight delivery is switched from one of the above network types to another, more data may be included for reporting purposes, and vice versa. In this case, a significant amount of these data is used for control operations, and the problem requires an integrated methodological approach. The paper presents an approach for providing e-services for drivers that includes an assessment of the multi-component infrastructure needed for the delivery of freight according to the network type. Such a methodology is required to evaluate data flow conditions and overloads and to minimize the time gaps in data reporting. The results obtained show the potential of the proposed methodological approach to support management and decision-making processes, with the functionality of incorporating network specifics, by helping to minimize overloads in data reporting.

Keywords: transportation networks, freight delivery, data flow, monitoring, e-services

Procedia PDF Downloads 101
12887 Aerial Survey and 3D Scanning Technology Applied to the Survey of Cultural Heritage of Su-Paiwan, an Aboriginal Settlement, Taiwan

Authors: April Hueimin Lu, Liangj-Ju Yao, Jun-Tin Lin, Susan Siru Liu

Abstract:

This paper discusses the application of aerial survey technology and 3D laser scanning technology in the surveying and mapping of the settlements and slate houses of the old Taiwanese aborigines. The relics of the old Taiwanese aborigines, with thousands of years of history, are widely distributed in the deep mountains of Taiwan, over a vast area with inconvenient transportation. When constructing basic data on cultural assets, it is necessary to apply new technology to carry out efficient and accurate settlement mapping. In this paper, taking old Paiwan as an example, an aerial survey of a settlement of about 5 hectares and 3D laser scanning of a slate house were carried out. The orthophoto image obtained was used as an important basis for drawing the settlement map. The 3D landscape data of topography and buildings derived from the aerial survey are important for subsequent preservation planning, while the 3D building scan provides a more detailed record of architectural forms and materials. The 3D settlement data from the aerial survey can be further applied to a 3D virtual model and animation of the settlement for virtual presentation. The information from the 3D scanning of the slate house can also be used for digital archives and data queries through network resources. The results of this study show that, in large-scale settlement surveys, aerial survey technology can be used to construct the topography of settlements together with the spatial information of buildings and landscape, while 3D scanning serves for small-scale records of individual buildings. This application of 3D technology greatly increases the efficiency and accuracy of the survey and mapping of aboriginal settlements and is very helpful for further preservation planning and the rejuvenation of aboriginal cultural heritage.

Keywords: aerial survey, 3D scanning, aboriginal settlement, settlement architecture cluster, ecological landscape area, old Paiwan settlements, slate house, photogrammetry, SfM, MVS, point cloud, SIFT, DSM, 3D model

Procedia PDF Downloads 131
12886 The Traditional Roles and Place of Indigenous Musical Practices in Contemporary African Society

Authors: Benjamin Obeghare Izu

Abstract:

In Africa, indigenous musical practices are the focal point around which most cultural practices revolve, and they are the main conduit for transmitting indigenous knowledge and values. They serve as a means of documenting, preserving, and transmitting indigenous knowledge and of re-enacting historical, social, and cultural affinities. Indigenous musical practices also serve as a repository for indigenous knowledge and artistic traditions. However, these indigenous musical practices and the cultural ideals they carry are confronted in the twenty-first century with substantial challenges from contemporary cultural influence. Additionally, the educational and cultural purposes of indigenous musical practices have been affected by the broad monetisation of the arts in contemporary society; they are seen as objects of entertainment. Some young people today are unaware of their cultural roots and are losing their cultural identity as a result of these influences and challenges. In order to help policymakers raise awareness of and encourage the use of indigenous knowledge and musical practices among African youth and scholars, this study responds to the need to explore the components and functions of the indigenous knowledge system, values, and musical tradition in Africa. The study employed qualitative research methods, utilising interviews, participant observation, and a review of related literature as data collection methods. It examines the indigenous musical practices in the Oba of Benin royal Igue festival among the Benin people in Edo State, Nigeria, and the Ovwuwve festival observed by the Abraka people in Delta State, Nigeria. The extent to which indigenous musical practices convey and protect indigenous knowledge and cultural values is reflected in the musical practices of these cultural festivals. The study looks at how the indigenous musical arts are related to one another and how that affects the transmission and preservation of indigenous knowledge. It makes recommendations on how to increase the use of indigenous knowledge and values and their fusion with contemporary culture. The study contributes significantly to ethnomusicology by showing how African traditional music traditions support other facets of culture and how indigenous knowledge might be helpful in contemporary society.

Keywords: African musical practices, African music and dance, African society, indigenous musical practices

Procedia PDF Downloads 88
12885 The Use of Additives to Prevent Fouling in Polyethylene and Polypropylene Gas and Slurry Phase Processes

Authors: L. Shafiq, A. Rigby

Abstract:

All polyethylene processes are highly exothermic, and the safe removal of the heat of reaction is a fundamental issue in process design. In slurry and gas processes, the velocity of the polymer particles in the reactor and external coolers can be very high, and under certain conditions this can lead to static charging of the particles. Statically charged polymer particles may start building up on the reactor wall, limiting heat transfer and ultimately leading to severe reactor fouling and a forced reactor shutdown. Statsafe™ is an FDA-approved anti-fouling additive currently used around the world in polyolefin production. Its unique polymer chemistry aids static discharge, which prevents the build-up of charged polyolefin particles that could lead to fouling. Statsafe™ is being used and trialled in gas, slurry, and combined technologies around the world. We will share data demonstrating how the use of Statsafe™ allows more stable operation at higher solids levels by eliminating static, which would otherwise prevent closer packing of particles in the hydrocarbon slurry. Because static charge generation also depends on the concentration of polymer particles in the slurry, the maximum slurry concentration can be higher when using Statsafe™, leading to higher production rates. The elimination of fouling also leads to less downtime. Special focus will be placed on the impact anti-static additives have on catalyst performance within the polymerization process and how this has been measured. Lab-scale studies have investigated the effect on the activity of Ziegler-Natta catalysts when anti-static additives are used at various concentrations in gas and slurry polyethylene and polypropylene processes. An in-depth gas-phase study investigated the effect of additives on final polyethylene properties such as particle size, morphology, fines, bulk density, melt flow index, gradient density, and melting point.

Keywords: anti-static additives, catalyst performance, FDA approved anti-fouling additive, polymerisation

Procedia PDF Downloads 171
12884 Implementation of an Economic – Probabilistic Model to Risk Analysis of ERP Project in Technological Innovation Firms – A Case Study of ICT Industry in Iran

Authors: Reza Heidari, Maryam Amiri

Abstract:

In a technological world, many countries tend to fortify their companies and technological infrastructures. One of the most important requirements for developing technology is innovation, so all companies strive to adopt innovation as a basic principle. Since the development of a product requires combining different technologies, different innovative projects are run in firms as a basis of technology development. In such an environment, enterprise resource planning (ERP) has special significance for developing and strengthening innovation. In this article, an economic-probabilistic analysis is provided for an ERP implementation project in technological innovation (TI) based firms. The model used in this article assesses risk and economic analysis simultaneously, in view of the probability of each event, joining the economic and risk-investigation approaches. To provide an economic-probabilistic risk analysis of the project, activities and milestones in the cash flow were extracted, and the probability of occurrence of each was assessed. Since resource planning in an innovative firm is the object of this project, we extracted the various risks related to the innovative project and evaluated them in the form of cash flow. By considering the risks affecting the project and the probability of each, and assigning them to the project's cash flow categories, the model presents an adjusted cash flow based on Net Present Value (NPV) with a probabilistic simulation approach. Indeed, this model presents a risk-adjusted economic analysis of the project. It measures the NPV of the project, focusing on the risks that most affect technological innovation projects, and then measures the probability associated with the NPV for each category. As a result of applying the presented model in the information and communication technology (ICT) industry, an appropriate analysis of the feasibility of the project was provided from the point of view of cash flow, based on the risk impact on the project. The results obtained can be given to decision makers so that they can have a systematic, risk-moderated economic analysis of the feasibility of the project.
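The risk-adjusted NPV idea described above can be illustrated with a small Monte Carlo simulation; the cash flows, discount rate, and risk events below are hypothetical stand-ins for the cash-flow categories and event probabilities extracted in the study.

```python
import random

def npv(cash_flows, rate):
    """Net Present Value of a series of yearly cash flows (year 0 first)."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

def simulate_risk_adjusted_npv(base_flows, risks, rate=0.12,
                               n_runs=10000, seed=42):
    """Monte Carlo sketch: each risk occurs with a given probability and,
    when it occurs, adds its (negative) impact to one year's cash flow.
    Returns the mean NPV and the probability of a positive NPV."""
    rng = random.Random(seed)
    results = []
    for _ in range(n_runs):
        flows = list(base_flows)
        for year, prob, impact in risks:
            if rng.random() < prob:  # the risk event materializes
                flows[year] += impact
        results.append(npv(flows, rate))
    mean_npv = sum(results) / len(results)
    p_positive = sum(r > 0 for r in results) / len(results)
    return mean_npv, p_positive

# Hypothetical ERP project: initial outlay, then yearly benefits
base = [-500.0, 180.0, 220.0, 240.0, 260.0]
# (year, probability, impact) -- illustrative risk events only
risks = [(1, 0.3, -80.0), (2, 0.2, -120.0), (3, 0.1, -60.0)]
mean_npv, p_positive = simulate_risk_adjusted_npv(base, risks)
```

The distribution of simulated NPVs, rather than a single point estimate, is what gives decision makers the probability attached to each outcome category.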

Keywords: cash flow categorization, economic evaluation, probabilistic, risk assessment, technological innovation

Procedia PDF Downloads 384
12883 Algae Growth and Biofilm Control by Ultrasonic Technology

Authors: Vojtech Stejskal, Hana Skalova, Petr Kvapil, George Hutchinson

Abstract:

Algae growth has been an important issue in the water management of water plants, ponds and lakes, swimming pools, aquaculture and fish farms, gardens, and golf courses for decades. There are solutions based on chemical or biological principles. Apart from these traditional approaches to inhibiting algae growth and biofilm production, there are also physical methods that are very competitive with the traditional ones. Ultrasonic technology is one of these alternatives. An ultrasonic emitter can eliminate the biofilm that acts as a host and attachment point for algae and is the original reason for the algae growth. The ultrasound waves prevent the majority of bacteria in planktonic form from becoming strongly attached sessile bacteria that create a welcoming layer for biofilm production. Biofilm creation is very fast: in still water it takes between 30 minutes and 4 hours, depending on temperature and other parameters. The ultrasonic device does not kill bacteria. Ultrasound waves pass through the bacteria, which retract as if they were in very turbulent water even though the water is visually completely still. In these conditions, the bacteria do not excrete the polysaccharide glue they use to attach to the surface of the pool or pond where the ultrasonic technology is used. Ultrasonic waves thus decrease the production of biofilm on surfaces in the treated area. If the inner surfaces of a pond or basin are clean at the start of the application of ultrasonic technology, biofilm production is almost completely inhibited. This paper discusses two different pilot applications, one in the Czech Republic and the second in the United States of America, where the ultrasonic technology used (AlgaeControl) originates. On both sites, a Mezzo Ultrasonic Algae Control System was used, with very positive results not only on biofilm production but also on algae growth in the surrounding area. The technology has been successfully tested in two different environments. The poster describes the differences and their influence on the efficiency of the ultrasonic technology application. The conclusions and lessons learned can potentially also be applied to other sites within Europe or even further afield.

Keywords: algae growth, biofilm production, ultrasonic solution, ultrasound

Procedia PDF Downloads 237
12882 Contextual SenSe Model: Word Sense Disambiguation using Sense and Sense Value of Context Surrounding the Target

Authors: Vishal Raj, Noorhan Abbas

Abstract:

Ambiguity in NLP (Natural Language Processing) refers to the ability of a word, phrase, sentence, or text to have multiple meanings. This results in various kinds of ambiguities, such as lexical, syntactic, semantic, anaphoric, and referential ambiguities. This study focuses mainly on solving the issue of lexical ambiguity. Word Sense Disambiguation (WSD) is an NLP technique that aims to resolve lexical ambiguity by determining the correct meaning of a word within a given context. Most WSD solutions rely on words for training and testing, but we have used lemma and Part of Speech (POS) tokens of words for training and testing. The lemma adds generality, and the POS adds word properties to the token. We have designed a novel method to create an affinity matrix to calculate the affinity between any pair of lemma_POS tokens (a token where the lemma and POS of a word are joined by an underscore) in a given training set. Additionally, we have devised an algorithm to create sense clusters of tokens using the affinity matrix under a hierarchy of the POS of the lemma. Furthermore, three different mechanisms to predict the sense of the target word using the affinity/similarity value are devised. Each contextual token contributes to the sense of the target word with some value, and whichever sense receives the highest value becomes the sense of the target word. Contextual tokens thus play a key role in creating sense clusters and predicting the sense of the target word; hence, the model is named the Contextual SenSe Model (CSM). CSM exhibits noteworthy simplicity and lucidity of explication in contrast to contemporary deep learning models characterized by intricacy, time-intensive processes, and challenging explication. CSM is trained on the SemCor training data and evaluated on the SemEval test dataset. The results indicate that, despite the naivety of the method, it achieves promising results when compared to the Most Frequent Sense (MFS) baseline.
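A toy sketch of the voting mechanism might look as follows; the tokens, senses, and simple co-occurrence-count affinity below are invented for illustration and are far simpler than the paper's affinity matrix and sense clusters.

```python
from collections import defaultdict

# Toy "training" contexts: each is a list of lemma_POS tokens plus the
# annotated sense of the target token 'bank_NOUN' (illustrative data only).
training = [
    (["river_NOUN", "water_NOUN", "fish_VERB"], "bank%shore"),
    (["river_NOUN", "flood_NOUN"], "bank%shore"),
    (["money_NOUN", "loan_NOUN", "deposit_VERB"], "bank%finance"),
    (["loan_NOUN", "interest_NOUN"], "bank%finance"),
]

# Affinity between a context token and a sense: here simply how often
# they co-occur in the training set.
affinity = defaultdict(float)
for context, sense in training:
    for token in context:
        affinity[(token, sense)] += 1.0

def predict_sense(context_tokens, senses):
    """Each contextual token votes for every candidate sense with its
    affinity value; the sense with the highest total wins."""
    scores = {s: sum(affinity[(t, s)] for t in context_tokens) for s in senses}
    return max(scores, key=scores.get)

senses = ["bank%shore", "bank%finance"]
pred = predict_sense(["water_NOUN", "river_NOUN"], senses)
```

The key design point carried over from the abstract is that every contextual token contributes some value to each candidate sense, so disambiguation degrades gracefully when individual context words are uninformative.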

Keywords: word sense disambiguation (wsd), contextual sense model (csm), most frequent sense (mfs), part of speech (pos), natural language processing (nlp), oov (out of vocabulary), lemma_pos (a token where lemma and pos of word are joined by underscore), information retrieval (ir), machine translation (mt)

Procedia PDF Downloads 78
12881 Assessing Overall Thermal Conductance Value of Low-Rise Residential Home Exterior Above-Grade Walls Using Infrared Thermography Methods

Authors: Matthew D. Baffa

Abstract:

Infrared thermography is a non-destructive test method used to estimate surface temperatures based on the amount of electromagnetic energy radiated by building envelope components. These surface temperatures are indicators of various qualitative building envelope deficiencies, such as the locations and extent of heat loss, thermal bridging, damaged or missing thermal insulation, air leakage, and moisture presence in roof, floor, and wall assemblies. Although infrared thermography is commonly used for qualitative deficiency detection in buildings, this study assesses its use as a quantitative method to estimate the overall thermal conductance value (U-value) of the exterior above-grade walls of a study home. The overall U-value of the exterior above-grade walls of a home provides useful insight into its energy consumption and thermal comfort. Three methodologies from the literature were employed to estimate the overall U-value by equating the conductive heat loss through the exterior above-grade walls to the sum of the convective and radiant heat losses from the walls. Outdoor infrared thermography field measurements of the exterior above-grade wall surface and reflected temperatures, together with emissivity values for various components of the wall assemblies, were carried out during the winter months at the study home using a basic thermal imager. The overall U-values estimated from each methodology using the recorded field measurements were compared to the nominal overall U-value calculated from the materials and dimensions detailed in the architectural drawings of the study home. The nominal overall U-value was validated through calendarization and weather normalization of utility bills for the study home, as well as through various estimated heat-loss quantities from a HOT2000 computer model of the study home and other methods. Under ideal environmental conditions, the estimated overall U-values deviated from the nominal overall U-value by between 2% and 33%. This study suggests that infrared thermography can estimate the overall U-value of exterior above-grade walls in low-rise residential homes with a fair amount of accuracy.
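One common formulation from the literature equates conduction through the wall to the sum of convective and radiative losses from its exterior surface, as the abstract describes; the emissivity, convective coefficient, and temperature readings below are illustrative assumptions, not measurements from the study home.

```python
SIGMA = 5.67e-8  # Stefan-Boltzmann constant, W/(m^2 K^4)

def u_value_outdoor_irt(t_surface, t_reflected, t_out, t_in,
                        emissivity=0.90, h_conv=10.0):
    """Estimate the wall U-value by equating conductive heat flow through
    the wall to convective plus radiative losses from its exterior surface.
    All temperatures in kelvin; h_conv is an assumed convective coefficient."""
    q_rad = emissivity * SIGMA * (t_surface**4 - t_reflected**4)  # radiative loss
    q_conv = h_conv * (t_surface - t_out)                         # convective loss
    return (q_rad + q_conv) / (t_in - t_out)                      # W/(m^2 K)

# Illustrative winter readings (not from the study): exterior surface at
# 2.45 C, reflected apparent temperature -1.15 C, outdoor air -2 C, indoor 21 C
u = u_value_outdoor_irt(t_surface=275.6, t_reflected=272.0,
                        t_out=271.15, t_in=294.15)
```

The sensitivity of the result to the assumed convective coefficient and emissivity is one reason the study's estimates spread over a 2% to 33% band around the nominal value.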

Keywords: emissivity, heat loss, infrared thermography, thermal conductance

Procedia PDF Downloads 285
12880 Corporate Governance and Corporate Social Responsibility: Research on the Interconnection of Both Concepts and Its Impact on Non-Profit Organizations

Authors: Helene Eller

Abstract:

The aim of non-profit organisations (NPOs) is to provide services and goods for their clientele, with profit being a minor objective. With this as the basic purpose of doing business, it is obvious that the goal of such an organisation is to serve several bottom lines, not only the financial one. This approach is underpinned by the non-distribution constraint, which means that NPOs are allowed to make profits to a certain extent, but not to distribute them. The advantage is that there are no individual shareholders with an interest in the prosperity of the organisation: there is no pie to divide. The profits gained remain within the organisation and are reinvested in purposeful projects. Good governance is essential to support the aims of NPOs. Looking for a measure of good governance, the principles of corporate governance (CG) come to mind. The purpose of CG is direction and control, and in the field of NPOs, CG is enlarged to consider the relationship to all important stakeholders who have an impact on the organisation. The recognition of parties beyond the shareholder is the link to corporate social responsibility (CSR). It supports a broader view of the bottom line: it is no longer enough to know how profits are used, but rather how they are made. Besides, CSR addresses the responsibility of organisations for their impact on society. When transferring the concept of CSR to the non-profit sector, it becomes obvious that CSR, with its distinctive features, matches the aims of NPOs. As a consequence, NPOs that apply CG also apply CSR to a certain extent. The research is designed as a comprehensive theoretical and empirical analysis. First, the investigation focuses on the theoretical basis of both concepts. Second, the similarities and differences are outlined, and as a result the interconnection of both concepts emerges. The contribution of this research is manifold: the interconnection of both concepts when applied to NPOs has not yet received attention in the literature. CSR and governance as an integrated concept provide many advantages for NPOs compared to for-profit organisations, which are under constant pressure to justify the impact they have on society. NPOs, by contrast, integrate economic and social aspects from the start. For NPOs, CG is not a mere compliance concept but rather an enhanced concept integrating many aspects of CSR. There is no 'either-or' between the concepts for NPOs.

Keywords: business ethics, corporate governance, corporate social responsibility, non-profit organisations

Procedia PDF Downloads 216
12879 Modelling of Heat Transfer during Controlled Cooling of Thermo-Mechanically Treated Rebars Using Computational Fluid Dynamics Approach

Authors: Rohit Agarwal, Mrityunjay K. Singh, Soma Ghosh, Ramesh Shankar, Biswajit Ghosh, Vinay V. Mahashabde

Abstract:

Thermo-mechanical treatment (TMT) of rebars is a critical process to impart sufficient strength and ductility to the rebar. TMT rebars are produced by the Tempcore process, which involves an 'in-line' heat treatment in which the hot rolled bar (at around 1080°C) is passed through water boxes where it is quenched under high-pressure water jets (at around 25°C). The quenching rate dictates a composite structure consisting of four non-homogeneously distributed phases of the rebar microstructure: pearlite, ferrite, bainite, and tempered martensite (from core to rim). The ferrite and pearlite phases present at the core give the rebar ductility, while the martensitic rim provides the appropriate strength. The TMT process is difficult to model as it involves a multitude of complex physics, such as heat transfer and highly turbulent, multicomponent, multiphase flow in the control volume. Additionally, the presence of a film boiling regime (above the Leidenfrost point) due to steam formation adds complexity to the domain. A coupled heat transfer and fluid flow model based on computational fluid dynamics (CFD) has been developed at the product technology division of Tata Steel, India, which efficiently predicts the temperature profile and the percentage martensite rim thickness of the rebar during the quenching process. The model has been validated against 16 mm rolling at the New Bar Mill (NBM) plant of Tata Steel Limited, India. Furthermore, based on scenario analyses, an optimal configuration of nozzles was found, which helped in a subsequent increase in rolling speed.

Keywords: boiling, critical heat flux, nozzles, thermo-mechanical treatment

Procedia PDF Downloads 177
12878 Effect of Surfactant Level of Microemulsions and Nanoemulsions on Cell Viability

Authors: Sonal Gupta, Rakhi Bansal, Javed Ali, Reema Gabrani, Shweta Dang

Abstract:

Nanoemulsions (NEs) and microemulsions (MEs) have been an attractive tool for the encapsulation of both hydrophilic and lipophilic actives. Both systems are composed of an oil phase, surfactant, co-surfactant, and aqueous phase. Depending upon the application and intended use, both oil-in-water and water-in-oil emulsions can be designed. NEs are fabricated using high-energy methods employing a lower percentage of surfactant than MEs, which are self-assembled drug delivery systems. Owing to the nanometric size of the droplets, these systems have been widely used to enhance the solubility and bioavailability of natural as well as synthetic molecules. The aim of the present study is to assess the effect of the percentage of surfactants on the viability of Vero cells (African green monkey kidney epithelial cells) via the MTT assay. A green tea catechin (Polyphenon 60) loaded ME, prepared by low-energy vortexing, and an NE, prepared by high-energy ultrasonication, were formulated using the same excipients (Labrasol as oil, Cremophor EL as surfactant, and glycerol as co-surfactant); however, the percentages of oil and surfactant needed to prepare the ME were higher than for the NE. These formulations, along with their excipients (oilME = 13.3%, SmixME = 26.67%; oilNE = 10%, SmixNE = 13.52%), were added to Vero cells for 24 h. The tetrazolium dye 3-(4,5-dimethylthiazol-2-yl)-2,5-diphenyltetrazolium bromide (MTT) is reduced by live cells, and this reaction is used as the end point to evaluate the cytotoxicity of a test formulation. Results of the MTT assay indicated that oil at different percentages exhibited almost equal cell viability (oilME ≅ oilNE), while the surfactant mixtures differed significantly in cell viability (SmixME < SmixNE). The Polyphenon 60 loaded ME and its placebo ME showed higher toxicity than the Polyphenon 60 loaded NE and its placebo NE, which can be attributed to the higher concentration of surfactants present in MEs. Another probable reason for the high percentage cell viability of the Polyphenon 60 loaded NE might be the effective release of Polyphenon 60 from the NE formulation, which helps sustain the Vero cells.
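The viability end point of an MTT assay is typically computed as the absorbance of treated wells relative to untreated controls, after subtracting a blank; the absorbance values below are invented for illustration and are not data from this study.

```python
def percent_viability(a_sample, a_control, a_blank=0.0):
    """Percent cell viability from MTT absorbance readings: formazan
    absorbance of treated cells relative to the untreated control,
    both corrected for the blank (medium-only) well."""
    return 100.0 * (a_sample - a_blank) / (a_control - a_blank)

# Illustrative 570 nm absorbance values only (not from the study):
# a more toxic formulation leaves fewer live cells to reduce MTT
viab_me = percent_viability(a_sample=0.42, a_control=0.80, a_blank=0.05)
viab_ne = percent_viability(a_sample=0.68, a_control=0.80, a_blank=0.05)
```

In this hypothetical reading, the ME well shows lower viability than the NE well, mirroring the abstract's finding that the surfactant-richer ME was the more cytotoxic system.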

Keywords: cell viability, microemulsion, MTT, nanoemulsion, surfactants, ultrasonication

Procedia PDF Downloads 406