Search results for: software testing
6141 Fe Modified Tin Oxide Thin Film Based Matrix for Reagentless Uric Acid Biosensing
Authors: Kashima Arora, Monika Tomar, Vinay Gupta
Abstract:
Biosensors have found potential applications ranging from environmental testing and biowarfare agent detection to clinical testing, health care, and cell analysis. This trend is driven in part by the desire to decrease the cost of health care and to obtain precise information about a patient's health status more quickly, which has made biosensors increasingly prevalent in clinical testing and point-of-care testing for a wide range of biological analytes. Uric acid is an important byproduct in the human body, and a number of pathological disorders are related to its elevated concentration. In the past few years, rapid growth in the development of new materials and improvements in sensing techniques have led to the evolution of advanced biosensors. In this context, metal oxide thin film based matrices, owing to their biocompatible nature, strong adsorption ability, high isoelectric point (IEP), and natural abundance, have become the materials of choice for recent technological advances in biotechnology. Wide band-gap metal oxide semiconductors including ZnO, SnO₂, and CeO₂ have gained much attention as matrices for the immobilization of various biomolecules. Tin oxide (SnO₂), a wide band-gap semiconductor (Eg = 3.87 eV), has multifunctional properties suited to a broad range of applications, including transparent electronics, gas sensors, acoustic devices, and UV photodetectors, yet it has not been explored much for biosensing purposes. To realize a high-performance miniaturized biomolecular electronic device, the rf sputtering technique is considered the most promising for the reproducible growth of good-quality thin films with controlled surface morphology and the desired film crystallization with improved electron transfer properties.
Recently, iron oxide and its composites have been widely used as matrices for biosensing applications that exploit the electron communication feature of Fe for the detection of various analytes, including urea, hemoglobin, glucose, phenol, L-lactate, and H₂O₂. However, to the authors' knowledge, no work has been reported on modifying the electronic properties of SnO₂ by implanting a suitable metal (Fe) to induce a redox couple in it and utilizing it for the reagentless detection of uric acid. In the present study, an Fe-implanted SnO₂ based matrix has been utilized for a reagentless uric acid biosensor. Implantation of Fe into the SnO₂ matrix is confirmed by energy-dispersive X-ray spectroscopy (EDX) analysis. Electrochemical techniques have been used to study the response characteristics of the Fe-modified SnO₂ matrix before and after uricase immobilization. The developed uric acid biosensor exhibits a high sensitivity of about 0.21 mA/mM and a linear variation in current response over the concentration range from 0.05 to 1.0 mM of uric acid, besides a long shelf life (~20 weeks). The Michaelis-Menten kinetic parameter (Km) is found to be relatively low (0.23 mM), which indicates the high affinity of the fabricated bioelectrode towards uric acid (the analyte). Also, other interferents present in human serum have a negligible effect on the performance of the biosensor. Hence, the obtained results highlight the importance of the implanted Fe:SnO₂ thin film as an attractive matrix for the realization of reagentless biosensors for uric acid.
Keywords: Fe implanted tin oxide, reagentless uric acid biosensor, rf sputtering, thin film
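The Km reported above is the kind of parameter obtained by fitting the Michaelis-Menten response I = Imax·C/(Km + C) to calibration data. A minimal sketch of such a fit via the Lineweaver-Burk linearization follows; the data are synthetic, generated from the abstract's Km = 0.23 mM and an assumed Imax of 0.25 mA, not the paper's measurements:

```python
# Lineweaver-Burk estimate of the Michaelis-Menten constant Km:
#   1/I = (Km/Imax) * (1/C) + 1/Imax
# A straight-line least-squares fit of 1/I against 1/C yields Km and Imax.

def fit_km(conc_mM, current_mA):
    """Return (Km, Imax) from a least-squares line through (1/C, 1/I)."""
    xs = [1.0 / c for c in conc_mM]
    ys = [1.0 / i for i in current_mA]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    intercept = my - slope * mx
    imax = 1.0 / intercept
    return slope * imax, imax

KM_TRUE, IMAX_TRUE = 0.23, 0.25                       # mM, mA (assumed)
conc = [0.05, 0.1, 0.2, 0.4, 0.6, 0.8, 1.0]           # mM, abstract's range
curr = [IMAX_TRUE * c / (KM_TRUE + c) for c in conc]  # noise-free response
km, imax = fit_km(conc, curr)
print(round(km, 3), round(imax, 3))  # recovers 0.23 0.25
```

On noise-free data the fit recovers the generating parameters exactly; with real calibration currents a direct nonlinear fit of the hyperbola is usually preferred, since the reciprocal transform amplifies noise at low concentrations.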
Procedia PDF Downloads 181
6140 Testing for Endogeneity of Foreign Direct Investment: Implications for Economic Policy
Authors: Liwiusz Wojciechowski
Abstract:
Research background: The current knowledge does not give a clear answer to the question of the impact of FDI on productivity. Results of the empirical studies are still inconclusive, no matter how extensive and diverse they are in terms of research approaches or the groups of countries analyzed. One should also take into account the possibility that FDI and productivity are linked and that there is a bidirectional relationship between them. This issue is particularly important because, on the one hand, FDI can contribute to changes in productivity in the host country, but on the other hand, its level and dynamics may determine whether FDI should be undertaken in a given country. As already mentioned, a two-way relationship between the presence of foreign capital and productivity in the host country should be assumed, taking into consideration the endogenous nature of FDI. Purpose of the article: The overall objective of this study is to determine the causality between foreign direct investment and total factor productivity in the host country in terms of differing relative absorptive capacity across countries. In the classic sense, causality among variables is not always obvious and requires testing, which would facilitate the proper specification of FDI models. The aim of this article is to study the endogeneity of selected macroeconomic variables commonly used in FDI models in the case of the Visegrad countries, the main recipients of FDI in CEE. The findings may be helpful in determining the structure of the actual relationship between variables, in estimating appropriate models, and in forecasting as well as economic policymaking. Methodology/methods: Panel and time-series data techniques, including the GMM estimator, VEC models, and causality tests, were utilized in this study. Findings & Value added: The obtained results confirm the hypothesis stating bi-directional causality between FDI and total factor productivity.
Although results differ among countries and levels of data aggregation, the implications may be useful for policymakers when designing policies to attract foreign capital.
Keywords: endogeneity, foreign direct investment, multi-equation models, total factor productivity
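The causality tests mentioned above can be illustrated with a minimal Granger-style check on synthetic data (an illustrative sketch, not the authors' GMM/VECM pipeline): if the lagged value of one series improves the prediction of the other beyond its own lag, the restricted model is rejected via an F statistic.

```python
# Lag-1 Granger-style causality check: compare the restricted model
#   y_t = a + b*y_{t-1}           against the unrestricted model
#   y_t = a + b*y_{t-1} + c*x_{t-1}
# using an F test on the residual sums of squares (pure stdlib OLS).
import random

def ols_rss(X, y):
    """Residual sum of squares of y regressed on the columns of X."""
    k = len(X[0])
    A = [[sum(r[i] * r[j] for r in X) for j in range(k)] for i in range(k)]
    b = [sum(r[i] * yi for r, yi in zip(X, y)) for i in range(k)]
    for col in range(k):                          # Gaussian elimination
        piv = max(range(col, k), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, k):
            f = A[r][col] / A[col][col]
            for c in range(col, k):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    beta = [0.0] * k
    for r in range(k - 1, -1, -1):                # back substitution
        beta[r] = (b[r] - sum(A[r][c] * beta[c] for c in range(r + 1, k))) / A[r][r]
    return sum((yi - sum(be * xi for be, xi in zip(beta, row))) ** 2
               for row, yi in zip(X, y))

random.seed(0)
n = 200
x, y = [random.gauss(0, 1)], [random.gauss(0, 1)]
for t in range(1, n):                             # x Granger-causes y by design
    x.append(0.5 * x[t - 1] + random.gauss(0, 1))
    y.append(0.3 * y[t - 1] + 0.8 * x[t - 1] + random.gauss(0, 1))

restricted = [[1.0, y[t - 1]] for t in range(1, n)]
unrestricted = [[1.0, y[t - 1], x[t - 1]] for t in range(1, n)]
target = [y[t] for t in range(1, n)]
rss_r, rss_u = ols_rss(restricted, target), ols_rss(unrestricted, target)
f_stat = (rss_r - rss_u) / (rss_u / (len(target) - 3))  # 1 restriction
print(round(f_stat, 1))  # large F: lagged x helps predict y
```

A full bidirectional test repeats this in the other direction (y explaining x); production work would use a VAR/VECM package with proper lag selection rather than this single-lag sketch.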
Procedia PDF Downloads 198
6139 Requirement Engineering for Intrusion Detection Systems in Wireless Sensor Networks
Authors: Afnan Al-Romi, Iman Al-Momani
Abstract:
The urge to apply Software Engineering (SE) processes is of vital importance and a key feature in critical, complex, large-scale systems, for example, safety systems, security service systems, and network systems. Inevitably, such systems carry risks, such as system vulnerabilities and security threats. The probability of those risks increases in unsecured environments, such as wireless networks in general and Wireless Sensor Networks (WSNs) in particular. A WSN is a self-organizing network of sensor nodes connected by wireless links. WSNs consist of hundreds to thousands of low-power, low-cost, multi-function sensor nodes that are small in size and communicate over short ranges. The distribution of sensor nodes in an open environment that may be unattended, in addition to resource constraints in terms of processing, storage, and power, places such networks under stringent limitations such as lifetime (i.e. period of operation) and security. The importance of WSN applications in many military and civilian domains has drawn the attention of many researchers to WSN security. To address this important issue and overcome one of the main challenges of WSNs, researchers have developed security solutions in the form of software-based network Intrusion Detection Systems (IDSs). However, it has been witnessed that those developed IDSs are neither secure enough nor accurate enough to detect all malicious behaviours of attacks. Thus, the problem is the lack of coverage of all malicious behaviours in proposed IDSs, leading to unpleasant results, such as delays in the detection process, low detection accuracy, or, even worse, detection failure, as illustrated in previous studies. Another problem is the energy consumption that an IDS imposes on a WSN. In other words, not all requirements are implemented and then traced.
Moreover, not all requirements are identified or satisfied, as some requirements have been compromised. The drawbacks in current IDSs are due to researchers and developers not following structured software development processes when developing IDSs. Consequently, this has resulted in inadequate requirement management, process, validation, and verification of requirements quality. Unfortunately, the WSN and SE research communities have been mostly impermeable to each other. Integrating SE and WSNs is a real subject that will expand as the technology evolves and spreads in industrial applications. Therefore, this paper will study the importance of Requirement Engineering when developing IDSs. It will also study a set of existing IDSs and illustrate the absence of Requirement Engineering and its effect. Conclusions are then drawn regarding applying requirement engineering to systems so that they deliver the required functionalities, with respect to operational constraints, within an acceptable level of performance, accuracy, and reliability.
Keywords: software engineering, requirement engineering, Intrusion Detection System, IDS, Wireless Sensor Networks, WSN
Procedia PDF Downloads 322
6138 Extending BDI Multiagent Systems with Agent Norms
Authors: Francisco José Plácido da Cunha, Tassio Ferenzini Martins Sirqueira, Marx Leles Viana, Carlos José Pereira de Lucena
Abstract:
Open Multiagent Systems (MASs) are societies in which heterogeneous and independently designed entities (agents) work towards similar or different ends. Software agents are autonomous, and the diversity of interests among different members living in the same society is a fact. In order to deal with this autonomy, these open systems use mechanisms of social control (norms) to ensure a desirable social order. This paper considers the following types of norms: (i) obligation — agents must accomplish a specific outcome; (ii) permission — agents may act in a particular way; and (iii) prohibition — agents must not act in a specific way. These mechanisms aim to encourage the fulfillment of norms through rewards and to discourage norm violation by pointing out the punishments. Since a software agent may decide that its priority is the satisfaction of its own desires and goals, each agent must evaluate the effects associated with the fulfillment of one or more norms before choosing which one should be fulfilled. The same applies when agents decide to violate a norm. This paper also introduces a framework for the development of MASs that provides support mechanisms for the agent's decision-making, using norm-based reasoning. The applicability and validation of this approach are demonstrated using a traffic intersection scenario.
Keywords: BDI agent, BDI4JADE framework, multiagent systems, normative agents
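A minimal sketch of the norm-based decision step described above, weighing an agent's own goal utility against norm rewards and punishments; the traffic-style actions and all numeric values are hypothetical, and this is not the paper's BDI4JADE framework:

```python
# Toy norm-aware action selection: the agent totals its goal utility with
# the payoff of each norm (reward if fulfilled, punishment if violated)
# and picks the action with the highest total.
from dataclasses import dataclass

@dataclass
class Norm:
    kind: str          # "obligation", "permission", or "prohibition"
    action: str        # the action the norm refers to
    reward: float      # granted when the norm is fulfilled
    punishment: float  # applied when the norm is violated

def norm_payoff(norm, action_taken):
    """Net contribution of one norm to the utility of a chosen action."""
    if norm.kind == "obligation":
        return norm.reward if action_taken == norm.action else -norm.punishment
    if norm.kind == "prohibition":
        return -norm.punishment if action_taken == norm.action else norm.reward
    return 0.0         # permissions neither reward nor punish here

def choose(actions, goal_utility, norms):
    """Pick the action maximizing goal utility plus all norm payoffs."""
    return max(actions, key=lambda a: goal_utility[a]
               + sum(norm_payoff(n, a) for n in norms))

actions = ["stop", "cross"]
goal_utility = {"stop": 1.0, "cross": 3.0}   # the agent prefers to cross
norms = [Norm("prohibition", "cross", reward=0.5, punishment=5.0)]
print(choose(actions, goal_utility, norms))  # stop: punishment outweighs desire
```

With no norms in force the same agent would choose "cross"; the prohibition's punishment flips the decision, which is the trade-off the abstract describes.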
Procedia PDF Downloads 232
6137 Paddy/Rice Singulation for Determination of Husking Efficiency and Damage Using Machine Vision
Authors: M. Shaker, S. Minaei, M. H. Khoshtaghaza, A. Banakar, A. Jafari
Abstract:
In this study, a machine vision and singulation system was developed to separate paddy from rice and determine paddy husking and rice breakage percentages. The machine vision system consists of three main components: an imaging chamber, a digital camera, and a computer equipped with image processing software. The singulation device consists of a kernel-holding surface, a motor with a vacuum fan, and a dimmer. For separation of paddy from rice (in the image), it was necessary to set a threshold. Therefore, some images of paddy and rice were sampled, and the RGB values of the images were extracted using MATLAB software. Then the mean and standard deviation of the data were determined. An image processing algorithm was developed in MATLAB to determine the paddy/rice separation and the rice breakage and paddy husking percentages, using the blue-to-red ratio. Tests showed that a threshold of 0.75 is suitable for separating paddy from rice kernels. Results from the evaluation of the image processing algorithm showed that the accuracies obtained with the algorithm were 98.36% and 91.81% for the paddy husking and rice breakage percentages, respectively. Analysis also showed that a suction of 45 mmHg to 50 mmHg, yielding 81.3% separation efficiency, is appropriate for operation of the kernel singulation system.
Keywords: breakage, computer vision, husking, rice kernel
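The classification rule described above (blue-to-red ratio thresholded at 0.75) can be sketched as follows; the RGB pixel values below are hypothetical, and which side of the threshold corresponds to paddy versus rice is an assumption of this sketch:

```python
# Per-kernel classification by mean blue-to-red (B/R) pixel ratio,
# thresholded at 0.75 as reported in the abstract. The paddy/rice label
# assignment relative to the threshold is assumed for illustration.

def classify_kernel(pixels, threshold=0.75):
    """pixels: iterable of (R, G, B) tuples for one segmented kernel."""
    ratios = [b / r for r, _, b in pixels if r > 0]  # skip zero-red pixels
    mean_ratio = sum(ratios) / len(ratios)
    return "paddy" if mean_ratio > threshold else "rice"

# Hypothetical kernels: one bluish-husked (ratio 0.9), one reddish (0.5)
print(classify_kernel([(100, 80, 90)]))    # paddy
print(classify_kernel([(200, 120, 100)]))  # rice
```

In practice the pixels for each kernel would come from a segmented region of the chamber image; this sketch covers only the thresholding step, not segmentation.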
Procedia PDF Downloads 381
6136 Comparative Study of Dose Calculation Accuracy in Bone Marrow Using Monte Carlo Method
Authors: Marzieh Jafarzadeh, Fatemeh Rezaee
Abstract:
Introduction: Ionizing radiation can compromise genomic integrity and cell viability and increases the risk of cancer and malignancy. Therefore, X-ray behavior and absorbed dose calculation are of interest. One applicable tool for calculating and evaluating the absorbed dose in human tissues is Monte Carlo simulation, which offers a straightforward way to simulate particle transport and is therefore easy to use. The Monte Carlo BEAMnrc code is one of the most common diagnostic X-ray simulation codes and is the one used in this study. Method: At one of the hospitals under study, a number of CT scan images of previously imaged patients were extracted from the hospital database. BEAMnrc software was used for the simulation. The head of the device was simulated at an energy of 0.09 MeV with 500 million particles, and the output data obtained from the simulation were applied for phantom construction using the ctcreate software. The percentage depth dose (PDD) was calculated using STATDOSE and then compared with international standard values. Results and Discussion: The ratio of surface dose to depth dose at the measured energy was estimated to be about 4% to 8% for bone and 3% to 7% for bone marrow. Conclusion: MC simulation is an efficient and accurate method for simulating bone marrow and calculating the absorbed dose.
Keywords: Monte Carlo, absorption dose, BEAMnrc, bone marrow
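As a much-simplified illustration of the Monte Carlo idea behind a depth-dose tally (a toy model, not BEAMnrc): photon interaction depths are sampled from the exponential attenuation law and binned to form a percentage depth dose curve. The attenuation coefficient and the local-energy-deposition assumption are hypothetical simplifications:

```python
# Toy Monte Carlo depth-dose tally: each photon's free path into a slab is
# sampled from the exponential law (mean 1/mu); all energy is deposited at
# the interaction point, and bin counts are normalized to the peak (PDD).
import random

random.seed(1)

def pdd(n_photons=100_000, mu=0.5, depth=10.0, bins=10):
    """Percentage depth dose from exponentially sampled interaction depths."""
    tally = [0.0] * bins
    width = depth / bins
    for _ in range(n_photons):
        d = random.expovariate(mu)        # free path length before interaction
        if d < depth:
            tally[int(d / width)] += 1.0  # deposit all energy locally
    peak = max(tally)
    return [100.0 * t / peak for t in tally]

curve = pdd()
print([round(v, 1) for v in curve])  # values fall off with depth
```

A real transport code tracks scattered photons and secondary electrons instead of depositing energy locally, which is why dedicated codes such as BEAMnrc are needed for clinically meaningful dose curves.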
Procedia PDF Downloads 213
6135 Optimization of Wind Off-Grid System for Remote Area: Egyptian Application
Authors: Marwa M. Ibrahim
Abstract:
The objective of this research is to study the technical and economic performance of a wind/diesel/battery (W/D/B) off-grid system supplying a small remote gathering of four families, using the HOMER software package. The second objective is to study the effect of the wind energy system on the cost of generated electricity, considering the cost of reducing CO₂ emissions as an external benefit of wind turbines, which emit no pollutants during the operational phase. The system consists of a small wind turbine, battery storage, and a diesel generator. The electrical energy is to cater to basic needs, for which the daily load pattern is estimated at an 8 kW peak. Net Present Cost (NPC) and Cost of Energy (COE) are used as economic criteria, while the measure of performance is the percentage of power shortage. Technical and economic parameters are defined to estimate the feasibility of the system under study. Optimum system configurations are estimated for the selected site in Egypt. Using HOMER software, the simulation results show that W/D/B systems are economical for the assumed community site, as the price of generated electricity is about 0.285 $/kWh without taking external benefits into consideration and 0.221 $/kWh when CO₂ emissions are taken into consideration. W/D/B systems are more economical than a diesel-only system, for which the COE is 0.432 $/kWh.
Keywords: renewable energy, hybrid energy system, on-off grid system, simulation, optimization and environmental impacts
Procedia PDF Downloads 102
6134 Evaluation of Numerical Modeling of Jet Grouting Design Using in situ Loading Test
Authors: Reza Ziaie Moayed, Ehsan Azini
Abstract:
Jet grouting (JG) is one of the methods of improving and increasing the strength and bearing capacity of soil, in which high-pressure water or grout is injected through nozzles into the soil. During this process, part of the soil and grout particles comes out of the drill borehole, and the rest is mixed with the grout in place; as a result, a mass of modified soil is created. The purpose of this method is to change the soil into a mixture of soil and cement, commonly known as "soil-cement". In this paper, first the principles of high-pressure injection and then the effective parameters in the JG method are described. Then, the tests on samples taken from the columns formed by excavation around the soil-cement columns, as well as the static loading test on the created column, are discussed. In the other part of this paper, the soil behavior models for numerical modeling in PLAXIS software are mentioned. The purpose of this paper is to evaluate the results of numerical modeling based on in-situ static loading tests. The results indicate an acceptable agreement between the results of the tests mentioned and the modeling results. Also, modeling with this software is an appropriate option for assessing the technical feasibility of soil improvement using JG.
Keywords: jet grouting column, soil improvement, numerical modeling, in-situ loading test
Procedia PDF Downloads 143
6133 Dynamic Test for Sway-Mode Buckling of Columns
Authors: Boris Blostotsky, Elia Efraim
Abstract:
Testing of columns in sway mode is performed in order to determine the maximal allowable load, limited by plastic deformation of the columns or their end connections, and the critical load, limited by column stability. The motivation to determine an accurate value of the critical force is as follows: - the critical load is the maximal allowable load for a given column configuration and can be used as a criterion of perfection; - it is used in the calculations prescribed by standards for the design of structural elements under the combined action of compression and bending; - it is used for verification of theoretical stability analyses at various column end conditions. In the present work, a new non-destructive method for determining the critical buckling load of columns in sway mode is proposed. The method allows performing measurements during tests under loads that exceed the column's critical load without loss of stability. The possibility of such loading is achieved by the structure of the loading system. The system is built as a frame with a rigid girder; one of the columns is the tested column, and the other is an additional two-hinged strut. Loading of the frame is carried out by a flexible traction element attached to the girder. The load applied to the tested column can reach values that exceed the critical load through the choice of the parameters of the traction element and the additional strut. The system's lateral stiffness and the column's critical load are obtained by the dynamic method. The experiment planning and the comparison between the experimental and theoretical values were performed based on the developed dependency of the lateral stiffness of the system on the vertical load, taking into account the semi-rigid connections of the column's ends. Agreement between the obtained results was established. The method can be used for testing real full-size columns in industrial conditions.
Keywords: buckling, columns, dynamic method, semi-rigid connections, sway mode
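The dynamic method described above rests on the decrease of the system's lateral stiffness with vertical load; if that dependency is approximately linear, the critical load is where the fitted line extrapolates to zero stiffness. A sketch on synthetic stiffness values generated from an assumed Pcr (in practice K would be identified from measured sway frequencies, and the true dependency need not be exactly linear):

```python
# Extrapolate the critical load Pcr from (load, lateral stiffness) pairs
# assuming K(P) = K0 * (1 - P/Pcr): fit a line K = a + b*P and solve K = 0.

def critical_load(loads, stiffnesses):
    """Pcr from a least-squares line K = a + b*P, solving K(Pcr) = 0."""
    n = len(loads)
    mp = sum(loads) / n
    mk = sum(stiffnesses) / n
    b = (sum((p - mp) * (k - mk) for p, k in zip(loads, stiffnesses))
         / sum((p - mp) ** 2 for p in loads))
    a = mk - b * mp
    return -a / b

P_CR, K0 = 120.0, 4.0e6                    # assumed: kN, N/m
loads = [0.0, 30.0, 60.0, 90.0]            # test loads below Pcr (kN)
ks = [K0 * (1 - p / P_CR) for p in loads]  # stiffness at each load level
print(round(critical_load(loads, ks), 1))  # 120.0
```

The appeal of this identification is exactly what the abstract emphasizes: Pcr is obtained by extrapolation from sub-critical (non-destructive) measurements, so the column never has to be loaded to instability.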
Procedia PDF Downloads 313
6132 Designing a Model for Preparing Reports on the Automatic Earned Value Management Progress by the Integration of Primavera P6, SQL Database, and Power BI: A Case Study of a Six-Storey Concrete Building in Mashhad, Iran
Authors: Hamed Zolfaghari, Mojtaba Kord
Abstract:
Project planners and controllers are frequently faced with the challenge of inadequate software for preparing automatic project progress reports based on actual project information updates. They usually build dashboards in Microsoft Excel, which is local and not accessible online. Another shortcoming is that such dashboards are not linked to planning software such as Microsoft Project, which itself lacks the database required for data storage. This study aimed to propose a model for preparing automatic online project progress reports based on actual project information updates by integrating Primavera P6, an SQL database, and Power BI for a construction project. The designed model could serve project planners and controllers by enabling them to prepare project reports automatically and immediately after updating the project schedule with actual information. To develop the model, the data were entered into P6, and the information was stored in the SQL database. The proposed model could prepare a wide range of reports, such as earned value management, HR, financial, physical, and risk reports, automatically in the Power BI application. Furthermore, the reports could be published and shared online.
Keywords: Primavera P6, SQL, Power BI, EVM, integration management
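The earned value management report at the core of the model reduces to a few standard formulas on planned value (PV), earned value (EV), and actual cost (AC). A sketch with hypothetical period totals (not the Mashhad project's data):

```python
# Core earned-value metrics: schedule/cost variance and the SPI/CPI
# performance indices, as a dashboard layer like Power BI would compute
# them from schedule data stored in the project database.

def evm(pv, ev, ac):
    """Earned-value metrics from planned value, earned value, actual cost."""
    return {
        "SV": ev - pv,    # schedule variance (negative: behind schedule)
        "CV": ev - ac,    # cost variance (negative: over budget)
        "SPI": ev / pv,   # schedule performance index
        "CPI": ev / ac,   # cost performance index
    }

m = evm(pv=1000.0, ev=900.0, ac=950.0)  # hypothetical period totals
print(m["SPI"], round(m["CPI"], 3))     # 0.9 0.947: behind schedule, over cost
```

In the proposed pipeline these values would be recomputed from the SQL database each time the P6 schedule is updated, which is what makes the published Power BI report "automatic".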
Procedia PDF Downloads 108
6131 A Simple Device for Characterizing High Power Electron Beams for Welding
Authors: Aman Kaur, Colin Ribton, Wamadeva Balachandaran
Abstract:
Electron beam welding, due to its inherent advantages, is extensively used for material processing where high precision is required. Especially in the aerospace and nuclear industries, quality requirements are high and the cost of materials and processes is very high, which makes it very important to ensure that beam quality is maintained and checked prior to carrying out the welds. Although the processes in these industries are highly controlled, even minor changes in the operating parameters of the electron gun can produce variations in beam quality large enough to result in poor welding. To measure the beam quality, a simple device has been designed that can be used at high powers. The device consists of two slits along the x and y axes which collect a small portion of the beam current when the beam is deflected over the slits. The signals received from the device are processed in data acquisition hardware and dedicated software developed for the device. The device has been used in controlled laboratory environments to analyse the relationship between the signals and weld quality by varying the focus current. The results showed matching trends in the weld dimensions and the beam characteristics. Further experimental work is being carried out to determine the ability of the device and signal processing software to detect subtle changes in beam quality and to relate these to physical weld quality indicators.
Keywords: electron beam welding, beam quality, high power, weld quality indicators
Procedia PDF Downloads 324
6130 Investigation of the Impact of Family Status and Blood Group on Individuals’ Addiction
Authors: Masoud Abbasalipour
Abstract:
In this study, the impact of family status on addiction was investigated, involving factors such as the parents' literacy level, family size, and individuals' blood group. Statistical tests were employed to scrutinize the relationships among these factors. The statistical population of the study consisted of 338 samples divided into two groups: individuals with addiction and those without addiction, in the city of Amol. The addicted group was selected from individuals visiting the substance abuse treatment center in Amol, and the non-addicted group was randomly selected from individuals in urban and rural areas. The chi-square test was used to examine the presence or absence of relationships among the variables, and Cramér's V test was employed to determine the strength of the relationships between them. Excel software facilitated the initial entry of data, and SPSS software was utilized for the statistical tests. The research results indicated a significant relationship between the parents' education level and individuals' addiction: the education level of the addicted individuals' parents was significantly lower than that of the non-addicted individuals' parents. However, the number of family members and blood group did not significantly affect individuals' susceptibility to addiction.
Keywords: addiction, blood group, parents' literacy level, family status
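The statistical pipeline above (a chi-square test of independence plus Cramér's V for strength of association) can be sketched in a few lines. The 2x2 table below uses the study's total of 338 samples but a hypothetical split between parental education levels:

```python
# Pearson chi-square statistic for an r x c contingency table, plus
# Cramér's V = sqrt(chi2 / (n * (min(r, c) - 1))) as the effect size.
import math

def chi_square(table):
    """Return (n, chi2) for an r x c contingency table of counts."""
    row_t = [sum(row) for row in table]
    col_t = [sum(col) for col in zip(*table)]
    n = sum(row_t)
    chi2 = sum((obs - r * c / n) ** 2 / (r * c / n)
               for row, r in zip(table, row_t)
               for obs, c in zip(row, col_t))
    return n, chi2

def cramers_v(table):
    n, chi2 = chi_square(table)
    k = min(len(table), len(table[0]))
    return math.sqrt(chi2 / (n * (k - 1)))

table = [[120, 49],   # addicted:     low vs high parental education (assumed)
         [70, 99]]    # not addicted: low vs high parental education (assumed)
n, chi2 = chi_square(table)
print(round(chi2, 2), round(cramers_v(table), 3))  # 30.05 0.298
```

The chi-square statistic is then compared against the critical value for the table's degrees of freedom (1 here) to judge significance, which is the step SPSS performs automatically.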
Procedia PDF Downloads 69
6129 Evaluation of Commercial Back-analysis Package in Condition Assessment of Railways
Authors: Shadi Fathi, Moura Mehravar, Mujib Rahman
Abstract:
Over the years, increased demands on railways, the emergence of high-speed trains and heavy axle loads, and the ageing and deterioration of the existing tracks have been imposing costly maintenance actions on the railway sector. The need for developing a fast and cost-efficient non-destructive assessment method for the structural evaluation of railway tracks is therefore critically important. The layer modulus is the main parameter used in the structural design and evaluation of the railway track substructure (foundation). Among the many recently developed NDTs, the Falling Weight Deflectometer (FWD) test, widely used in pavement evaluation, has shown promising results for railway track substructure monitoring. The surface deflection data collected by the FWD are used to estimate the moduli of substructure layers through the back-analysis technique. Although different commercially available back-analysis programs are used for pavement applications, only a limited number of research-based techniques have so far been developed for railway track evaluation. In this paper, the suitability, accuracy, and reliability of the BAKFAA software are investigated. The main rationale for selecting BAKFAA is that it has a relatively straightforward user interface, is freely available, and is widely used in highway and airport pavement evaluation. As part of the study, a finite element (FE) model of a railway track section near Leominster station, Herefordshire, UK, subjected to the FWD test, was developed and validated against available field data. Then, a virtual experimental database (including 218 sets of FWD testing data) was generated using the FE model and employed as the measured database for the BAKFAA software. This database was generated considering various moduli for each layer of the track substructure over a predefined range.
The BAKFAA predictions were compared against cone penetration test (CPT) data (available from the literature; conducted near Leominster station on the same section where the FWD was performed). The results reveal that BAKFAA overestimates the moduli of each substructure layer. To adjust BAKFAA to the CPT data, this study introduces a correlation model to make BAKFAA applicable in railway applications.
Keywords: back-analysis, BAKFAA, railway track substructure, falling weight deflectometer (FWD), cone penetration test (CPT)
Procedia PDF Downloads 129
6128 THRAP2 Gene Identified as a Candidate Susceptibility Gene of Thyroid Autoimmune Diseases Pedigree in Tunisian Population
Authors: Ghazi Chabchoub, Mouna Feki, Mohamed Abid, Hammadi Ayadi
Abstract:
Autoimmune thyroid diseases (AITDs), including Graves’ disease (GD) and Hashimoto’s thyroiditis (HT), are inherited as complex traits. Genetic factors associated with AITDs have been tentatively identified by candidate gene and genome scanning approaches. We analysed three intragenic microsatellite markers in the thyroid hormone receptor associated protein 2 gene (THRAP2), mapped near the D12S79 marker, which has a potential role in immune function and inflammation [THRAP2-1 (TG)n, THRAP2-2 (AC)n and THRAP2-3 (AC)n]. Our study population comprised 12 patients affected with AITDs belonging to a multiplex Tunisian family with a high prevalence of AITDs. Fluorescent genotyping was carried out on ABI 3100 sequencers (Applied Biosystems, USA) with the use of GENESCAN for semi-automated fragment sizing and GENOTYPER peak-calling software. Statistical analysis was performed using the non-parametric LOD score (NPL) with the Merlin software. Merlin outputs non-parametric NPLall (Z) and LOD scores and their corresponding asymptotic P values. The analysis of the three intragenic markers in the THRAP2 gene revealed strong evidence for linkage (NPL=3.68, P=0.00012). Our results suggest a possible role of the THRAP2 gene in AITDs susceptibility in this family.
Keywords: autoimmunity, autoimmune disease, genetic, linkage analysis
Procedia PDF Downloads 126
6127 Review of Currently Adopted Intelligent Programming Tutors
Authors: Rita Garcia
Abstract:
Intelligent Programming Tutors (IPTs) are supplemental educational devices that assist in teaching software development. These systems provide customized learning, allowing the user to select the presentation pace and pedagogical strategy and to recall previous and additional teaching materials, reinforcing learning objectives. In addition, IPTs automatically record an individual’s progress, providing feedback to the instructor and student. These tutoring systems have an advantage over conventional tutoring systems because Intelligent Programming Tutors are not limited to one teaching strategy and can adjust when they detect the user struggling with a concept. The Intelligent Programming Tutor is a category of Intelligent Tutoring Systems (ITS). ITS are available for many fields in education, support different learning objectives, and integrate into other learning tools, improving the student's learning experience. This study provides a comparison of the IPTs currently adopted by the educational community and focuses on their different teaching methodologies and programming languages. The study also examines the ability to integrate the IPT into other educational technologies, such as massive open online courses (MOOCs). The intention of this evaluation is to determine the one system that would best serve in a larger ongoing research project and to provide findings for other institutions looking to adopt an Intelligent Programming Tutor.
Keywords: computer education tools, integrated software development assistance, intelligent programming tutors, tutoring systems
Procedia PDF Downloads 317
6126 Investigating the Effective Physical Factors in the Development of Coastal Ecotourism in Southern Islands of Iran: A Case Study of Hendurabi Island, Iran
Authors: Zahra Khodaee
Abstract:
Background and Objective: The southern islands of Iran with high tourism potential, Kish and Qeshm, and recently Hendurabi, are becoming more and more popular and are objects of increased attention from investors. The Iranian coral reef islands, with the exception of Kish and Qeshm, have not undergone sufficient development. The southern islands of Iran face two pressures: climate change and the growing demand for tourism. The lack of proper planning, inefficient management, and lack of adequate knowledge of the ecosystems of offshore regions have severely damaged this world natural heritage. This study was conducted to consider the relationships among tourism, development, and ecosystem, because there is a need for further work addressing ecotourism on coral islands. Method: Through qualitative research, this study used library studies, field studies, and a survey to examine the physical (objective and subjective) factors of ecotourism development on Hendurabi Island. The results were presented using SPSS software and a descriptive-analytical method. The survey was conducted with the participation of 150 tourists on Kish Island, who were chosen at random and who expressed a desire to travel to Hendurabi Island. The gathered information was analyzed using SPSS software and a statistical t-test. The questionnaire was evaluated using AMOS software to ensure that the questions asked were sufficiently relevant. Findings: The results of this study showed that the physical factors affecting the development of ecotourism fall into two categories, objective and subjective, with fit indices of IFI = 0.911 and CFI = 0.907 for the target community.
Discussion and conclusion: The results were satisfactory in that they showed that eco-tourists attached importance to sea views; quiet, secluded areas; tranquility and security; the quality of the area being visited; and easy access to services. These were the top criteria for those visiting the area, provided environmental compliance is respected. Development management of these regions should maintain appropriate utilization along with sustainable and ecological responsibility.
Keywords: ecotourism, coral reef island, development management, Hendurabi Island
Procedia PDF Downloads 142
6125 Aging Evaluation of Ammonium Perchlorate/Hydroxyl Terminated Polybutadiene-Based Solid Rocket Engine by Reactive Molecular Dynamics Simulation and Thermal Analysis
Authors: R. F. B. Gonçalves, E. N. Iwama, J. A. F. F. Rocco, K. Iha
Abstract:
Propellants based on Hydroxyl Terminated Polybutadiene/Ammonium Perchlorate (HTPB/AP) are the most commonly used in the rocket engines operated by the Brazilian Armed Forces. This work examined the possibility of extending their useful life (currently 10 years) by performing kinetic-chemical analyses of the energetic material via Differential Scanning Calorimetry (DSC) and by computer simulation of the aging process using the Large-scale Atomic/Molecular Massively Parallel Simulator (LAMMPS) software. Thermal analysis via DSC was performed in triplicate at three heating rates (5 ºC/min, 10 ºC/min, and 15 ºC/min) on propellant from a rocket motor with an 11-year shelf life, using the Arrhenius equation with the Ozawa and Kissinger kinetic methods to obtain its activation energy, allowing comparison with data from the manufacturing period (standard motor). In addition, the kinetic parameters of internal pressure of the combustion chamber in eight rocket engines with 11 years of shelf life were also acquired for comparison with the engine start-up data.
Keywords: shelf-life, thermal analysis, Ozawa method, Kissinger method, LAMMPS software, thrust
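The Kissinger analysis mentioned here reduces to a linear fit: plotting ln(β/Tp²) against 1/Tp gives a line of slope −Ea/R, where β is the heating rate and Tp the DSC exotherm peak temperature. A minimal sketch in Python, using purely illustrative peak temperatures rather than the study's measurements:

```python
import math

R = 8.314  # gas constant, J/(mol*K)

# Hypothetical DSC exotherm peak temperatures Tp (K) at three heating
# rates beta (K/min) -- illustrative values, not the study's data.
data = [(5.0, 601.0), (10.0, 612.0), (15.0, 619.0)]  # (beta, Tp)

# Kissinger plot: ln(beta/Tp^2) versus 1/Tp; least-squares slope = -Ea/R
xs = [1.0 / tp for _, tp in data]
ys = [math.log(beta / tp**2) for beta, tp in data]

n = len(data)
x_mean = sum(xs) / n
y_mean = sum(ys) / n
slope = sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, ys)) / \
        sum((x - x_mean) ** 2 for x in xs)

activation_energy = -slope * R  # J/mol
print(f"Kissinger Ea = {activation_energy / 1000:.0f} kJ/mol")
```

In practice the three peak temperatures come from the triplicate DSC runs at each heating rate, and the aged motor's Ea is compared against the standard motor's value.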
Procedia PDF Downloads 127
6124 Reactivation of Hydrated Cement and Recycled Concrete Powder by Thermal Treatment for Partial Replacement of Virgin Cement
Authors: Gustave Semugaza, Anne Zora Gierth, Tommy Mielke, Marianela Escobar Castillo, Nat Doru C. Lupascu
Abstract:
The generation of Construction and Demolition Waste (CDW) has increased enormously worldwide due to the enhanced need for construction, renovation, and demolition of structures. Several studies investigated the use of CDW materials in the production of new concrete and indicated the lower mechanical properties of the resulting concrete. Many other researchers considered the possibility of using Hydrated Cement Powder (HCP) to replace part of the Ordinary Portland Cement (OPC), but only very few investigated the use of Recycled Concrete Powder (RCP) from CDW. The partial replacement of OPC in making new concrete is intended to decrease the CO₂ emissions associated with OPC production. However, RCP and HCP need treatment before they can be used to produce new concrete of the required mechanical properties. The thermal treatment method has proven to improve HCP properties before use. Previous research has stated that for using HCP in concrete, optimum results are achievable by heating HCP between 400°C and 800°C. The optimum heating temperature depends on the type of cement used to make the Hydrated Cement Specimens (HCS), the crushing and heating method of the HCP, and the curing method of the Rehydrated Cement Specimens (RCS). This research assessed the quality of the recycled materials using techniques such as X-ray Diffraction (XRD), Differential Scanning Calorimetry (DSC) and Thermogravimetry (TG), Scanning Electron Microscopy (SEM), and X-ray Fluorescence (XRF). The recycled materials were thermally pretreated at temperatures from 200°C to 1000°C. Additionally, the research investigated to what extent the thermally treated recycled cement could partially replace OPC and whether the new concrete produced would achieve the required mechanical properties. The mechanical properties were evaluated on the RCS, obtained by mixing the Dehydrated Cement Powder and Recycled Powder (DCP and DRP) with water (w/c = 0.6 and w/c = 0.45).
Compressive strength was measured using a compression testing machine, and flexural strength was assessed using the three-point bending test.
Keywords: hydrated cement powder, dehydrated cement powder, recycled concrete powder, thermal treatment, reactivation, mechanical performance
Procedia PDF Downloads 153
6123 Development and Testing of an Instrument to Measure Beliefs about Cervical Cancer Screening among Women in Botswana
Authors: Ditsapelo M. McFarland
Abstract:
Background: Despite the availability of Pap smear services in urban areas in Botswana, most women in such areas do not seem to screen regularly for prevention of cervical cancer. Reasons for non-use of the available Pap smear services are not well understood. Beliefs about cancer may influence participation in cancer screening among these women. The purpose of this study was to develop an instrument to measure beliefs about cervical cancer and Pap smear screening among Black women in Botswana and to evaluate the psychometric properties of the instrument. Significance: Instruments designed to measure beliefs about cervical cancer and screening among Black women in Botswana, as well as in the surrounding region, are presently not available. Valid and reliable instruments are needed for exploration of the women's beliefs about cervical cancer. Conceptual Framework: The Health Belief Model (HBM) provided the conceptual framework for the study. Methodology: The study was done in four phases. Phase 1, item generation: 15 items were generated from the literature review and qualitative data for each of four conceptually defined HBM constructs: perceived susceptibility, severity, benefits, and barriers (Version 1). Phase 2, content validity: four experts who were advanced practice nurses of African descent and were familiar with the content and the HBM evaluated the items. Experts rated the items on a 4-point Likert scale: 1 = not relevant, 2 = somewhat relevant, 3 = relevant, and 4 = very relevant. Fifty-five items were retained for instrument development: perceived susceptibility - 11, severity - 14, benefits - 15, and barriers - 15, all measured on a 4-point Likert scale ranging from strongly disagree (1) to strongly agree (4) (Version 2). Phase 3, pilot testing: the instrument was pilot tested on a convenience sample of 30 women in Botswana and revised as needed.
Phase 4, reliability: the revised instrument (Version 3) was administered to a larger sample of women in Botswana (n = 300) for reliability testing. The sample included women who were Batswana by birth and descent, were aged 30 years and above, and could complete an English questionnaire. Data were collected with the assistance of trained research assistants. Major findings: Confirmatory factor analysis of the 55 items found that a number of items did not adequately load in a four-factor solution. Items that exhibited reasonable reliability and had a low frequency of missing values (n = 36) were retained: perceived barriers (14 items), perceived benefits (8 items), perceived severity (4 items), and perceived susceptibility (10 items). Factor analysis (principal components) for a four-factor solution using varimax rotation demonstrated that these four factors explained 43% of the variation in the 36 items. Conclusion: Reliability analysis using Cronbach's alpha gave generally satisfactory results, with values from 0.53 to 0.89.
Keywords: cervical cancer, factor analysis, psychometric evaluation, varimax rotation
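Cronbach's alpha, used for the reliability analysis above, can be computed directly from respondent-by-item scores: alpha = k/(k−1) · (1 − Σ item variances / variance of totals). A minimal sketch; the sample responses below are hypothetical, not the study's data:

```python
def cronbach_alpha(scores):
    # scores: one row per respondent; each row holds that respondent's
    # ratings on the k items of a single subscale.
    k = len(scores[0])

    def variance(values):  # sample variance
        m = sum(values) / len(values)
        return sum((v - m) ** 2 for v in values) / (len(values) - 1)

    item_vars = [variance([row[j] for row in scores]) for j in range(k)]
    total_var = variance([sum(row) for row in scores])
    return (k / (k - 1)) * (1 - sum(item_vars) / total_var)

# Hypothetical responses on a 4-item subscale (1-4 Likert scale)
sample = [
    [4, 3, 4, 4],
    [2, 2, 3, 2],
    [3, 3, 3, 4],
    [1, 2, 1, 2],
    [4, 4, 3, 4],
]
print(f"alpha = {cronbach_alpha(sample):.2f}")
```

Values near the study's upper bound (0.89) indicate strong internal consistency within a subscale, while values near its lower bound (0.53) are marginal.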
Procedia PDF Downloads 126
6122 Settlement Prediction in Cape Flats Sands Using Shear Wave Velocity – Penetration Resistance Correlations
Authors: Nanine Fouche
Abstract:
The Cape Flats is a low-lying sand-covered expanse of approximately 460 square kilometres, situated to the southeast of the central business district of Cape Town in the Western Cape of South Africa. The aeolian sands masking this area are often loose and compressible in the upper 1 m to 1.5 m of the surface, and there is a general exceedance of the maximum allowable settlement in these sands. The settlement of shallow foundations on Cape Flats sands is commonly predicted using the results of in-situ tests such as the SPT or DPSH due to the difficulty of retrieving undisturbed samples for laboratory testing. Varying degrees of accuracy and reliability are associated with these methods. More recently, shear wave velocity (Vs) profiles obtained from seismic testing, such as continuous surface wave tests (CSW), are being used for settlement prediction. Such predictions have the advantage of considering the non-linear stress-strain behaviour of soil and the degradation of stiffness with increasing strain. CSW tests are rarely executed in the Cape Flats, whereas SPTs are commonly performed. For this reason, and to facilitate better settlement predictions in Cape Flats sand, equations representing shear wave velocity (Vs) as a function of SPT blow count (N60) and vertical effective stress (σv') were generated by statistical regression of site investigation data. To reveal the most appropriate method of overburden correction, analyses were performed with a separate overburden term (Pa/σv') as well as using stress-corrected shear wave velocity and SPT blow counts (correcting Vs and N60 to Vs1 and (N1)60, respectively). Shear wave velocity profiles and SPT blow count data from three sites masked by Cape Flats sands were utilised to generate 80 Vs-SPT N data pairs for analysis. Investigated terrains included sites in the suburbs of Athlone, Muizenberg, and Atlantis, all underlain by windblown deposits comprising fine and medium sand with varying fines contents.
Elastic settlement analysis was also undertaken for the Cape Flats sands using a non-linear stepwise method based on small-strain stiffness estimates obtained from the best Vs-N60 model, and compared to settlement estimates using the general elastic solution with stiffness profiles determined using Stroud's (1989) and Webb's (1969) SPT N60-E transformation models. Stroud's method considers strain level indirectly, whereas Webb's method does not take account of the variation in elastic modulus with strain. The expression of Vs in terms of N60 and Pa/σv' derived from the Atlantis data set revealed the best fit, with R² = 0.83 and a standard error of 83.5 m/s. The less accurate Vs-SPT N relations associated with the combined data set are presumably the result of the inversion routines used in the analysis of the CSW results, which showed significant variation in relative density and stiffness with depth. The regression analyses revealed that the inclusion of a separate overburden term in the regression of Vs and N60 produces improved fits, as opposed to the stress-corrected equations, in which the R² of the regression is notably lower. It is the correction of Vs and N60 to Vs1 and (N1)60 with empirical constants 'n' and 'm' prior to regression that introduces bias with respect to overburden pressure. When comparing settlement prediction methods, both Stroud's method (considering strain level indirectly) and the small-strain stiffness method predict higher stiffnesses for medium dense and dense profiles than Webb's method, which takes no account of strain level in the determination of soil stiffness. Webb's method appears to be suitable for loose sands only. The Versak software appears to underestimate differences in settlement between square and strip footings of similar width.
In conclusion, settlement analysis using small-strain stiffness data from the proposed Vs-N60 model for Cape Flats sands provides a way to take account of the non-linear stress-strain behaviour of the sands when calculating settlement.
Keywords: sands, settlement prediction, continuous surface wave test, small-strain stiffness, shear wave velocity, penetration resistance
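A regression of Vs against SPT blow count of the kind described above is typically a power law fitted as a straight line in log-log space. A minimal single-predictor sketch; the data pairs below are hypothetical, and the study's separate overburden term (Pa/σv') is omitted for brevity:

```python
import math

# Hypothetical (N60, Vs in m/s) pairs for a windblown sand profile --
# illustrative values, not the study's 80 site-investigation data pairs.
pairs = [(8, 160.0), (12, 185.0), (18, 215.0), (25, 240.0), (33, 270.0)]

# Fit Vs = a * N60^b by least squares in log-log space:
# ln(Vs) = ln(a) + b * ln(N60)
xs = [math.log(n60) for n60, _ in pairs]
ys = [math.log(vs) for _, vs in pairs]
n = len(pairs)
x_mean, y_mean = sum(xs) / n, sum(ys) / n
b = sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, ys)) / \
    sum((x - x_mean) ** 2 for x in xs)
a = math.exp(y_mean - b * x_mean)

print(f"Vs ~ {a:.1f} * N60^{b:.2f}")
```

Adding the overburden term, as in the study's best Atlantis model, turns this into a two-predictor regression in log space, but the fitting principle is the same.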
Procedia PDF Downloads 175
6121 Using Industrial Service Quality to Assess Service Quality Perception in Television Advertisement: A Case Study
Authors: Ana L. Martins, Rita S. Saraiva, João C. Ferreira
Abstract:
Much effort has been placed on the assessment of perceived service quality. Several models can be found in the literature, but these are mainly focused on business-to-consumer (B2C) relationships. Literature on how to assess perceived quality in business-to-business (B2B) contexts is scarce, both conceptually and in terms of application. This research aims at filling this gap by applying INDSERV to a case study. Under this scope, it analyzes the adequacy of the proposed assessment tool to a context other than the one where it was developed and, in doing so, analyzes the perceived quality of the advertisement service provided by a specific television network to its B2B customers. The INDSERV scale was adopted and applied to a sample of 33 clients via questionnaires adapted to interviews. Data were collected in person or by phone. Both quantitative and qualitative data collection was performed. Qualitative data analysis followed a content analysis protocol; quantitative analysis used hypothesis testing. Findings allowed the conclusion that the perceived quality of the television service provided by the network is very positive, with Soft Process Quality being the parameter revealing the highest perceived quality of the service, as opposed to Potential Quality. To this end, some comments and suggestions were made by the clients regarding each of these service quality parameters. Based on the hypothesis testing, it was noticed that only advertisement clients that have maintained a connection to the television network for 5 to 10 years show a significantly different perception of the TV advertisement service provided by the company as far as the Hard Process Quality parameter is concerned. Through content analysis of the collected data, it was possible to obtain the percentage of clients who share the same opinions and suggestions for improvement.
Finally, based on one of the four service quality parameters in a B2B context, managerial suggestions were developed aiming at improving the perceived quality of the television network's advertisement service.
Keywords: B2B, case study, INDSERV, perceived service quality
Procedia PDF Downloads 206
6120 The Use of Hearing Protection Devices and Hearing Loss in Steel Industry Workers in Samut Prakan Province, Thailand
Authors: Petcharat Kerdonfag, Surasak Taneepanichskul, Winai Wadwongtham
Abstract:
Background: Although there are no effective treatments for Noise-Induced Hearing Loss (NIHL), it is definitely preventable by promoting the use of Hearing Protection Devices (HPDs) among workers who have been exposed to excessive noise for a long period. Objectives: The objectives of this study were to explore the use of HPDs among steel industry workers in the high-noise-level zone in Samut Prakan province, Thailand, and to examine the relationship between HPD use and hearing loss. Materials and Methods: In this cross-sectional study, ninety-three eligible participants were recruited in the designated zone of higher noise (> 85 dBA) of two factories, using simple random sampling. The use of HPDs was gathered by a self-record form, examined and confirmed by the research team. Hearing loss was assessed by audiometric screening at the regional Samut Prakan hospital: if the average threshold level exceeded 25 dB at the high frequencies (4 and 6 kHz) in either ear, a participant was classified as having hearing loss. Data were collected from October to December 2016. All participants were examined by the same examiners for validity, and audiometric testing was performed after participants exposed to high noise levels had been away from workplace noise for at least 14 hours. Results: Sixty participants (64.5%) had a secondary level of education. The mean percentage of time using HPDs was 60.5% (SD = 25.34). Sixty-seven participants (72.0%) had abnormal hearing and still needed to increase their HPD use, reporting a lower percentage of time using HPDs (mean = 37.01, SD = 23.81) than those with normal hearing (mean = 45.77, SD = 28.44); however, this difference between the two groups was not significant. Conclusion: The findings of this study confirm that steel industry workers still need to be motivated to use HPDs regularly.
Future research should pay more attention to creating meaningful innovations for steel industry workers.
Keywords: hearing protection devices, noise induced hearing loss, audiometric testing, steel industry
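The screening rule described above (average threshold at the 4 and 6 kHz high frequencies above 25 dB in either ear) can be expressed directly in code; the worker records below are hypothetical:

```python
# Classify a worker as having hearing loss if the mean threshold at
# 4 kHz and 6 kHz exceeds 25 dB in either ear (the study's criterion).
CUTOFF_DB = 25
HIGH_FREQS = (4000, 6000)  # Hz

def has_hearing_loss(thresholds_db):
    # thresholds_db: {"left": {4000: dB, 6000: dB}, "right": {...}}
    for ear in ("left", "right"):
        mean = sum(thresholds_db[ear][f] for f in HIGH_FREQS) / len(HIGH_FREQS)
        if mean > CUTOFF_DB:
            return True
    return False

worker = {"left": {4000: 30, 6000: 35}, "right": {4000: 20, 6000: 25}}
print(has_hearing_loss(worker))
```

Here the left-ear average (32.5 dB) exceeds the cutoff, so this hypothetical worker would fall into the abnormal-hearing group.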
Procedia PDF Downloads 256
6119 Fabrication of Antimicrobial Dental Model Using Digital Light Processing (DLP) Integrated with 3D-Bioprinting Technology
Authors: Rana Mohamed, Ahmed E. Gomaa, Gehan Safwat, Ayman Diab
Abstract:
Background: Bio-fabrication is a multidisciplinary research field that combines principles, fabrication techniques, and protocols from several fields. The open-source-software movement supports the use of open-source licenses for some or all software as part of the broader notion of open collaboration. Additive manufacturing (AM) is the concept of 3D printing, a manufacturing method that adds material layer by layer using computer-aided designs (CAD). Several types of AM systems are used, and they can be categorized by the type of process involved. One of these AM technologies is Digital Light Processing (DLP), a 3D printing technology that rapidly cures a photopolymer resin to create hard scaffolds. DLP uses a projected light source to cure (harden or crosslink) an entire layer at once. Current applications of DLP are focused on dental and medical uses; further developments in this field have led to the revolutionary field of 3D bioprinting. The open-source movement was started to spread the concept of open-source software and to provide software or hardware that is cheaper, more reliable, and of better quality. Objective: Modification of a desktop 3D printer into a 3D bioprinter and integration of DLP technology with bio-fabrication to produce an antibacterial dental model. Method: A desktop 3D printer was modified into a 3D bioprinter. Gelatin hydrogel and sodium alginate hydrogel were prepared at different concentrations. Rhizomes of Zingiber officinale, flower buds of Syzygium aromaticum, and bulbs of Allium sativum were extracted, and extracts were prepared at different levels (powder, aqueous extract, total oil, and essential oil) for antibacterial testing. The agar well diffusion method with E. coli was used to perform the sensitivity test for the antibacterial activity of the extracts of Zingiber officinale, Syzygium aromaticum, and Allium sativum.
Lastly, DLP printing was performed to produce several dental models with the natural extracts combined with hydrogel to represent and simulate the hard and soft tissues. Result: The desktop 3D printer was modified into a 3D bioprinter using the open-source software Marlin and modified custom-made 3D printed parts. Sodium alginate hydrogel and gelatin hydrogel were prepared at 5% (w/v), 10% (w/v), and 15% (w/v). Resin was integrated with the natural extracts of Zingiber officinale rhizome, Syzygium aromaticum flower buds, and Allium sativum bulbs at 1-3% for each extract. Finally, the antimicrobial dental model was printed, exhibited antimicrobial activity, and was merged with sodium alginate hydrogel. Conclusion: The open-source movement was successful in modifying and producing a low-cost desktop 3D bioprinter, showing the potential for further enhancement in this scope. Additionally, integrating DLP technology with bioprinting is a promising step toward harnessing the antimicrobial activity of natural products.
Keywords: 3D printing, 3D bio-printing, DLP, hydrogel, antibacterial activity, Zingiber officinale, Syzygium aromaticum, Allium sativum, Panax ginseng, dental applications
Procedia PDF Downloads 94
6118 Numerical Modelling and Soil-structure Interaction Analysis of Rigid Ballast-less and Flexible Ballast-based High-speed Rail Track-embankments Using Software
Authors: Tokirhusen Iqbalbhai Shaikh, M. V. Shah
Abstract:
With an increase in travel demand and a reduction in travel time, high-speed rail (HSR) has been introduced in India. Simplified 3-D finite element modelling is necessary to predict the stability and deformation characteristics of railway embankments and soil-structure interaction behaviour under high-speed design requirements for Indian soil conditions. The objective of this study is to analyse rigid ballast-less and flexible ballast-based high-speed rail track embankments for the various critical conditions to which they are subjected, viz. static condition, moving train condition, sudden brake application, and derailment, using software. The input parameters for the analysis are soil type, thickness of the relevant strata, unit weight, Young's modulus, Poisson's ratio, undrained cohesion, friction angle, dilatancy angle, modulus of subgrade reaction, design speed, and other relevant anticipated data. Eurocode 1, IRS-004(D), IS 1343, IRS specifications, the California high-speed rail technical specifications, and the NHSRCL feasibility report are followed in this study.
Keywords: soil structure interaction, high speed rail, numerical modelling, PLAXIS3D
Procedia PDF Downloads 110
6117 Traditional Drawing, BIM and Erudite Design Process
Authors: Maryam Kalkatechi
Abstract:
Nowadays, parametric design, scientific analysis, and digital fabrication are dominant. Many architectural practices increasingly seek to incorporate advanced digital software and fabrication in their projects. An erudite design process that combines digital and practical aspects within a strong methodological frame resulted from the dissertation research. The digital aspects are the progressive advancements in algorithmic design and simulation software. These aspects have helped firms develop more holistic concepts at the early stage and maintain collaboration among disciplines during the design process. The erudite design process enhances current design processes by encouraging the designer to implement construction and architecture knowledge within the algorithm to produce successful design processes. The erudite design process also involves ongoing improvements in applying the new method of 3D printing in construction. This is achieved through 'data-sketches'. The term 'data-sketch' was developed by the author in the recently completed dissertation; it accommodates the decisions of the architect within the algorithm. This paper introduces the erudite design process and its components and summarizes the application of this process in the development of the '3D printed construction unit'. This paper contributes to bridging academia and practice with advanced technology by presenting a design process that transfers the dominance of the tool to the learned architect and encourages innovation in design processes.
Keywords: erudite, data-sketch, algorithm design in architecture, design process
Procedia PDF Downloads 275
6116 Accurate Position Electromagnetic Sensor Using Data Acquisition System
Authors: Z. Ezzouine, A. Nakheli
Abstract:
This paper presents a high-accuracy position electromagnetic sensor system (HPESS) that is applicable to moving object detection. The authors have developed a high-performance position sensor prototype dedicated to the students' laboratory. The challenge was to obtain a highly accurate, real-time sensor able to calculate position, length, or displacement. An electromagnetic solution based on a two-coil induction principle was adopted. The HPESS converts mechanical motion to electric energy without direct contact, and the output signal can then be fed to an electronic circuit. The voltage output change from the sensor is captured by a data acquisition system using LabVIEW software, and the displacement of the moving object is determined. The measured data are transmitted to a PC in real time via a DAQ (NI USB-6281). This paper also describes the data acquisition analysis and the conditioning card developed specially for sensor signal monitoring. The data are then recorded and viewed through a user interface written in National Instruments LabVIEW software. On-line displays of the time and voltage of the sensor signal provide a user-friendly data acquisition interface. The sensor provides an uncomplicated, accurate, reliable, inexpensive transducer for highly sophisticated control systems.
Keywords: electromagnetic sensor, accuracy, data acquisition, position measurement
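Converting the captured voltages to displacement typically applies a calibration curve to each sample in the acquired buffer. A minimal linear sketch, with assumed (not measured) gain and offset constants standing in for the prototype's actual calibration:

```python
# Hypothetical linear calibration: displacement (mm) = gain * voltage + offset.
GAIN_MM_PER_V = 12.5  # assumed calibration slope, mm per volt
OFFSET_MM = -1.0      # assumed calibration offset, mm

def to_displacement(voltages):
    """Map a buffer of sampled sensor voltages to displacements in mm."""
    return [GAIN_MM_PER_V * v + OFFSET_MM for v in voltages]

samples = [0.08, 0.40, 0.81, 1.22]  # volts read from the DAQ buffer
print(to_displacement(samples))
```

In the actual system this mapping would run inside the LabVIEW acquisition loop, with the constants obtained by calibrating the two-coil sensor against known displacements.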
Procedia PDF Downloads 285
6115 Hardware Implementation on Field Programmable Gate Array of Two-Stage Algorithm for Rough Set Reduct Generation
Authors: Tomasz Grzes, Maciej Kopczynski, Jaroslaw Stepaniuk
Abstract:
The rough sets theory developed by Prof. Z. Pawlak is one of the tools that can be used in intelligent systems for data analysis and processing. Banking, medicine, image recognition, and security are among the possible fields of utilization. In all these fields, the amount of collected data is increasing quickly, and with that increase, computation speed becomes the critical factor. Data reduction is one solution to this problem. Removing redundancy in rough sets can be achieved with the reduct. Many algorithms for generating the reduct have been developed, but most of them are only software implementations and therefore have many limitations. A microprocessor uses a fixed word length and consumes considerable time both fetching and processing instructions and data; consequently, software-based implementations are relatively slow. Hardware systems do not have these limitations and can process the data faster than software. A reduct is a subset of the condition attributes that preserves the discernibility of the objects. For a given decision table there can be more than one reduct. The core is the set of all indispensable condition attributes: none of its elements can be removed without affecting the classification power of all condition attributes, and every reduct contains all the attributes from the core. In this paper, the hardware implementation of a two-stage greedy algorithm to find one reduct is presented. The decision table is used as the input, and the output of the algorithm is a superreduct, i.e., a reduct with some additional removable attributes. The first stage of the algorithm calculates the core using the discernibility matrix. The second stage generates the superreduct by enriching the core with the most common attributes, i.e., attributes that are more frequent in the decision table.
The algorithm described above has two disadvantages: i) it generates a superreduct instead of a reduct, and ii) the first stage may be unnecessary if the core is empty. But for systems focused on fast computation of the reduct, the first disadvantage is not the key problem. The core calculation can be achieved with a combinational logic block and thus adds relatively little time to the whole process. The algorithm presented in this paper was implemented in a Field Programmable Gate Array (FPGA) as a digital device consisting of blocks that process the data in a single step. Calculating the core is done by comparators connected to a block called the 'singleton detector', which detects whether the input word contains only a single 'one'. Calculating the number of occurrences of an attribute is performed in a combinational block made up of a cascade of adders. The superreduct generation process is iterative and thus needs a sequential circuit for controlling the calculations. For research purposes, the algorithm was also implemented in the C language and run on a PC, and the execution times of the reduct calculation in hardware and software were compared. Results show an increase in the speed of data processing.
Keywords: data reduction, digital systems design, field programmable gate array (FPGA), reduct, rough set
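As a software reference for the two-stage flow (core from singleton discernibility entries, then greedy enrichment by attribute frequency), a minimal sketch on a hypothetical three-attribute decision table; this mirrors the logic, not the FPGA structure:

```python
from collections import Counter
from itertools import combinations

# Hypothetical decision table: (condition attribute values, decision)
rows = [
    ((1, 0, 0), 0),
    ((1, 1, 0), 1),
    ((0, 0, 1), 1),
    ((0, 1, 1), 0),
]

# Discernibility entries: for each pair of rows with different decisions,
# the set of attributes on which the rows differ.
entries = []
for (a, da), (b, db) in combinations(rows, 2):
    if da != db:
        entries.append({i for i in range(len(a)) if a[i] != b[i]})

# Stage 1: the core consists of attributes appearing as singleton entries
# (the role of the hardware 'singleton detector').
core = {next(iter(e)) for e in entries if len(e) == 1}

# Stage 2: greedily grow the core into a superreduct, repeatedly adding
# the most frequent attribute until every entry is covered.
superreduct = set(core)
freq = Counter(i for e in entries for i in e)
while any(not (e & superreduct) for e in entries):
    best = max((i for i in freq if i not in superreduct), key=freq.__getitem__)
    superreduct.add(best)

print("core:", sorted(core), "superreduct:", sorted(superreduct))
```

The result covers every discernibility entry, so it preserves discernibility; it may still contain removable attributes, which is exactly disadvantage i) above.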
Procedia PDF Downloads 219
6114 Stability of Pump Station Cavern in Chagrin Shale with Time
Authors: Mohammad Moridzadeh, Mohammad Djavid, Barry Doyle
Abstract:
An assessment of the long-term stability of a cavern in Chagrin shale excavated by the sequential excavation method was performed during and after construction. During the excavation of the cavern, deformations of the rock mass were measured at the excavation surface and within the rock mass by surface and deep measurement instruments. Rock deformations measured during construction appeared to result from the as-built excavation sequence, which had potentially disturbed the rock and its behavior. Some additional time-dependent rock deformations were also observed during and after excavation. Several opinions have been expressed to explain this time-dependent deformation, including stress changes induced by excavation, strain softening (or creep) in the beddings with and without clay, and creep of the shaley rock under compressive stresses. In order to analyze and replicate the rock behavior observed during excavation, including elastic, plastic, and time-dependent deformation during and after excavation, Finite Element Analysis (FEA) was performed. The analysis was also intended to estimate the long-term deformation of the rock mass around the excavation. Rock mass behavior, including time-dependent deformation, was measured by means of rock surface convergence points, MPBXs, extended creep testing on the long anchors, and load history data from load cells attached to several long anchors. Direct creep testing of Chagrin shale was performed on core samples from the wall of the Pump Room. Results of these measurements were used to calibrate the FEA of the excavation. These analyses incorporate time-dependent constitutive modeling of the rock to evaluate the potential long-term movement in the roof, walls, and invert of the cavern.
The modeling was performed due to concerns regarding the unanticipated behavior of the rock mass, as well as to forecast the long-term deformation and stability of the rock around the excavation.
Keywords: cavern, Chagrin shale, creep, finite element
Procedia PDF Downloads 352
6113 Hansen Solubility Parameters, Quality by Design Tool for Developing Green Nanoemulsion to Eliminate Sulfamethoxazole from Contaminated Water
Authors: Afzal Hussain, Mohammad A. Altamimi, Syed Sarim Imam, Mudassar Shahid, Osamah Abdulrahman Alnemer
Abstract:
Exhaustive application of sulfamethoxazole (SUX) has become a global threat to human health due to water contamination from diverse sources. This study addressed the combined application of Hansen solubility parameters (HSPiP software) and the Quality by Design tool for developing various green nanoemulsions. The HSPiP program assisted in screening suitable excipients based on Hansen solubility parameters and experimental solubility data. Various green nanoemulsions were prepared and characterized for globular size, size distribution, zeta potential, and removal efficiency. Design Expert (DoE) software further helped to identify critical factors with a direct impact on percent removal efficiency, size, and viscosity. Morphology was visualized under transmission electron microscopy (TEM). Finally, the treated water was analyzed to confirm the absence of the tested drug, employing the ICP-OES (inductively coupled plasma optical emission spectroscopy) technique and HPLC (high-performance liquid chromatography). Results showed that HSPiP predicted a biocompatible lipid, a safe surfactant (lecithin), and propylene glycol (PG). Experimental solubility of the drug in the predicted excipients was quite convincing and vindicated the prediction. Various green nanoemulsions were fabricated and evaluated in vitro. Globular size (100-300 nm), PDI (0.1-0.5), zeta potential (~25 mV), and removal efficiency (%RE = 70-98%) were found to be in the acceptable range for deciding input factors and levels in DoE. The experimental design tool assisted in identifying the most critical variables controlling %RE and the optimized content of the nanoemulsion under set constraints. Dispersion time was varied from 5 to 30 min. Finally, the ICP-OES and HPLC techniques corroborated the absence of SUX in the treated water. Thus, the strategy is simple, economic, selective, and efficient.
Keywords: quality by design, sulfamethoxazole, green nanoemulsion, water treatment, ICP-OES, Hansen program (HSPiP software)
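Excipient screening with Hansen parameters rests on the Hansen distance Ra between two materials, Ra² = 4(δD₁−δD₂)² + (δP₁−δP₂)² + (δH₁−δH₂)², with smaller Ra suggesting better affinity. A minimal sketch; the parameter values below are illustrative, not taken from the HSPiP screening in this study:

```python
import math

def hansen_distance(hsp1, hsp2):
    # hsp tuples: (dispersion dD, polar dP, hydrogen-bonding dH), MPa^0.5
    dd, dp, dh = (a - b for a, b in zip(hsp1, hsp2))
    return math.sqrt(4 * dd**2 + dp**2 + dh**2)

# Hypothetical HSPs for the drug and two candidate excipients
drug = (19.6, 10.3, 11.3)
excipients = {
    "propylene glycol": (16.8, 10.4, 21.3),
    "oleic acid": (16.0, 2.8, 6.2),
}

for name, hsp in excipients.items():
    ra = hansen_distance(drug, hsp)
    print(f"{name}: Ra = {ra:.1f} MPa^0.5")
```

HSPiP additionally compares Ra to an interaction radius R0 (the RED number Ra/R0) when ranking candidates; the distance above is the core of that calculation.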
Procedia PDF Downloads 82
6112 Scientific Linux Cluster for BIG-DATA Analysis (SLBD): A Case of Fayoum University
Authors: Hassan S. Hussein, Rania A. Abul Seoud, Amr M. Refaat
Abstract:
Scientific researchers face the analysis of very large data sets, which are growing at a noticeable rate with today's and tomorrow's technologies. Hadoop and Spark are software frameworks developed for this purpose, and the Hadoop framework is suitable for many different hardware platforms. In this research, a Scientific Linux cluster for Big Data analysis (SLBD) is presented. SLBD runs open-source software with large computational capacity on a high-performance cluster infrastructure. SLBD is composed of one cluster containing identical, commodity-grade computers interconnected via a small LAN, with a fast switch and Gigabit-Ethernet cards connecting four nodes. Cloudera Manager is used to configure and manage an Apache Hadoop stack. Hadoop is a framework that allows storing and processing big data across the cluster using the MapReduce algorithm. The MapReduce algorithm divides the task into smaller tasks, which are assigned to the network nodes; the algorithm then collects the results to form the final result dataset. The SLBD clustering system allows fast and efficient processing of large amounts of data resulting from different applications. SLBD also provides high performance, high throughput, high availability, expandability, and cluster scalability.
Keywords: big data platforms, cloudera manager, Hadoop, MapReduce
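The divide-collect flow described above can be illustrated in miniature with the classic word-count pattern: a map phase emits key-value pairs, a shuffle groups them by key, and a reduce phase aggregates each group. A toy, single-process sketch (real Hadoop distributes these phases across the cluster nodes):

```python
from collections import defaultdict

def map_phase(chunk):
    # Emit a (word, 1) pair for every word in one input chunk.
    return [(word, 1) for word in chunk.split()]

def shuffle(pairs):
    # Group emitted values by key, as the framework does between phases.
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    # Aggregate each key's values into the final result dataset.
    return {key: sum(values) for key, values in groups.items()}

chunks = ["big data big cluster", "data cluster data"]  # one chunk per node
mapped = [pair for chunk in chunks for pair in map_phase(chunk)]
result = reduce_phase(shuffle(mapped))
print(result)
```

On the SLBD cluster, each chunk would be a file block in HDFS and each map or reduce task would run on one of the four nodes, with Hadoop handling the shuffle over the LAN.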
Procedia PDF Downloads 359