Search results for: Dufour’s number
7702 The Use of Random Set Method in Reliability Analysis of Deep Excavations
Authors: Arefeh Arabaninezhad, Ali Fakher
Abstract:
Since deterministic analysis methods fail to take system uncertainties into account, probabilistic and non-probabilistic methods have been suggested. Geotechnical analyses are used to determine the stress and deformation caused by construction; accordingly, many input variables which depend on ground behavior are required. The Random Set approach is an applicable reliability analysis method when comprehensive sources of information are not available. Using the Random Set method, smooth extremes on system responses are obtained with a relatively small number of simulations compared to fully probabilistic methods. The random set approach has therefore been proposed for reliability analysis in geotechnical problems. In the present study, the application of the random set method to the reliability analysis of deep excavations is investigated through three deep excavation projects which were monitored during the excavation process. A finite element code is utilized for numerical modeling. Two expected ranges, from different sources of information, are established for each input variable, and a specific probability assignment is defined for each range. To determine the most influential input variables and subsequently reduce the number of required finite element calculations, a sensitivity analysis is carried out. Input data for the finite element model are obtained by combining the upper and lower bounds of the input variables. The probability share of each finite element calculation is determined from the probabilities assigned to the input variables present in these combinations. The horizontal displacement of the top point of the excavation is considered the main response of the system. The result of the reliability analysis for each deep excavation is presented by constructing the Belief and Plausibility distribution functions (i.e. lower and upper bounds) of the system response obtained from the deterministic finite element calculations.
To evaluate the quality of the input variables as well as the applied reliability analysis method, the range of displacements extracted from the models has been compared to the in situ measurements, and good agreement is observed. The comparison also showed that the Random Set Finite Element Method is applicable for estimating the horizontal displacement of the top point of a deep excavation. Finally, the probability of failure or unsatisfactory performance of the system is evaluated by comparing the threshold displacement with the reliability analysis results.
Keywords: deep excavation, random set finite element method, reliability analysis, uncertainty
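The bound-combination logic described above can be sketched in a few lines. The following is a minimal illustration (not the study's code) of how interval-valued focal elements with basic probability assignments combine into Belief and Plausibility bounds on a scalar response; the two input variables, their intervals, and the toy response function are hypothetical stand-ins for the finite element model:

```python
from itertools import product

# Hypothetical focal elements: each input variable has two interval ranges
# (from two information sources), each with a basic probability assignment.
# Values are illustrative, not from the study.
friction_angle = [((28.0, 32.0), 0.5), ((30.0, 36.0), 0.5)]   # degrees
cohesion       = [((5.0, 15.0), 0.6), ((10.0, 20.0), 0.4)]    # kPa

def response(phi, c):
    """Toy stand-in for the finite element model: horizontal displacement
    (mm) decreasing with friction angle and cohesion."""
    return 100.0 / (0.5 * phi + 0.2 * c)

# Combine every pair of focal elements; evaluate the response at the corner
# combinations of each interval box to get its lower/upper response bound.
focal_responses = []
for (phi_iv, m_phi), (c_iv, m_c) in product(friction_angle, cohesion):
    corners = [response(phi, c) for phi in phi_iv for c in c_iv]
    focal_responses.append((min(corners), max(corners), m_phi * m_c))

def belief(threshold):
    """Lower bound on P(response <= threshold): sum the masses of focal
    elements lying entirely below the threshold."""
    return sum(m for lo, hi, m in focal_responses if hi <= threshold)

def plausibility(threshold):
    """Upper bound on P(response <= threshold): sum the masses of focal
    elements that intersect (-inf, threshold]."""
    return sum(m for lo, hi, m in focal_responses if lo <= threshold)
```

Sweeping the threshold over the response range traces out the lower (Belief) and upper (Plausibility) cumulative distribution functions.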
Procedia PDF Downloads 268
7701 The Environmental Impacts of Textiles Reuse and Recycling: A Review on Life-Cycle-Assessment Publications
Authors: Samuele Abagnato, Lucia Rigamonti
Abstract:
Life-Cycle-Assessment (LCA) is an effective tool to quantify the environmental impacts of reuse models and recycling technologies for textiles. In this work, publications from the last ten years about LCA of textile waste are classified according to location, goal and scope, functional unit, waste composition, impact assessment method, impact categories, and sensitivity analysis. Twenty papers have been selected: 50% focus only on recycling, 30% only on reuse, 15% on both, while only one paper considers only the final disposal of the waste. It is found that reuse is generally the best way to decrease the environmental impacts of textile waste management because of the avoided impacts of manufacturing a new item. In the comparison between a product made with recycled yarns and a product from virgin materials, the first option generally has lower impacts, especially for the categories of climate change, water depletion, and land occupation, while for other categories, such as eutrophication or ecotoxicity, under certain conditions the impacts of the recycled fibres can be higher. Cultivation appears to have quite high impacts when natural fibres are involved, especially in the land use and water depletion categories, while manufacturing requires a remarkable amount of electricity, with its associated impact on climate change. In the analysis of the reuse processes, considerable importance attaches to the laundry phase, with water consumption and impacts related to the use of detergents. Regarding the sensitivity analysis, one of the main variables that influences the LCA results, and that needs to be further investigated when modeling LCA systems on this topic, is the substitution rate between recycled and virgin fibres, that is, the amount of recycled material that can be used in place of virgin material. Related to this, the yield of the recycling processes also has a strong influence on the impact results.
The substitution rate is also important in the modeling of the reuse processes because it represents the number of avoided new items bought in place of the reused ones. Another aspect that appears to have a large influence on the impacts is consumer behaviour during the use phase (for example, the number of uses between two laundry cycles). In conclusion, to gain deeper knowledge of the life-cycle impacts of textile waste, further data and research are needed on the modeling of the substitution rate and of the use-phase habits of consumers.
Keywords: environmental impacts, life-cycle-assessment, textiles recycling, textiles reuse, textiles waste management
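The avoided-burden reasoning around the substitution rate and process yield can be sketched numerically. A minimal illustration, with purely hypothetical impact factors rather than values from the reviewed studies:

```python
def net_recycling_impact(recycling_impact, virgin_impact,
                         yield_rate, substitution_rate):
    """Net impact (e.g. kg CO2-eq per kg of collected textile waste) under
    the avoided-burden approach: the recycling burden minus the credit for
    virgin fibre displaced by the recyclate actually produced."""
    recovered = yield_rate                          # kg recyclate per kg input
    credit = recovered * substitution_rate * virgin_impact
    return recycling_impact - credit

# Illustrative values only (not from the reviewed studies):
base = net_recycling_impact(recycling_impact=0.8, virgin_impact=3.0,
                            yield_rate=0.75, substitution_rate=1.0)
low_sub = net_recycling_impact(0.8, 3.0, 0.75, 0.5)   # halved substitution rate
```

A lower substitution rate shrinks the credit, so the net result can flip from a benefit to a burden, which is why the parameter dominates the sensitivity analyses.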
Procedia PDF Downloads 89
7700 Factors Affecting Cesarean Section among Women in Qatar Using Multiple Indicator Cluster Survey Database
Authors: Sahar Elsaleh, Ghada Farhat, Shaikha Al-Derham, Fasih Alam
Abstract:
Background: Cesarean section (CS) delivery is one of the major concerns both in developing and developed countries. The rate of CS deliveries is on the rise globally, and especially in Qatar. Many socio-economic, demographic, clinical and institutional factors play an important role in cesarean sections. This study aims to investigate factors affecting the prevalence of CS among women in Qatar using the UNICEF Multiple Indicator Cluster Survey (MICS) 2012 database. Methods: The study focused on the women's questionnaire of the MICS, which was successfully distributed to 5699 participants. Following the study inclusion and exclusion criteria, a final sample of 761 women aged 19-49 years who had given birth at least once before the survey were included. A number of socio-economic, demographic, clinical and institutional factors, identified through a literature review and available in the data, were considered for the analyses. Bivariate and multivariate logistic regression models, along with multi-level modeling to investigate clustering effects, were undertaken to identify the factors that affect CS prevalence in Qatar. Results: The bivariate analyses showed that a number of categorical factors are statistically significantly associated with the dependent variable (CS). When identifying the factors from a multivariate logistic regression, the study found that only three categorical factors - 'age of women', 'place of delivery' and 'baby weight' - significantly affect CS among women in Qatar. Although the MICS dataset is based on a cluster survey, an exploratory multi-level analysis did not show any clustering effect, i.e. no significant variation in results at the higher level (households), suggesting that all analyses at the lower level (individual respondents) are valid without any significant bias in the results.
Conclusion: The study found a statistically significant association between the dependent variable (CS delivery) and the age of women, frequency of TV watching, assistance at birth, and place of birth. These results need to be interpreted cautiously; however, they can be used as an evidence base for further research on cesarean section delivery in Qatar.
Keywords: cesarean section, factors, multiple indicator cluster survey, MICS database, Qatar
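For a flavor of the bivariate association step such studies rely on, here is a hedged sketch of an odds-ratio calculation with a Wald confidence interval for a binary factor versus CS delivery; the 2x2 counts below are invented for illustration and are not MICS data:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and 95% Wald confidence interval for a 2x2 table:
        exposed:   a cases (CS), b non-cases
        unexposed: c cases (CS), d non-cases
    """
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)   # SE of log odds ratio
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts: CS vs vaginal delivery by a binary factor
or_, lo, hi = odds_ratio_ci(a=60, b=40, c=30, d=70)
```

An interval excluding 1 indicates a statistically significant bivariate association, which would flag the factor for inclusion in the multivariate logistic model.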
Procedia PDF Downloads 116
7699 Flora of Seaweeds and the Preliminary Screening of the Fungal Endophytes
Authors: Nur Farah Ain Zainee, Ahmad Ismail, Nazlina Ibrahim, Asmida Ismail
Abstract:
Seaweeds are economically important given their potential for utilization, the capabilities and opportunities for further expansion, and the availability of other species for future development. Hence, research on the diversity and distribution of seaweeds has to be expanded, as seaweeds are part of Malaysia's valuable marine heritage. The study on the distribution of seaweeds at Pengerang, Johor was carried out between February and November 2015 at Kampung Jawa Darat and Kampung Sungai Buntu. The study sites are located at the south-southeast of Peninsular Malaysia, where the Petronas Refinery and Petrochemicals Integrated Development (RAPID) project is in progress. In the future, the richness of seaweeds in Pengerang will vanish due to the loss of habitat caused by the RAPID project. The research was conducted to study the diversity of seaweeds and to determine the presence of fungal endophytes isolated from the seaweeds. Sampling was carried out using quadrats along a 25-meter line transect with 3 replicates for each site. The specimens were preserved, identified, processed in the laboratory and kept as herbarium specimens in the Algae Herbarium, Universiti Kebangsaan Malaysia. Complete thallus specimens for fungal endophyte screening were chosen meticulously, transferred into sterile zip-lock plastic bags and kept in the freezer for further processing. A total of 29 species has been identified, including 12 species of Chlorophyta, 2 species of Phaeophyta and 14 species of Rhodophyta. From February to November 2015, the number of species varied greatly and there was a significant change in the community structure of the seaweeds. Kampung Sungai Buntu showed the highest diversity throughout the study compared to Kampung Jawa Darat. This can be related to its greater variety of habitats, with rocky and sandy shores as well as a lagoon and a bay; such habitat variation can enhance the existence of the seaweed community.
Eighteen seaweed species were selected and screened for the presence of fungal endophytes; Sargassum polycystum had the highest number of fungal endophytes compared to the other species. This evidence shows that seaweeds are capable of harbouring many species of fungal endophytes, a positive result that warrants further research.
Keywords: diversity, fungal endophyte, macroalgae, screening, seaweed
Procedia PDF Downloads 229
7698 Underwater Image Enhancement and Reconstruction Using CNN and the MultiUNet Model
Authors: Snehal G. Teli, R. J. Shelke
Abstract:
CNN and MultiUNet models form the framework of the proposed method for enhancing and reconstructing underwater images. Multiscale merging of features and regeneration are both performed by the MultiUNet, while the CNN collects relevant features. Extensive tests on benchmark datasets show that the proposed strategy performs better than the latest methods. As a result of this work, underwater images can be represented and interpreted with greater clarity in a number of underwater applications. This strategy will advance underwater exploration and marine research by enhancing real-time underwater image processing systems, underwater robotic vision, and underwater surveillance.
Keywords: convolutional neural network, image enhancement, machine learning, multiunet, underwater images
Procedia PDF Downloads 75
7697 Triple Diffusive Convection in a Vertically Oscillating Oldroyd-B Liquid
Authors: Sameena Tarannum, S. Pranesh
Abstract:
A linear stability analysis of triple diffusive convection in a vertically oscillating viscoelastic liquid of Oldroyd-B type is presented. The corrected Rayleigh number is obtained using a perturbation method, which offers the prospect of controlling the convection. The eigenvalue is obtained by the perturbation method, adopting the Venezian approach. From the study, it is observed that gravity modulation advances the onset of triple diffusive convection.
Keywords: gravity modulation, Oldroyd-B liquid, triple diffusive convection, Venezian approach
Procedia PDF Downloads 176
7696 On Consolidated Predictive Model of the Natural History of Breast Cancer Considering Primary Tumor and Primary Distant Metastases Growth
Authors: Ella Tyuryumina, Alexey Neznanov
Abstract:
Finding algorithms to predict the growth of tumors has piqued the interest of researchers ever since the early days of cancer research. A number of studies were carried out as an attempt to obtain reliable data on the natural history of breast cancer growth. Mathematical modeling can play a very important role in the prognosis of the tumor process of breast cancer. However, existing mathematical models describe primary tumor growth and metastases growth separately. Consequently, we propose a mathematical growth model for the primary tumor and primary metastases which may help to improve the prediction accuracy of breast cancer progression, using an original mathematical model referred to as CoM-IV and corresponding software. We are interested in: 1) modelling the whole natural history of the primary tumor and primary metastases; 2) developing an adequate and precise CoM-IV which reflects the relations between PT and MTS; 3) analyzing the scope of application of CoM-IV; 4) implementing the model as a software tool. The CoM-IV is based on an exponential tumor growth model, consists of a system of deterministic nonlinear and linear equations, and corresponds to the TNM classification. It allows the calculation of different growth periods of the primary tumor and primary metastases: 1) the 'non-visible period' for the primary tumor; 2) the 'non-visible period' for primary metastases; 3) the 'visible period' for primary metastases. The new predictive tool: 1) is a solid foundation for future studies of breast cancer models; 2) does not require any expensive diagnostic tests; 3) is the first predictor which makes a forecast using only current patient data, whereas the others are based on additional statistical data.
Thus, the CoM-IV model and predictive software: a) detect different growth periods of the primary tumor and primary metastases; b) forecast the period of primary metastases appearance; c) have higher average prediction accuracy than the other tools; d) can improve forecasts of breast cancer survival and facilitate optimization of diagnostic tests. The following are calculated by CoM-IV: the number of doublings for the 'non-visible' and 'visible' growth periods of primary metastases, and the tumor volume doubling time (days) for the 'non-visible' and 'visible' growth periods of primary metastases. The CoM-IV enables, for the first time, prediction of the whole natural history of primary tumor and primary metastases growth at each stage (pT1, pT2, pT3, pT4) relying only on primary tumor sizes. In summary: a) CoM-IV correctly describes primary tumor and primary distant metastases growth of stage IV (T1-4N0-3M1) disease with (N1-3) or without regional metastases in lymph nodes (N0); b) it facilitates the understanding of the appearance period and manifestation of primary metastases.
Keywords: breast cancer, exponential growth model, mathematical modelling, primary metastases, primary tumor, survival
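The doubling-count and doubling-time bookkeeping underlying exponential tumor growth models can be sketched as follows; this is generic exponential-growth arithmetic, not the actual CoM-IV equations, and the volumes are illustrative:

```python
import math

def doublings(v_start_mm3, v_end_mm3):
    """Number of volume doublings between two tumor volumes."""
    return math.log2(v_end_mm3 / v_start_mm3)

def doubling_time_days(v_start_mm3, v_end_mm3, elapsed_days):
    """Volume doubling time assuming exponential growth V(t) = V0 * 2**(t/DT)."""
    return elapsed_days / doublings(v_start_mm3, v_end_mm3)

def sphere_volume(diameter_mm):
    """Volume of a spherical tumor from its measured diameter."""
    return math.pi / 6.0 * diameter_mm ** 3

# Illustrative: growth from roughly a single cell (~1e-6 mm^3)
# to a 10 mm nodule takes about 29 doublings.
n = doublings(1e-6, sphere_volume(10.0))
```

The 'non-visible period' in such models corresponds to the doublings accumulated before the tumor reaches the detection threshold diameter.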
Procedia PDF Downloads 335
7695 Controlling Shape and Position of Silicon Micro-Nanorolls Fabricated Using Fine Bubbles during Anodization
Authors: Yodai Ashikubo, Toshiaki Suzuki, Satoshi Kouya, Mitsuya Motohashi
Abstract:
Functional microstructures such as wires, fins, needles, and rolls are currently being applied to a variety of high-performance devices. Under these conditions, a roll structure (silicon micro-nanoroll) was formed on the surface of a silicon substrate via fine bubbles during anodization using extremely diluted hydrofluoric acid (HF + H₂O). The as-formed roll had a microscale length and a width of approximately 1 µm. The number of roll windings was 3-10, and the thickness of the film forming the rolls was about 10 nm. Thus, it is promising for applications as a distinct device material. These rolls function as capsules and/or pipelines. To date, the number of rolls and the roll length have been controlled by the anodization conditions. In general, controlling the position and winding state of the rolls is required for device applications; however, this has not been discussed. Grooves formed on the silicon surface before anodization might be useful to control the bubbles. In this study, we investigated the effect of the grooves on the position and shape of the rolls. The surfaces of the silicon wafers were anodized. The starting material was p-type (100) single-crystalline silicon wafers. The resistivity of the wafers was 5-20 Ω·cm. Grooves were formed on the surface of the substrate before anodization using sandpaper and a diamond pen. The average width and depth of the grooves were approximately 1 µm and 0.1 µm, respectively. The HF concentration {HF/(HF + C₂H₅OH + H₂O)} was 0.001 % by volume. The C₂H₅OH concentration {C₂H₅OH/(HF + C₂H₅OH + H₂O)} was 70 %. A vertical single-tank cell and a Pt cathode were used for anodization. The silicon rolls were observed by field-emission scanning electron microscopy (FE-SEM; JSM-7100, JEOL). The atomic bonding state of the rolls was evaluated using X-ray photoelectron spectroscopy (XPS; ESCA-3400, Shimadzu). For a straight groove, the rolls were formed along the groove. This indicates that the orientation of the rolls can be controlled by the grooves.
For a lattice-like groove, the rolls formed inside the lattice and along the long sides. In other words, the aspect ratio of the lattice is very important for roll formation. In addition, many rolls were formed and the winding states were not uniform when the lattice size was too large. On the other hand, no rolls were formed for a small lattice. These results indicate that there is an optimal lattice size for roll formation. In the future, we plan to form rolls using grooves made by lithography techniques instead of sandpaper and the pen. Furthermore, rolls containing nanoparticles will be formed for nanodevices.
Keywords: silicon roll, anodization, fine bubble, microstructure
Procedia PDF Downloads 18
7694 Surge in U.S. Citizens' Expatriation: Testing Structural Equation Modeling to Explain the Underlying Policy Rationale
Authors: Marco Sewald
Abstract:
Compared with the past, the number of Americans renouncing U.S. citizenship has risen. Even though these numbers are small compared to the number of immigrants, U.S. citizen expatriations have historically been much lower, making the uptick worrisome. In addition, the published lists and numbers from the U.S. government seem incomplete, with many renunciants not counted. Different branches of the U.S. government report different numbers, and no one seems to know exactly how big the real number is, even though the IRS and the FBI both track and/or publish numbers of Americans who renounce. Since there is no single explanation, anecdotal evidence suggests this uptick is caused by global tax law and increased compliance burdens imposed by U.S. lawmakers on U.S. citizens abroad. Within a research project, the question arose of why a constantly growing number of U.S. citizens are expatriating; the answers are believed to help explain the underlying governmental policy rationale leading to such activities. While it is impossible to locate former U.S. citizens to conduct a survey on the reasons, and the U.S. government does not comment on the reasons given within the process of expatriation, the chosen methodology is Structural Equation Modeling (SEM), in the first step re-using surveys conducted by different researchers among U.S. citizens residing abroad during the last years, surveys questioning the personal situation in the context of tax, compliance, citizenship and likelihood to repatriate to the U.S. In general, SEM allows: (1) representing, estimating and validating a theoretical model with linear (unidirectional or not) relationships; (2) modeling causal relationships between multiple predictors (exogenous) and multiple dependent variables (endogenous); (3) including unobservable latent variables; (4) modeling measurement error: the degree to which observable variables describe latent variables.
Moreover, SEM is appealing since the results can be represented either by matrix equations or graphically. Results: the observed variables (items) of the construct are caused by various latent variables. The given surveys delivered a high correlation, and it is therefore impossible to identify the distinct effect of each indicator on the latent variable, which was one desired result. Since every SEM comprises two parts, (1) a measurement model (outer model) and (2) a structural model (inner model), it seems necessary to extend the given data by conducting additional research and surveys to validate the outer model and gain the desired results.
Keywords: expatriation of U.S. citizens, SEM, structural equation modeling, validating
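The reflective measurement (outer) model logic can be illustrated with a small simulation: two observed indicators load on one latent variable, so they correlate only through that shared factor. This is a generic sketch, not the study's model; the construct interpretation and all parameter values are hypothetical:

```python
import random
import statistics

random.seed(7)

def simulate_indicators(n, loading, noise_sd):
    """Reflective measurement model: each observed indicator equals
    loading * latent + measurement error (x = lambda * xi + delta)."""
    latent = [random.gauss(0, 1) for _ in range(n)]   # e.g. 'compliance burden'
    x1 = [loading * t + random.gauss(0, noise_sd) for t in latent]
    x2 = [loading * t + random.gauss(0, noise_sd) for t in latent]
    return x1, x2

def corr(a, b):
    """Pearson correlation of two equal-length samples."""
    ma, mb = statistics.fmean(a), statistics.fmean(b)
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    return cov / (statistics.pstdev(a) * statistics.pstdev(b) * len(a))

x1, x2 = simulate_indicators(n=2000, loading=0.9, noise_sd=0.4)
r = corr(x1, x2)   # indicators correlate only through the shared latent factor
```

When indicator correlations are very high, as reported above, the indicators carry nearly identical information and their distinct loadings cannot be separated reliably.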
Procedia PDF Downloads 221
7693 Second Time’s a Charm: The Intervention of the European Patent Office on the Strategic Use of Divisional Applications
Authors: Alissa Lefebre
Abstract:
It might seem intuitive to hope for a fast decision on a patent grant. After all, a granted patent provides you with a monopoly position, which allows you to obstruct others from using your technology. However, this does not take into account the strategic advantages one can obtain from keeping patent applications pending. First, you have the financial advantage of postponing certain fees, although many applicants would probably agree that this is not the main benefit. As the scope of the patent protection is only decided upon at the grant, the pendency period introduces uncertainty amongst rivals. This uncertainty entails not knowing whether the patent will actually be granted and what the scope of protection will be. Consequently, rivals can only depend upon limited and uncertain information when deciding what technology is worth pursuing. One way to keep patent applications pending is the use of divisional applications. These applications can be filed from a parent application as long as that parent application is still pending. This allows the applicant to pursue (part of) the content of the parent application in another application, as the divisional application cannot exceed the scope of the parent application. In a fast-moving and complex market such as tele- and digital communications, it might allow applicants to obtain an actual monopoly position, as competitors are discouraged from pursuing a certain technology. Nevertheless, this practice also has downsides. First, it affects the workload of the examiners at the patent office. As the number of patent filings has been increasing over the last decades, strategies that increase this number even more are not desirable from the patent examiner's point of view. Secondly, a pending patent does not provide you with the protection of a granted patent, thus creating uncertainty not only for rivals but also for the applicant.
Consequently, the European Patent Office (EPO) has come up with a “raising the bar” initiative in which it has decided to tackle the strategic use of divisional applications. Over the past years, two rules have been implemented. The first rule, in 2010, introduced a time limit under which divisional applications could only be filed within 24 months after the first communication with the patent office. However, after carrying out a user feedback survey, the EPO abolished the rule again in 2014 and replaced it with a fee mechanism. The fee mechanism is still in place today, which might be an indication of a better result compared to the first rule change. This study tests the impact of these rules on the strategic use of divisional applications in the tele- and digital communication industry and provides empirical evidence on their success. Using three different survival models, we find overall evidence that divisional applications prolong the pendency time and that only the second rule is able to tackle the strategic patenting and thus decrease the pendency time.
Keywords: divisional applications, regulatory changes, strategic patenting, EPO
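The survival-analysis logic applied to pendency times can be sketched with a minimal Kaplan-Meier estimator, where the event is the grant and still-pending applications are right-censored; the durations below are invented for illustration, not from the studied EPO data:

```python
def kaplan_meier(durations, events):
    """Kaplan-Meier survivor function S(t): the probability an application
    is still pending beyond t months. events[i] is True if the application
    was granted (event observed), False if still pending (censored)."""
    times = sorted({t for t, e in zip(durations, events) if e})
    s, curve = 1.0, []
    for t in times:
        at_risk = sum(1 for d in durations if d >= t)
        granted = sum(1 for d, e in zip(durations, events) if e and d == t)
        s *= 1.0 - granted / at_risk
        curve.append((t, s))
    return curve

# Hypothetical pendency times in months (False = still pending at cutoff)
durations = [12, 18, 18, 24, 30, 36, 40]
events    = [True, True, False, True, True, False, True]
curve = kaplan_meier(durations, events)
```

Comparing such curves before and after each rule change (or fitting a Cox-type model with rule-period covariates) is how the effect of the interventions on pendency can be tested.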
Procedia PDF Downloads 128
7692 A New Scheme for Chain Code Normalization in Arabic and Farsi Scripts
Authors: Reza Shakoori
Abstract:
This paper presents a structural correction of Arabic and Persian strokes using manipulation of their chain codes in order to improve the rate and performance of Persian and Arabic handwritten word recognition systems. It collects pure and effective features to represent a character with one consolidated feature vector and reduces variations in order to decrease the number of training samples and increase the chance of successful classification. Our results also show how the proposed approaches can simplify classification, and consequently recognition, by reducing variations and possible noise in the chain code while keeping the orientation of characters and their backbone structures.
Keywords: Arabic, chain code normalization, OCR systems, image processing
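Two standard chain-code normalizations often used in this setting can be sketched as follows; this is a generic illustration of start-point and rotation normalization for 8-connected Freeman codes, not the paper's exact scheme:

```python
def first_difference(code, directions=8):
    """Rotation-invariant form of a closed Freeman chain code:
    differences of consecutive directions modulo the direction count."""
    n = len(code)
    return [(code[(i + 1) % n] - code[i]) % directions for i in range(n)]

def start_normalize(code):
    """Start-point-invariant form: the lexicographically smallest
    cyclic rotation of the code."""
    n = len(code)
    return min(tuple(code[i:] + code[:i]) for i in range(n))

# An 8-connected chain code for a small closed stroke (a 2x2 square);
# rotating the stroke by 90 degrees adds 2 to every direction mod 8.
square = [0, 0, 6, 6, 4, 4, 2, 2]
rotated_square = [(c + 2) % 8 for c in square]
```

Both normalizations collapse variants of the same stroke onto one representative, which is exactly the variation reduction that shrinks the required training set.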
Procedia PDF Downloads 404
7691 Half-Circle Fuzzy Number Threshold Determination via Swarm Intelligence Method
Authors: P. W. Tsai, J. W. Chen, C. W. Chen, C. Y. Chen
Abstract:
In recent years, many researchers have been involved in the field of fuzzy theory. However, there are still a lot of issues to be resolved, especially on topics related to controller design, such as the fields of robotics, artificial intelligence, and nonlinear systems. Besides fuzzy theory, swarm intelligence algorithms are also a popular field for researchers. In this paper, a concept of utilizing one of the swarm intelligence methods, called Bacterial-GA Foraging, to find the stabilized common P matrix for a fuzzy controller system is proposed. An example is given in the paper as well.
Keywords: half-circle fuzzy numbers, predictions, swarm intelligence, Lyapunov method
Procedia PDF Downloads 685
7690 Nullity of t-Tupple Graphs
Authors: Khidir R. Sharaf, Didar A. Ali
Abstract:
The nullity η(G) of a graph is the multiplicity of zero as an eigenvalue in its spectrum. A zero-sum weighting of a graph G is a real-valued function, say f, from the vertices of G to the set of real numbers, such that for each vertex v of G the sum of the weights f(w) over all neighbors w of v is zero. A high zero-sum weighting of G is one that uses the maximum number of non-zero independent variables. If G is a graph with an end vertex, and if H is an induced sub-graph of G obtained by deleting this vertex together with the vertex adjacent to it, then η(G) = η(H). In this paper, the high zero-sum weighting technique and the end-vertex procedure are applied to evaluate the nullity of t-tupple and generalized t-tupple graphs, which is derived and determined for some special types of graphs. Also, we introduce and prove some important results about the t-tupple coalescence, Cartesian and Kronecker products of nut graphs.
Keywords: graph theory, graph spectra, nullity of graphs, statistic
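Since the nullity equals the order of the graph minus the rank of its adjacency matrix, it can be computed exactly with rational Gaussian elimination. A minimal sketch (not from the paper), checked on two small graphs:

```python
from fractions import Fraction

def nullity(adj):
    """Nullity of a graph = order minus rank of its adjacency matrix
    (the multiplicity of eigenvalue 0), via exact Gaussian elimination."""
    m = [[Fraction(x) for x in row] for row in adj]
    n, rank, row = len(m), 0, 0
    for col in range(n):
        pivot = next((r for r in range(row, n) if m[r][col] != 0), None)
        if pivot is None:
            continue
        m[row], m[pivot] = m[pivot], m[row]
        for r in range(n):
            if r != row and m[r][col] != 0:
                f = m[r][col] / m[row][col]
                m[r] = [a - f * b for a, b in zip(m[r], m[row])]
        row += 1
        rank += 1
    return n - rank

# Path P3 has eigenvalues -sqrt(2), 0, sqrt(2), hence nullity 1;
# the complete graph K3 has nullity 0.
p3 = [[0, 1, 0], [1, 0, 1], [0, 1, 0]]
k3 = [[0, 1, 1], [1, 0, 1], [1, 1, 0]]
```

The kernel vectors found this way are exactly the high zero-sum weightings: each free variable in the eliminated system is one independent non-zero weight.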
Procedia PDF Downloads 239
7689 Adaptive Nonparametric Approach for Guaranteed Real-Time Detection of Targeted Signals in Multichannel Monitoring Systems
Authors: Andrey V. Timofeev
Abstract:
An adaptive nonparametric method is proposed for stable real-time detection of seismoacoustic sources in multichannel C-OTDR systems with a significant number of channels. This method guarantees given upper boundaries for the probabilities of Type I and Type II errors. Properties of the proposed method are rigorously proved. The results of practical applications of the proposed method in a real C-OTDR system are presented in this report.
Keywords: guaranteed detection, multichannel monitoring systems, change point, interval estimation, adaptive detection
Procedia PDF Downloads 447
7688 Cooperative CDD Scheme Based on Adaptive Modulation in Wireless Communication System
Authors: Seung-Jun Yu, Hwan-Jun Choi, Hyoung-Kyu Song
Abstract:
Among spatial diversity schemes, orthogonal space-time block codes (OSTBC) and cyclic delay diversity (CDD) have been widely studied for cooperative wireless relaying systems. However, conventional OSTBC and CDD cannot cope with a change in the number of relays owing to low throughput or poor error performance. In this paper, we propose a cooperative cyclic delay diversity (CDD) scheme that uses hierarchical modulation at the source and adaptive modulation based on a cyclic redundancy check (CRC) code at the relays.
Keywords: adaptive modulation, cooperative communication, CDD, OSTBC
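The core CDD idea, each relay transmitting a cyclically delayed copy of the same block so the scheme scales with the number of relays without redesigning the code, can be sketched as follows; the block contents and the per-relay delay step are illustrative, not the paper's parameters:

```python
def cyclic_shift(block, delay):
    """Cyclically delay a time-domain block by 'delay' samples."""
    d = delay % len(block)
    if d == 0:
        return list(block)
    return block[-d:] + block[:-d]

def cdd_transmit(block, num_relays, delay_step=1):
    """Each relay sends the same block with its own cyclic delay; adding
    or removing a relay only adds or removes one shifted copy."""
    return [cyclic_shift(block, k * delay_step) for k in range(num_relays)]

block = [0, 1, 2, 3, 4, 5, 6, 7]   # one OFDM time-domain block (illustrative)
tx = cdd_transmit(block, num_relays=3)
```

At the receiver the superposition of the shifted copies appears as a single, more frequency-selective channel, which is how CDD converts spatial diversity into frequency diversity.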
Procedia PDF Downloads 431
7687 Choice Analysis of Ground Access to São Paulo/Guarulhos International Airport Using Adaptive Choice-Based Conjoint Analysis (ACBC)
Authors: Carolina Silva Ansélmo
Abstract:
Airports are demand-generating poles that affect the flow of traffic around them. The airport access system must be fast, convenient, and adequately planned, considering its potential users. An airport with good ground access conditions can provide the user with a more satisfactory access experience. When several transport options are available, service providers must understand users' preferences and the expected quality of service. The present study focuses on airport access in a comparative scenario between bus, private vehicle, subway, taxi and urban mobility transport applications to São Paulo/Guarulhos International Airport. The objectives are (i) to identify the factors that influence the choice, (ii) to measure Willingness to Pay (WTP), and (iii) to estimate the market share of each mode. The method applied was the Adaptive Choice-Based Conjoint Analysis (ACBC) technique, using Sawtooth Software. Conjoint analysis, rooted in Utility Theory, is a survey technique that quantifies the customer's perceived utility when choosing among alternatives. Assessing user preferences provides insights into their priorities for product or service attributes. An additional advantage of conjoint analysis is its requirement for a smaller sample size compared to other methods. Furthermore, ACBC provides valuable insights into consumers' preferences, willingness to pay, and market dynamics, aiding strategic decision-making on customer experience, pricing, and market segmentation. In the present research, the ACBC questionnaire had the following variables: (i) access time to the boarding point, (ii) comfort in the vehicle, (iii) number of travelers together, (iv) price, (v) supply power, and (vi) type of vehicle. The case study questionnaire reached 213 valid responses considering the scenario of access from the São Paulo city center to São Paulo/Guarulhos International Airport.
As a result, price and the number of travelers are the most relevant attributes for the sample when choosing airport access. The estimated market share is led by urban mobility transport applications, followed by buses, private vehicles, taxis and subways.
Keywords: adaptive choice-based conjoint analysis, ground access to airport, market share, willingness to pay
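Conjoint simulators commonly turn estimated utilities into shares of preference with a logit rule. A hedged sketch of that final step, with invented total utilities rather than the study's estimates:

```python
import math

def logit_shares(utilities):
    """Share of preference for each alternative: exp(U_i) / sum_j exp(U_j)."""
    exps = {alt: math.exp(u) for alt, u in utilities.items()}
    total = sum(exps.values())
    return {alt: e / total for alt, e in exps.items()}

# Illustrative total utilities per mode (not the study's estimates)
utilities = {"ride-hailing app": 1.2, "bus": 0.9, "private car": 0.4,
             "taxi": 0.1, "subway": -0.2}
shares = logit_shares(utilities)
```

The shares sum to one by construction; the ordering of the utilities, not their absolute scale, drives the ranking of modes.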
Procedia PDF Downloads 78
7686 Nano-Pesticides: Recent Emerging Tool for Sustainable Agricultural Practices
Authors: Ekta, G. K. Darbha
Abstract:
Nanotechnology offers the potential of simultaneously increasing the efficiency of pesticides compared to their bulk counterparts and reducing their harmful environmental impacts in agriculture. The term nanopesticide covers different pesticides that combine several surfactants, polymers, metal ions, etc., of nanometer size, ranging from 1 to 1000 nm, and that exhibit the unusual behavior (high efficacy and high specific surface area) of nanomaterials. Commercial pesticide formulations used by farmers nowadays cannot be used effectively due to a number of associated problems. For example, more than 90% of applied formulations are either lost in the environment or unable to reach the target area required for effective pest control. Around 20-30% of pesticides are lost through emissions. A number of factors (application methods, physicochemical properties of the formulations, and environmental conditions) can influence the extent of loss during application. It is known that among various formulations, polymer-based formulations show the greatest potential due to their greater efficacy, slow release and protection against premature degradation of the active ingredient compared to other commercial formulations. However, nanoformulations can have a significant effect on the fate of the active ingredient and may release new species by reacting with existing soil contaminants. The environmental fate of these newly generated species is still not well explored, which is essential for field-scale experiments; hence a lot remains to be explored regarding the environmental fate, nanotoxicology, transport properties and stability of such formulations. In our preliminary work, we have synthesized a polymer-based nanoformulation of the commercially used weedicide atrazine. Atrazine belongs to the triazine class of herbicides, which is used for the effective control of seed-germinated dicot weeds and grasses.
It functions by binding to the plastoquinone-binding protein in photosystem II (PS-II); plant death results from starvation and oxidative damage caused by the breakdown of the electron transport system. The stability of the nanoformulation suspension containing the herbicide has been evaluated using parameters such as polydispersity index, particle diameter, and zeta potential under environmentally relevant conditions (pH 4-10, temperatures from 25°C to 65°C), and the stability of encapsulation has also been studied for different amounts of added polymer. Morphological characterization has been carried out using SEM.
Keywords: atrazine, nanoformulation, nanopesticide, nanotoxicology
Procedia PDF Downloads 256
7685 Optimization of Dez Dam Reservoir Operation Using Genetic Algorithm
Authors: Alireza Nikbakht Shahbazi, Emadeddin Shirali
Abstract:
Since optimization problems in water resources are complicated by the variety of decision-making criteria and objective functions, it is sometimes impossible to solve them through conventional optimization methods, or doing so is time-consuming and costly. Therefore, the use of modern tools and methods is inevitable in solving such problems. An accurate utilization policy has to be determined in order to use natural resources such as water reservoirs optimally. Water reservoir programming studies aim to determine the final cultivated land area based on predefined agricultural models and water requirements; the dam utilization rule curve is also provided in such studies. The basic information applied in water reservoir programming studies generally includes meteorological, hydrological, agricultural and water-reservoir-related data, and the geometric characteristics of the reservoir. The Dez dam water resource system was simulated using this basic information in order to determine the capability of its reservoir to meet the objectives of the plan. As a metaheuristic method, a genetic algorithm was applied to derive utilization rule curves (intersecting the reservoir volume), and MATLAB software was used to solve the model. Rule curves were first obtained through the genetic algorithm; then the value of using rule curves, and of the resulting decrease in decision variables, was determined through system simulation and by comparing the results with optimization results (Standard Operating Procedure). One of the most essential issues in optimizing a complicated water resource system is the growing number of variables, which requires a long time to find an optimal answer and in some cases yields no desirable result. In this research, intersecting the reservoir volume has been applied as a modern technique to reduce the number of variables.
Water reservoir programming studies have been performed based on basic information, general hypotheses and standards, applying a monthly simulation technique over a statistical period of 30 years. Results indicated that applying the rule curve prevents extreme shortages and decreases monthly shortages.
Keywords: optimization, rule curve, genetic algorithm method, Dez dam reservoir
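The rule-curve search described above can be sketched with a minimal genetic algorithm. The toy reservoir (capacity, inflow and demand series, shortage-based fitness) and the GA settings below are illustrative assumptions, not the authors' Dez dam model:

```python
import random

random.seed(42)

# Toy monthly inflows and demands (Mm^3); illustrative values only.
INFLOW = [90, 80, 70, 50, 30, 20, 15, 20, 40, 60, 80, 95]
DEMAND = [40, 40, 50, 60, 70, 80, 85, 80, 60, 50, 40, 40]
CAPACITY = 300.0

def simulate(rule):
    """Simulate one year; rule[m] is the target storage fraction for month m.
    Releases try to meet demand while keeping storage above the rule curve."""
    storage, shortage = CAPACITY / 2, 0.0
    for m in range(12):
        storage = min(storage + INFLOW[m], CAPACITY)
        floor = rule[m] * CAPACITY
        release = min(DEMAND[m], max(storage - floor, 0.0))
        shortage += DEMAND[m] - release
        storage -= release
    return shortage  # lower is better

def evolve(pop_size=30, generations=60):
    pop = [[random.random() * 0.5 for _ in range(12)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=simulate)               # best (lowest shortage) first
        survivors = pop[:pop_size // 2]
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = random.sample(survivors, 2)
            cut = random.randrange(1, 12)
            child = a[:cut] + b[cut:]        # one-point crossover
            i = random.randrange(12)         # point mutation, clipped to [0, 0.5]
            child[i] = min(max(child[i] + random.uniform(-0.1, 0.1), 0.0), 0.5)
            children.append(child)
        pop = survivors + children
    return min(pop, key=simulate)

best = evolve()
```

In the paper's setting, the chromosome would instead encode the rule curve intersecting the reservoir volume, and the fitness would come from the monthly simulation over the 30-year statistical period.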
Procedia PDF Downloads 265
7684 A Method of Improving Output Using a Feedback Supply Chain System: Case Study Bramlima
Authors: Samuel Atongaba Danji, Veseke Moleke
Abstract:
The increase of globalization is an important part of today's changing environment, and because of it manufacturing industries must continually improve their manufacturing methods in order to stay competitive; failing to do so may leave them out of the market as customer requirements constantly change. What is needed is an advanced supply chain system that removes the obstacles that can prevent a company from being competitive. In this work, we developed a feedback-control supply chain system that streamlines the entire process in order to improve competitiveness, and the results show that when it is applied in a different geographical area, the output varies.
Keywords: globalization, supply chain, improvement, manufacturing
Procedia PDF Downloads 330
7683 Global Analysis of HIV Virus Models with Cell-to-Cell
Authors: Hossein Pourbashash
Abstract:
Recent experimental studies have shown that HIV can be transmitted directly from cell to cell when structures called virological synapses form during interactions between T cells. In this article, we describe a new within-host model of HIV infection that incorporates two mechanisms: infection by free virions and direct cell-to-cell transmission. We conduct the local and global stability analysis of the model. We show that if the basic reproduction number R0 ≤ 1, the virus is cleared and the disease dies out; if R0 > 1, the virus persists in the host. We also prove that the unique positive equilibrium attracts all positive solutions under additional assumptions on the parameters.
Keywords: HIV virus model, cell-to-cell transmission, global stability, Lyapunov function, second compound matrices
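The threshold behavior stated above (clearance for R0 ≤ 1, persistence for R0 > 1) can be illustrated numerically. The system below is a standard within-host model with both free-virion (beta1) and cell-to-cell (beta2) infection terms; the equations follow the common formulation of such models, and all parameter values are illustrative assumptions, not taken from the article:

```python
# T: uninfected target cells, I: infected cells, V: free virions.
# dT/dt = LAM - D*T - beta1*T*V - beta2*T*I
# dI/dt = beta1*T*V + beta2*T*I - DELTA*I
# dV/dt = P*I - C*V
LAM, D, DELTA, P, C = 10.0, 0.01, 0.7, 100.0, 13.0  # assumed parameters

def basic_reproduction_number(beta1, beta2):
    T0 = LAM / D  # infection-free target-cell level
    return (beta1 * P / C + beta2) * T0 / DELTA

def simulate(beta1, beta2, days=2000.0, dt=0.01):
    """Forward-Euler integration from a small initial inoculum."""
    T, I, V = LAM / D, 0.0, 1e-3
    for _ in range(int(days / dt)):
        dT = LAM - D * T - beta1 * T * V - beta2 * T * I
        dI = beta1 * T * V + beta2 * T * I - DELTA * I
        dV = P * I - C * V
        T, I, V = T + dt * dT, I + dt * dI, V + dt * dV
    return I  # infected-cell level at the end of the run
```

With these assumed parameters, beta1 = 1e-5 (beta2 = 0) gives R0 ≈ 0.11 and the infected-cell level decays to zero, while beta1 = 2e-4 gives R0 ≈ 2.2 and the infection settles at a positive equilibrium, matching the stated threshold result.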
Procedia PDF Downloads 517
7682 The Flora of Bozdağ, Sizma–Konya, Turkey and Its Environs
Authors: Esra Ipekci, Murad Aydin Sanda
Abstract:
The flora of Bozdağ (Konya) and its surroundings was investigated between 2003 and 2005; 700 herbarium specimens belonging to 482 taxa, 257 genera and 62 families were collected and identified from the area. The families with the most taxa in the research area are Asteraceae 67 (14.0%), Fabaceae 60 (12.6%), Lamiaceae 57 (11.9%), Brassicaceae 34 (7.1%), Poaceae 30 (6.3%), Rosaceae 24 (5.0%), Caryophyllaceae 23 (4.8%), Liliaceae 19 (4.0%), Boraginaceae 17 (3.6%) and Apiaceae 13 (2.7%). The research area is in the district of Konya and falls in the B4 square according to the Grid System. The phytogeographic elements are represented in the study area as follows: Mediterranean 72 (14.9%), Irano-Turanian 91 (18.9%), and Euro-Siberian 21 (4.3%); the phytogeographic regions of the remaining 273 (56.6%) taxa are either multiregional or unknown. The number of endemic taxa is 79 (16.3%).
Keywords: Sizma, Bozdağ, flora, Konya, Turkey
Procedia PDF Downloads 555
7681 Using a Phenomenological Approach to Explore the Experiences of Nursing Students in Coping with Their Emotional Responses in Caring for End-Of-Life Patients
Authors: Yun Chan Lee
Abstract:
Background: End-of-life care is a significant part of nursing practice, and student nurses are likely to meet dying patients in many placement areas. It is therefore important to understand the emotional responses and coping strategies of student nurses so that nursing education systems can appreciate how nursing students might be supported in the future. Methodology: This research used a qualitative phenomenological approach. Six student nurses undertaking a degree-level adult nursing course were interviewed, and their responses were analyzed using interpretative phenomenological analysis. Findings: The findings identified three main themes. The first was the common experience of 'unpreparedness': while a very small number of participants felt that this was unavoidable and that 'no preparation is possible', the majority felt that they were unprepared because of 'insufficient input' from the university and as a result of wider 'social taboos' around death and dying. The second theme showed that emotions were affected by 'the personal connection to the patient', with the important sub-themes of 'the evoking of memories', 'involvement in care' and 'sense of responsibility'. The third theme, the coping strategies used by students, fell into two broad areas: those 'internal' to the student and those 'external'. The internal coping strategies comprised 'detachment', 'faith', 'rationalization' and 'reflective skills'; among the external coping strategies, 'clinical staff' and 'family and friends' were the main forms of support accessed. Implications: It is clear that student nurses are affected emotionally by caring for dying patients, and many of them feel apprehension even before they begin their placements, though very often this is unspoken. Those anxieties become more pronounced during the placements and continue after them.
This has implications for when support is offered and possibly for its duration. Another significant point of the study is that participants often highlighted their wish to speak to qualified nurses after their experiences of being involved in end-of-life care, especially when they had been present at the time of death. However, many of the students reported that qualified nurses were not available to them, for a number of reasons; as a result, students had to turn to family members and friends to talk to. Consequently, the implication of this study is the need not only to educate student nurses but also to educate qualified mentors on the importance of providing emotional support to students.
Keywords: nursing students, coping strategies, end-of-life care, emotional responses
Procedia PDF Downloads 162
7680 Factors Influencing University Students' Acceptance of New Technology
Authors: Fatma Khadra
Abstract:
The objective of this research is to examine the acceptance of new technology in a sample of 150 participants from Qatar University. Based on the Technology Acceptance Model (TAM), we used Davis's (1989) instrument, which contains two item scales, one for Perceived Usefulness and one for Perceived Ease of Use. The TAM represents an important theoretical contribution toward understanding how users come to accept and use technology; it suggests that when people are presented with a new technology, a number of variables influence their decision about how and when they will use it. The results showed that participants were more accepting of technology they perceived as flexible, clear, experience-enhancing, enjoyable, easy, and useful. Results also showed that younger participants were more accepting of new technology than older ones.
Keywords: new technology, perceived usefulness, perceived ease of use, technology acceptance model
Procedia PDF Downloads 321
7679 A Comparative Study of Three Major Performance Testing Tools
Authors: Abdulaziz Omar Alsadhan, Mohd Mudasir Shafi
Abstract:
Performance testing is done to establish the reliability of a software product, and a number of tools are available on the market for this purpose. In this paper we present a comparative study of the three most commonly used performance testing tools, which cover the major share of the performance testing market and are widely used. We compared the tools on five evaluation parameters: user friendliness, portability, tool support, compatibility, and cost. The conclusion provided at the end of the paper is based on our study and does not endorse any tool or company.
Keywords: software development, software testing, quality assurance, performance testing, load runner, rational testing, silk performer
Procedia PDF Downloads 608
7678 Evaluating Radiation Dose for Interventional Radiologists Performing Spine Procedures
Authors: Kholood A. Baron
Abstract:
While the number of radiologists specialized in spine interventional procedures in Kuwait is limited, the number of patients demanding these procedures is increasing rapidly. Due to this high demand, the workload of radiologists is increasing, which may represent a radiation exposure concern, as during these procedures the doctor's hands are in very close proximity to, if not within, the main radiation beam. The aim of this study is to measure the radiation dose received by radiologists during several interventional spine procedures. Methods: Two doctors carrying different workloads were included: DR1 performed procedures in the morning and afternoon shifts, while DR2 performed procedures in the morning shift only. Comparing the exposure each doctor's hand received allows radiation safety to be assessed and helps to set up workload regulations for radiologists carrying a heavy schedule of such procedures. Entrance skin dose (ESD) was measured via a thermoluminescent dosimeter (TLD) placed at the right wrist of each radiologist. DR1 covered the morning shift in one hospital (Mubarak Al-Kabeer Hospital) and the afternoon shift in another (Dar Alshifa Hospital); the TLD chip was placed in his gloves during both shifts for a whole week. Since DR2 covered the morning shift only, in Al Razi Hospital, he wore the TLD during the morning shift for a week. It is worth mentioning that DR1 performed 4-5 spine procedures per day in the morning and the same number in the afternoon, while DR2 performed 5-7 procedures per day. This procedure was repeated for 4 consecutive weeks in order to calculate the ESD a hand receives in a month. Results: In general, the radiation dose the hand received in a week ranged from 0.12 to 1.12 mSv.
The ESD values for DR1 over the four consecutive weeks were 1.12, 0.32, 0.83 and 0.22 mSv; for a month (4 weeks) this equals 2.49 mSv, calculated to be 27.39 mSv per year (11 months, since each radiologist has 45 days of leave per year). For DR2, the weekly ESD values were 0.43, 0.74, 0.12 and 0.61 mSv; thus the monthly dose equals 1.90 mSv, or 20.9 mSv per year. These values are below the standard level and far below the maximum limit of 500 mSv per year set by the ICRP (International Commission on Radiological Protection). It is worth mentioning that DR1 was a senior consultant and hence needed less fluoroscopy time during each procedure; this is evident from the low ESD values of the second week (0.32 mSv) and the fourth week (0.22 mSv), even though he was performing nearly 10-12 procedures a day, 5 days a week. These values were lower than, or in the same range as, those for DR2, who was a junior consultant, which highlights the importance of increasing radiologists' skills and awareness of the effect of fluoroscopy time. In conclusion, the radiation dose that radiologists received during spine interventional radiology in our setting was below standard dose limits.
Keywords: radiation protection, interventional radiology dosimetry, ESD measurements, radiologist radiation exposure
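The dose arithmetic above can be checked with a short script; the only assumption beyond the reported numbers is the 11-month working year stated in the text:

```python
# Reproduce the ESD arithmetic: weekly wrist doses summed over 4 weeks
# give the monthly dose, which is then scaled to 11 working months/year.
dr1_weekly = [1.12, 0.32, 0.83, 0.22]  # mSv
dr2_weekly = [0.43, 0.74, 0.12, 0.61]  # mSv

def annual_dose(weekly_mSv, working_months=11):
    monthly = sum(weekly_mSv)
    return monthly, monthly * working_months

m1, y1 = annual_dose(dr1_weekly)  # DR1: monthly and annual dose
m2, y2 = annual_dose(dr2_weekly)  # DR2: monthly and annual dose

ICRP_EXTREMITY_LIMIT = 500.0      # mSv/year, annual extremity dose limit
assert y1 < ICRP_EXTREMITY_LIMIT and y2 < ICRP_EXTREMITY_LIMIT
```

This reproduces the reported 2.49 mSv/month and 27.39 mSv/year for DR1, and 1.90 mSv/month and 20.9 mSv/year for DR2.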
Procedia PDF Downloads 58
7677 Experimental Study on Temperature Splitting of a Counter-Flow Ranque-Hilsch Vortex Tube
Authors: Hany. A. Mohamed, M. Attalla, M. Salem, Hussein M. Mghrabie, E. Specht
Abstract:
An experimental investigation was made to determine the effects of the nozzle dimensions and the inlet pressure on the heating and cooling performance of the counter-flow Ranque-Hilsch vortex tube with air as the working fluid. All results were taken at inlet pressures adjusted from 200 kPa to 600 kPa in 100 kPa increments. A conventional tangential generator with 6 nozzles and an inner diameter of 7.5 mm was used. During the experiments, vortex tubes with L/D ratios varied from 10 to 30 were used. Finally, it is observed that the effect of the nozzle aspect ratio on the energy separation changes according to the value of L/D.
Keywords: Ranque-Hilsch, vortex tube, aspect ratio, energy separation
Procedia PDF Downloads 523
7676 Combination of Topology and Rough Set for Analysis of Power System Control
Authors: M. Kamel El-Sayed
Abstract:
In this research, we have linked the concepts of rough sets and topological structures to create a new topological structure that assists in the analysis of information systems for some electrical engineering problems. We used imprecise information, that is, sets whose boundary region in the generated topological structure is non-empty, which is precisely what makes a set rough. The structure is characterized by the fact that it does not contain a large number of elements, which facilitates the establishment of rules. We used this structure to reduce the attributes (specifications) of electrical information systems, and we provide a detailed example of the method illustrating the steps involved. This method opens the door to obtaining multiple topologies, each of which uses one of the non-exactly-definable sets (rough sets) in the overall information system.
Keywords: electrical engineering, information system, rough set, rough topology, topology
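The rough set machinery invoked above (a set whose boundary is non-empty relative to an indiscernibility partition) can be sketched on a toy information system; the attribute table below is an illustrative assumption, not the paper's power system data:

```python
def indiscernibility_classes(objects, attrs):
    """Partition objects by their values on the chosen attributes."""
    classes = {}
    for obj, values in objects.items():
        key = tuple(values[a] for a in attrs)
        classes.setdefault(key, set()).add(obj)
    return list(classes.values())

def approximations(objects, attrs, target):
    """Lower and upper approximations of the target set."""
    lower, upper = set(), set()
    for cls in indiscernibility_classes(objects, attrs):
        if cls <= target:
            lower |= cls   # class entirely inside the target
        if cls & target:
            upper |= cls   # class overlapping the target
    return lower, upper

# Toy information system: components described by two attributes.
table = {
    "x1": {"load": "high", "state": "on"},
    "x2": {"load": "high", "state": "on"},
    "x3": {"load": "low",  "state": "on"},
    "x4": {"load": "low",  "state": "off"},
}
target = {"x1", "x3"}
lo, up = approximations(table, ["load", "state"], target)
# The boundary (up - lo) is non-empty here, so the target set is rough.
```

Attribute reduction then amounts to finding smaller attribute subsets that induce the same approximations.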
Procedia PDF Downloads 454
7675 Unlocking Health Insights: Studying Data for Better Care
Authors: Valentina Marutyan
Abstract:
Healthcare data mining is a rapidly developing field at the intersection of technology and medicine that has the potential to change our understanding of, and approach to, providing healthcare. It is the process of examining huge amounts of data to extract useful information that can be applied to improve patient care, treatment effectiveness, and overall healthcare delivery. The field looks for patterns, trends, and correlations in a variety of healthcare datasets, such as electronic health records (EHRs), medical imaging, patient demographics, and treatment histories, using advanced analytical approaches. Predictive analysis using historical patient data is a major area of interest in healthcare data mining: it enables doctors to intervene early to prevent problems or improve outcomes, and it assists in early disease detection and customized treatment planning for every person. Doctors can tailor a patient's care by looking at their medical history, genetic profile, and current and previous therapies; in this way, treatments can be more effective and have fewer negative consequences. Beyond helping patients, it improves the efficiency of hospitals, helping them determine the number of beds or doctors they require relative to the number of patients they expect. In this project, models such as logistic regression, random forests, and neural networks are used for predicting diseases and analyzing medical images. Patients were grouped by algorithms such as k-means, and connections between treatments and patient responses were identified by association rule mining. Time series techniques helped in resource management by predicting patient admissions. These methods improved healthcare decision-making and personalized treatment.
Healthcare data mining must also deal with difficulties such as poor data quality, privacy challenges, managing large and complicated datasets, ensuring the reliability of models, managing biases, limited data sharing, and regulatory compliance. Ultimately, data mining in healthcare helps medical professionals and hospitals make better decisions, treat patients more effectively, and work more efficiently. It comes down to using data to improve treatment, make better choices, and simplify hospital operations for all patients.
Keywords: data mining, healthcare, big data, large amounts of data
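The k-means grouping mentioned above can be sketched in a few lines of plain Python; the two-feature toy records (say, age and a lab value) are an illustrative assumption, not the project's dataset:

```python
import math

def kmeans(points, k, iters=20):
    """Plain k-means with deterministic first-k initialization."""
    centers = [tuple(p) for p in points[:k]]
    for _ in range(iters):
        # Assign each point to its nearest center.
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k), key=lambda c: math.dist(p, centers[c]))
            clusters[nearest].append(p)
        # Move each center to the mean of its cluster (keep it if empty).
        centers = [
            tuple(sum(x) / len(c) for x in zip(*c)) if c else centers[i]
            for i, c in enumerate(clusters)
        ]
    return centers, clusters

# Toy patient records: (age, lab value).
patients = [(25, 1.0), (27, 1.2), (26, 0.9), (70, 6.0), (72, 6.4), (68, 5.8)]
centers, clusters = kmeans(patients, k=2)
```

On this toy data the algorithm separates the younger, low-lab-value patients from the older, high-lab-value ones; a real pipeline would first scale the features so no single one dominates the distance.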
Procedia PDF Downloads 76
7674 Challenges Influencing Nurse Initiated Management of Retroviral Therapy (NIMART) Implementation in Ngaka Modiri Molema District, North West Province, South Africa
Authors: Sheillah Hlamalani Mboweni, Lufuno Makhado
Abstract:
Background: The increasing number of people who test HIV positive and demand antiretroviral therapy (ART) prompted the National Department of Health to adopt the WHO recommendation of task shifting, in which professional nurses (PNs), rather than hospital doctors, initiate ART. This resulted in the decentralization of services to primary health care (PHC), generating a need to capacitate PNs in NIMART. After years of training, the impact of NIMART was assessed, and it was established that even though an increased number of people accessed ART, the quality of care remained a serious concern. The study aims to answer the following question: what are the challenges influencing NIMART implementation in primary health care? Objectives: This study explores challenges influencing NIMART training and implementation and makes recommendations to improve patient and HIV program outcomes. Methods: A qualitative, explorative program-evaluation research design was used. The study was conducted in the rural districts of North West province. Purposive sampling was used to sample PNs trained in NIMART. Focus group discussions (FGDs) of 6-9 participants were used to collect data, and data were analysed using ATLAS.ti. Results: Five FGDs (n=28 PNs) and three program managers were interviewed. The study results revealed two themes: inadequacy of NIMART training and health care system challenges. Conclusion: The deficiencies in NIMART training and the health care system challenges are a public health concern, as they compromise the quality of HIV management, resulting in poor patient outcomes, and hinder the goal of ending the HIV epidemic. They should be dealt with decisively by all stakeholders.
Recommendations: The National Department of Health should improve NIMART training and HIV management through standardization of the NIMART training curriculum with the involvement of all relevant stakeholders, skilled facilitators, the introduction of pre-service NIMART training in institutions of higher learning, support of PNs by district and program managers, and plans to address the shortage of staff and negative attitudes so as to ensure compliance with guidelines. There is also a need to develop a conceptual framework that provides guidance and strengthens NIMART implementation in PHC facilities.
Keywords: antiretroviral therapy, nurse initiated management of retroviral therapy, primary health care, professional nurses
Procedia PDF Downloads 158
7673 Using the Semantic Web Technologies to Bring Adaptability in E-Learning Systems
Authors: Fatima Faiza Ahmed, Syed Farrukh Hussain
Abstract:
The last few decades have seen a large proportion of our population turning towards e-learning technologies, from learning tools used in primary and elementary schools to competency-based e-learning systems specifically designed for applications like finance and marketing. The huge diversity of this audience poses a large number of challenges for the designers of these e-learning systems, one of which is adaptability. This paper focuses on adaptability of the learning material in an e-learning course and on how artificial intelligence and the semantic web can be used as effective tools for this purpose. The study showed that the semantic web, still a hot topic in computer science, can prove to be a powerful tool in designing and implementing adaptable e-learning systems.
Keywords: adaptable e-learning, HTMLParser, information extraction, semantic web
Procedia PDF Downloads 339