Search results for: choice experiments (CE)
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 4820

1700 The Theology of a Muslim Artist: Tawfiq al-Hakim

Authors: Abdul Rahman Chamseddine

Abstract:

Tawfiq al-Hakim remains one of the most prominent playwrights in his native Egypt and in the broader Arab world. His works, at the time of their release, drew international attention and acclaim. His first masterpiece, Ahl al-Kahf (The People of the Cave, 1933), especially garnered fame and recognition in both Europe and the Arab world. Borrowing its title from the Qur’anic Sura, al-Hakim’s play relays the untold story of the lives of those 'three saints' after they wake from their prolonged sleep. The playwright’s selection of topics upon which to base his works displays a deep appreciation of Arabic and Islamic heritage. Al-Hakim was clearly influenced by Islam, to such a degree that he wrote a biography of the Prophet Muhammad in 1936, very early in his career. In doing so he was preceded by many poets and creative writers, notably Al-Barudi, Ahmad Shawqi, Haykal, Al-‘Aqqad, and Taha Husayn, each of whom expressed his own view of the Prophet Muhammad. Understanding the concern of all these renaissance men, among others, with the person of the Prophet is indispensable to this study. This project will examine the reasons behind al-Hakim’s choice to draw upon these particular texts, embedded as they are in the context of Arabic and Islamic heritage, and how the use of traditional texts serves his contemporary goals. The project will also analyze the image of Islam in al-Hakim’s imagination. Elsewhere, he envisions letters or conversations between God and himself, which offers a window into understanding the powerful impact of the Divine on Tawfiq al-Hakim, one that informs his literature and merits further scholarly attention. His works, which occupy a major rank in Arabic literature, reveal not only Al-Hakim himself but also the unquestioned assumptions operative in the life of his community, its mental make-up, and its attitudes.
Furthermore, studying the reception of works that touch on sensitive issues, such as writing a letter to God, in Al-Hakim’s historical context is of great significance for comprehending the mentality of the Muslim community at that time.

Keywords: Arabic language, Arabic literature, Arabic theology, modern Arabic literature

Procedia PDF Downloads 357
1699 Influence of Readability of Paper-Based Braille on Vertical and Horizontal Dot Spacing in Braille Beginners

Authors: K. Doi, T. Nishimura, H. Fujimoto

Abstract:

The number of people who become visually impaired without sufficient tactile experience has increased owing to various diseases. In particular, many persons with acquired visual impairment due to accidents, disorders, or aging cannot adequately read Braille. Learning Braille is known to require a great deal of time and the acquisition of various skills. In our previous studies, we reported one of the problems in learning Braille: the standard Braille size is too small for Braille beginners. Moreover, objective data on easily readable Braille sizes are lacking. It is therefore necessary to conduct experiments to evaluate Braille sizes that would make learning easier for beginners. In this study, to investigate easy-to-read vertical and horizontal dot spacings for beginners, we conducted a Braille reading experiment. We prepared test pieces using our original Braille printer, which can control Braille size. We specifically considered Braille beginners with acquired visual impairments who were unfamiliar with Braille; accordingly, ten sighted subjects with no experience of reading Braille participated in this experiment. The vertical and horizontal dot spacings tested were 2.0, 2.3, 2.5, 2.7, 2.9, and 3.1 mm. The subjects were asked to read one Braille character at each controlled Braille size. The results of this experiment reveal that Braille beginners can read Braille accurately and quickly when both vertical and horizontal dot spacing are 3.1 mm or more. These findings provide helpful data for selecting Braille sizes for persons with acquired visual impairment.

Keywords: paper-based Braille, vertical and horizontal dot spacing, readability, acquired visual impairment, Braille beginner

Procedia PDF Downloads 172
1698 Finite-Sum Optimization: Adaptivity to Smoothness and Loopless Variance Reduction

Authors: Bastien Batardière, Joon Kwon

Abstract:

For finite-sum optimization, variance-reduced (VR) gradient methods compute at each iteration the gradient of a single function (or of a mini-batch), yet achieve faster convergence than SGD thanks to a carefully crafted lower-variance stochastic gradient estimator that reuses past gradients. Another important line of research of the past decade in continuous optimization is adaptive algorithms such as AdaGrad, which dynamically adjust the (possibly coordinate-wise) learning rate to past gradients and thereby adapt to the geometry of the objective function. Variants such as RMSprop and Adam demonstrate outstanding practical performance and have contributed to the success of deep learning. In this work, we present AdaLVR, which combines the AdaGrad algorithm with loopless variance-reduced gradient estimators such as SAGA or L-SVRG, and benefits from a straightforward construction and a streamlined analysis. We show that AdaLVR inherits both the good convergence properties of VR methods and the adaptive nature of AdaGrad: in the case of L-smooth convex functions, we establish a gradient complexity of O(n + (L + √(nL))/ε) without prior knowledge of L. Numerical experiments demonstrate the superiority of AdaLVR over state-of-the-art methods. Moreover, we empirically show that the RMSprop and Adam algorithms combined with variance-reduced gradient estimators achieve even faster convergence.
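The loopless variance-reduction idea can be sketched in a few lines. The toy example below combines an L-SVRG estimator with AdaGrad coordinate-wise step sizes on a small least-squares problem; the step size, snapshot probability, and problem data are illustrative choices, not the authors' AdaLVR settings.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 50, 5                      # n summands, d parameters
A = rng.normal(size=(n, d))
b = rng.normal(size=n)

def grad_i(x, i):                 # gradient of f_i(x) = 0.5 * (a_i . x - b_i)^2
    return A[i] * (A[i] @ x - b[i])

def full_grad(x):                 # gradient of the average f = (1/n) sum f_i
    return A.T @ (A @ x - b) / n

x = np.zeros(d)
snapshot = x.copy()
snap_grad = full_grad(snapshot)
G = np.zeros(d)                   # accumulated squared gradients (AdaGrad)
eta, p, eps = 0.5, 1.0 / n, 1e-8

for t in range(3000):
    i = rng.integers(n)
    # L-SVRG variance-reduced estimator: unbiased, variance shrinks near optimum
    g = grad_i(x, i) - grad_i(snapshot, i) + snap_grad
    G += g * g
    x -= eta * g / (np.sqrt(G) + eps)   # AdaGrad coordinate-wise step
    if rng.random() < p:                # "loopless": random snapshot refresh
        snapshot = x.copy()
        snap_grad = full_grad(snapshot)

print(np.linalg.norm(full_grad(x)))     # gradient norm after training
```

The random snapshot refresh (probability 1/n) is what makes the estimator "loopless": it replaces SVRG's explicit inner/outer loop structure while keeping the same expected snapshot frequency.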

Keywords: convex optimization, variance reduction, adaptive algorithms, loopless

Procedia PDF Downloads 66
1697 A Combined Approach Based on Artificial Intelligence and Computer Vision for Qualitative Grading of Rice Grains

Authors: Hemad Zareiforoush, Saeed Minaei, Ahmad Banakar, Mohammad Reza Alizadeh

Abstract:

The quality inspection of rice (Oryza sativa L.) during its various processing stages is very important. In this research, an artificial-intelligence-based model coupled with computer vision techniques was developed as a decision support system for qualitative grading of rice grains. For the experiments, 25 samples of rice grains with different percentages of broken kernels (PBK) and degrees of milling (DOM) were first prepared, and their qualitative grade was assessed by experienced experts. Then, the quality parameters of the same samples were determined using a machine vision system. A grading model was developed based on fuzzy logic theory in MATLAB to relate the qualitative characteristics of the product to its quality. In total, 25 rules were used for qualitative grading, based on the AND operator and the Mamdani inference system. The fuzzy inference system consisted of two input linguistic variables, DOM and PBK, obtained by the machine vision system, and one output variable (quality of the product). The model output was finally defuzzified using the Center of Maximum (COM) method. To evaluate the developed model, the output of the fuzzy system was compared with the experts’ assessments. The developed model was found to estimate the qualitative grade of the product with an accuracy of 95.74%.
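A minimal Mamdani-style grading sketch can be written without any toolbox. The membership functions, the three rules, and the output peaks below are invented for illustration (the paper's 25 rules and MATLAB implementation are not reproduced here), and the defuzzification uses a common height-based approximation of the Center of Maximum method.

```python
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function with corners a, b, c."""
    return float(np.clip(min((x - a) / (b - a + 1e-12),
                             (c - x) / (c - b + 1e-12)), 0.0, 1.0))

def grade(dom, pbk):
    """Fuzzy quality grade (0-100) from degree of milling (DOM, %) and
    percentage of broken kernels (PBK, %). Sets and rules are illustrative."""
    dom_low, dom_high = tri(dom, 0, 0, 50), tri(dom, 50, 100, 100)
    pbk_low, pbk_high = tri(pbk, 0, 0, 50), tri(pbk, 50, 100, 100)

    # Mamdani rules with the AND (min) operator
    w_good = min(dom_high, pbk_low)                        # well milled, few broken
    w_fair = max(min(dom_low, pbk_low), min(dom_high, pbk_high))
    w_poor = min(dom_low, pbk_high)                        # poorly milled, many broken

    # height-based approximation of Center of Maximum defuzzification
    peaks = {90.0: w_good, 50.0: w_fair, 25.0: w_poor}
    total = sum(peaks.values()) + 1e-12
    return sum(p * w for p, w in peaks.items()) / total

print(round(grade(90, 5)), round(grade(20, 60)))   # high grade vs. low grade
```

A well-milled sample with few broken kernels lands near the "good" peak, while a poorly milled sample with many broken kernels lands near the "poor" peak, mirroring the expert-assigned grades the paper calibrates against.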

Keywords: machine vision, fuzzy logic, rice, quality

Procedia PDF Downloads 415
1696 NOx Prediction by Quasi-Dimensional Combustion Model of Hydrogen Enriched Compressed Natural Gas Engine

Authors: Anas Rao, Hao Duan, Fanhua Ma

Abstract:

Dependency on fossil fuels can be reduced by using hydrogen-enriched compressed natural gas (HCNG) in transportation vehicles. However, the NOx emissions of HCNG engines are significantly higher, which has turned out to be their major drawback; the study of NOx emissions from HCNG engines is therefore a very important area of research. In this context, experiments were performed at different hydrogen percentages, ignition timings, air-fuel ratios, manifold absolute pressures, loads, and engine speeds. Simulations were then carried out with a quasi-dimensional combustion model of the HCNG engine. To investigate NOx emissions, an NO mechanism was coupled to the quasi-dimensional combustion model. Three NOx formation routes were used to predict NOx emissions: the thermal NOx, prompt NOx, and N2O mechanisms. For validation purposes, the NO curve was transformed into NO packets based on a temperature difference of 100 K for lean-burn and 60 K for stoichiometric conditions, and the width of each packet was taken as the ratio of its crank duration to the total burn duration. The combustion chamber of the engine was divided into three zones, with each zone equal to the product of the summation of NO packets and space. To check the accuracy of the model, the percentage error of NOx emission was evaluated; it lies within ±6% for lean-burn and ±10% for stoichiometric conditions. Finally, the percentage contribution of each NO formation route was evaluated.
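The packet construction (binning the NO curve by a fixed temperature difference and weighting each packet by its share of the burn duration) can be illustrated on synthetic data. The curve shape and the 100 K threshold below are illustrative placeholders, not engine data from the study.

```python
import numpy as np

def no_packets(theta, T, no, dT=100.0):
    """Split a NO-vs-crank-angle curve into packets whose temperature span is
    at most dT; each packet's width is its share of the total burn duration.
    Illustrative reconstruction of the binning described in the abstract."""
    packets, start = [], 0
    for i in range(1, len(T)):
        if abs(T[i] - T[start]) >= dT or i == len(T) - 1:
            width = (theta[i] - theta[start]) / (theta[-1] - theta[0])
            packets.append((no[start:i + 1].mean(), width))
            start = i
    return packets

# synthetic burn: temperature rises then falls over 60 deg of crank angle
theta = np.linspace(0.0, 60.0, 601)
T = 1800.0 + 700.0 * np.sin(np.pi * theta / 60.0)
no = 1e-4 * np.exp(-(T.max() - T) / 200.0)     # hotter zones hold more NO
pk = no_packets(theta, T, no)
print(len(pk), sum(w for _, w in pk))           # packet widths sum to ~1
```

Because consecutive packets partition the whole crank-angle range, the widths sum to one by construction, which is what lets them act as weights in the validation described above.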

Keywords: quasi-dimensional combustion, thermal NO, prompt NO, NO packet

Procedia PDF Downloads 247
1695 Received Signal Strength Indicator Based Localization of Bluetooth Devices Using Trilateration: An Improved Method for the Visually Impaired People

Authors: Muhammad Irfan Aziz, Thomas Owens, Uzair Khaleeq uz Zaman

Abstract:

Instantaneous spatial localization of visually impaired people in dynamically changing environments, with unexpected hazards and obstacles, is the most demanding and challenging issue faced by navigation systems today. Since Bluetooth cannot utilize techniques like Time Difference of Arrival (TDOA) and Time of Arrival (TOA), it uses the received signal strength indicator (RSSI) to measure received signal strength (RSS). Measurements using RSSI can be improved significantly by improving the existing RSSI-based methodologies. The current paper therefore proposes an improved trilateration-based method for the localization of Bluetooth devices for visually impaired people. To validate the method, class 2 Bluetooth devices were used along with purpose-built software. Experiments were then conducted to obtain surface plots showing signal interference and other environmental effects. The results show the surface plots for all Bluetooth modules used, with strong and weak points depicted by color codes in red, yellow, and blue. It was concluded that the suggested improved method of measuring RSS using trilateration not only measures signal strength effectively but also highlights how signal strength can be influenced by environmental conditions such as noise, reflections, etc.
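As an illustration of the RSSI-plus-trilateration pipeline, the sketch below converts RSSI readings to distances with the standard log-distance path-loss model and then solves the trilateration by linearized least squares. The calibration constants (tx_power, path-loss exponent n) and the anchor positions are hypothetical, not the paper's measured values.

```python
import numpy as np

def rssi_to_distance(rssi, tx_power=-59.0, n=2.0):
    """Log-distance path-loss model: estimated distance (m) from an RSSI (dBm).
    tx_power is the expected RSSI at 1 m and n the path-loss exponent; both
    are environment-dependent and must be calibrated."""
    return 10.0 ** ((tx_power - rssi) / (10.0 * n))

def trilaterate(anchors, dists):
    """Position from >= 3 anchors: subtract the first circle equation from the
    others and solve the resulting linear system in the least-squares sense."""
    (x1, y1), d1 = anchors[0], dists[0]
    rows, rhs = [], []
    for (x, y), d in zip(anchors[1:], dists[1:]):
        rows.append([2.0 * (x - x1), 2.0 * (y - y1)])
        rhs.append(d1 ** 2 - d ** 2 + x ** 2 - x1 ** 2 + y ** 2 - y1 ** 2)
    pos, *_ = np.linalg.lstsq(np.array(rows), np.array(rhs), rcond=None)
    return pos

anchors = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0)]
true_pos = np.array([3.0, 4.0])
rssi = [-59.0 - 20.0 * np.log10(np.linalg.norm(true_pos - np.array(p)))
        for p in anchors]                     # simulated noise-free readings
dists = [rssi_to_distance(r) for r in rssi]
print(trilaterate(anchors, dists))            # recovers [3, 4]
```

With real readings, the distances carry noise from the atmospheric effects the paper discusses, so using more than three anchors and the least-squares solution above (rather than an exact three-circle intersection) is the usual remedy.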

Keywords: Bluetooth, indoor/outdoor localization, received signal strength indicator, visually impaired

Procedia PDF Downloads 130
1694 Computer Simulation to Investigate Magnetic and Wave-Absorbing Properties of Iron Nanoparticles

Authors: Chuan-Wen Liu, Min-Hsien Liu, Chung-Chieh Tai, Bing-Cheng Kuo, Cheng-Lung Chen, Huazhen Shen

Abstract:

A recent surge in research on magnetic radar absorbing materials (RAMs) has presented researchers with new opportunities and challenges. This study was performed to gain a better understanding of the wave-absorbing phenomenon of magnetic RAMs. First, we hypothesized that the absorbing phenomenon depends on particle shape. Using the Materials Studio program and the micro-dot magnetic dipole (MDMD) method, we obtained results from magnetic RAMs that support this hypothesis: the total MDMD energy of disk-like iron particles was greater than that of spherical iron particles. In addition, the particulate aggregation phenomenon decreases the wave absorbance, according to both experiments and computational data. This study may therefore be of importance in explaining the wave-absorbing characteristics of magnetic RAMs. Combining molecular dynamics simulation results with the theory of magnetization of magnetic dots, we investigated the magnetic properties of iron materials with different particle shapes and degrees of aggregation under external magnetic fields. The MDMD energies of the materials under magnetic fields of various strengths were simulated. Our results suggest that disk-like iron particles have better magnetization than spherical iron particles, which can be correlated with the magnetic wave-absorbing properties of iron materials.
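The shape- and aggregation-dependence argument rests on summing pairwise magnetic dipole-dipole energies. The textbook interaction energy (a generic formula, not the paper's full MDMD implementation) is easy to verify numerically:

```python
import numpy as np

MU0 = 4e-7 * np.pi   # vacuum permeability, T*m/A

def dipole_energy(m1, m2, r_vec):
    """Magnetic dipole-dipole interaction energy (SI units):
    U = mu0/(4*pi*r^3) * [m1.m2 - 3*(m1.rh)*(m2.rh)]."""
    r = np.linalg.norm(r_vec)
    rh = r_vec / r
    return MU0 / (4 * np.pi * r ** 3) * (
        np.dot(m1, m2) - 3 * np.dot(m1, rh) * np.dot(m2, rh))

# two aligned moments head-to-tail vs. side-by-side
m = np.array([0.0, 0.0, 1.0])
e_chain = dipole_energy(m, m, np.array([0.0, 0.0, 1.0]))   # negative (favorable)
e_side = dipole_energy(m, m, np.array([1.0, 0.0, 0.0]))    # positive
print(e_chain < e_side)   # True
```

The sign difference is the reason packing geometry matters: how neighboring dipoles sit relative to their moments changes the total energy, which is the quantity the MDMD method accumulates over a whole particle.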

Keywords: wave-absorbing property, magnetic material, micro-dot magnetic dipole, particulate aggregation

Procedia PDF Downloads 486
1693 An Exploratory Study to Understand the Economic Opportunities from Climate Change

Authors: Sharvari Parikh

Abstract:

Climate change has always been looked upon as a threat. Increased use of fossil fuels, depletion of biodiversity, certain human activities, and rising levels of greenhouse gas (GHG) emissions are the factors that have caused climate change, which is creating new risks and aggravating existing ones. This paper focuses on breaking the stereotypical perception of climate change and draws attention to its constructive side. Studies around the world have concluded that climate change has presented us with many untapped opportunities. The next 15 years will be crucial: it is in our hands whether we grab these opportunities or let the situation worsen. The world stands at a stage where we cannot choose between averting climate change and promoting growth and development; in fact, the solution to climate change itself carries economic opportunities. The evidence presented in the paper shows how we can improve the lives of the world’s population at large through structural change that promotes environmentally friendly investment. Rising investment in green energy and increased demand for climate-friendly products create ample employment opportunities. The old technologies and machinery employed today lack efficiency and require heavy maintenance, resulting in high production costs. These can be brought down drastically by the adoption of green technologies, which are becoming more accessible and affordable. The world’s overall GDP has been heavily affected by the problems arising from increasingly severe weather. Shifting to a green economy can not only eliminate these costs but also build a sound economy. Accelerating the economy toward a low-carbon future can lessen burdens such as fossil fuel subsidies, public debt, unemployment, and poverty, and reduce healthcare expenses.
It is clear that the world will be dragged into a ‘darker phase’ if current trends in fossil fuel and carbon consumption continue. Switching to a green economy is the only way to lift the world out of this darker phase. Climate change has opened the gates for a ‘green and clean economy’, and it will also bring the countries of the world together in pursuing the common goal of a green economy.

Keywords: climate change, economic opportunities, green economy, green technology

Procedia PDF Downloads 238
1692 Making of Alloy Steel by Direct Alloying with Mineral Oxides during Electro-Slag Remelting

Authors: Vishwas Goel, Kapil Surve, Somnath Basu

Abstract:

In-situ alloying of steel during the electro-slag remelting (ESR) process has already been achieved by adding the necessary ferroalloys into the ESR mold. However, the use of commercially available ferroalloys during ESR processing is often financially less favorable than conventional alloying techniques. A process of alloying steel with elements like chromium and manganese via the electro-slag remelting route, without any ferrochrome addition, is therefore under development. The process utilizes in-situ reduction of refined mineral chromite (Cr₂O₃) and the resultant enrichment of chromium in the steel ingot produced. It was established in the course of this work that this process can be more advantageous than conventional alloying techniques, both economically and environmentally, for applications that inherently demand the electro-slag remelting process, such as the manufacturing of superalloys. A key advantage is the lower overall CO₂ footprint of this process relative to the conventional route of production, storage, and addition of ferrochrome. In addition to experimentally validating the feasibility of the envisaged reactions, a mathematical model to simulate the reduction of chromium (III) oxide and the transfer of chromium to the molten steel droplets was developed as part of the current work. The developed model helps correlate the amount of chromite input with the magnitude of chromium alloying achievable through this process. Experiments are in progress to validate the predictions made by this model and to fine-tune its parameters.
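The chromite-to-chromium mass balance underlying such a model can be sketched from molar masses alone. The assumed 100% recovery and the 15 kg example charge below are illustrative only, since the actual recovery in the ESR slag depends on the process conditions the model is meant to predict.

```python
# Mass-balance sketch: chromium yield from refined Cr2O3, assuming full
# reduction and complete transfer to the steel (an upper bound, not a
# prediction of the authors' model).
M_CR = 51.996                    # g/mol
M_O = 15.999                     # g/mol
M_CR2O3 = 2 * M_CR + 3 * M_O     # ~151.99 g/mol; Cr mass fraction ~0.684

def cr_pct_gain(chromite_kg, ingot_kg, recovery=1.0):
    """Chromium mass-% added to the ingot for a given Cr2O3 charge."""
    cr_kg = chromite_kg * (2 * M_CR / M_CR2O3) * recovery
    return 100 * cr_kg / ingot_kg

print(cr_pct_gain(15.0, 1000.0))   # ~1.03 wt% Cr per 15 kg Cr2O3 per tonne
```

Scaling this linear relation by an experimentally fitted recovery factor is one simple way to connect the chromite input to the achieved alloying level, which is exactly the correlation the developed model formalizes.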

Keywords: alloying element, chromite, electro-slag remelting, ferrochrome

Procedia PDF Downloads 219
1691 Characterization on Molecular Weight of Polyamic Acids Using GPC Coupled with Multiple Detectors

Authors: Mei Hong, Wei Liu, Xuemin Dai, Yanxiong Pan, Xiangling Ji

Abstract:

Polyamic acid (PAA) is the precursor of polyimide (PI) prepared by the two-step method; its molecular weight and molecular weight distribution not only play an important role during preparation and processing but also influence the final performance of the PI. However, precise characterization of the molecular weight of PAA is still a challenge because of the very complicated interactions in the solution system, including electrostatic interactions, hydrogen bonding, dipole-dipole interactions, etc. It is thus necessary to establish a suitable strategy that completely suppresses these complex effects and yields reasonable molecular weight data. Herein, gel permeation chromatography (GPC) coupled with differential refractive index (RI) and multi-angle laser light scattering (MALLS) detectors was applied to measure the molecular weight of (6FDA-DMB) PAA using different mobile phases: LiBr/DMF, LiBr/H3PO4/THF/DMF, LiBr/HAc/THF/DMF, and LiBr/HAc/DMF. It was found that the combination of LiBr with HAc can shield the above-mentioned complex interactions and is more conducive to the separation of PAA than the addition of LiBr to DMF alone. LiBr/HAc/DMF was employed for the first time as a mild mobile phase to effectively separate PAA and determine its molecular weight. After a series of conditional experiments, 0.02 M LiBr/0.2 M HAc/DMF was fixed as the optimized mobile phase to measure the relative and absolute molecular weights of the (6FDA-DMB) PAA prepared; the Mw values obtained from GPC-MALLS and GPC-RI were 35,300 g/mol and 125,000 g/mol, respectively. Notably, this mobile phase is also applicable to other PAA samples with different structures, and the final molecular weight results are reproducible.
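The relative molecular weight averages from the RI trace follow from standard slice summation over the chromatogram. The slice heights and calibration masses below are hypothetical values chosen purely to show the arithmetic, not data from this work.

```python
def molecular_weight_averages(heights, mol_weights):
    """Number- and weight-average molecular weights from GPC slice data:
    Mn = sum(h_i) / sum(h_i / M_i),  Mw = sum(h_i * M_i) / sum(h_i),
    where h_i is the detector height of slice i and M_i its molar mass
    (from the calibration curve). Returns (Mn, Mw, PDI)."""
    sh = sum(heights)
    mn = sh / sum(h / m for h, m in zip(heights, mol_weights))
    mw = sum(h * m for h, m in zip(heights, mol_weights)) / sh
    return mn, mw, mw / mn

# toy chromatogram: five slices across the elution peak (hypothetical data)
h = [1.0, 3.0, 5.0, 3.0, 1.0]
m = [10_000, 20_000, 40_000, 80_000, 160_000]
mn, mw, pdi = molecular_weight_averages(h, m)
print(round(mn), round(mw), round(pdi, 2))
```

Mw weights each slice by its mass and so always exceeds Mn for a distribution of finite breadth; the gap (the PDI) is the dispersity measure the two detectors are used to pin down.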

Keywords: polyamic acids, polyelectrolyte effects, gel permeation chromatography, mobile phase, molecular weight

Procedia PDF Downloads 52
1690 A Monte Carlo Fuzzy Logistic Regression Framework against Imbalance and Separation

Authors: Georgios Charizanos, Haydar Demirhan, Duygu Icen

Abstract:

Two of the most impactful issues in classical logistic regression are class imbalance and complete separation. These can result in model predictions leaning heavily towards the majority class of the binary response variable, or in over-fitting. Fuzzy methodology offers key solutions for handling these problems. However, most studies propose transforming the binary responses into a continuous format limited to [0,1]; this is called the possibilistic approach within fuzzy logistic regression. Following this approach is closer to straightforward regression, since a logit-link function is not utilized and fuzzy probabilities are not generated. In contrast, we propose a method of fuzzifying binary response variables that allows the use of the logit-link function, and hence a probabilistic fuzzy logistic regression model with the Monte Carlo method. The fuzzy probabilities are then classified by selecting a fuzzy threshold. Different combinations of fuzzy and crisp inputs, outputs, and coefficients are explored, aiming to understand which of these perform better under different conditions of imbalance and separation. We conduct numerical experiments using both synthetic and real datasets to demonstrate the performance of the fuzzy logistic regression framework against seven crisp machine learning methods. The proposed framework shows better performance irrespective of the degree of imbalance and the presence of separation in the data, while the considered machine learning methods are significantly impacted.
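One way to picture the probabilistic (logit-link) variant is the following sketch: binary responses are jittered into fuzzy values in (0,1), a logistic model is refit on each Monte Carlo replicate, and the predicted probabilities are averaged before thresholding. The jitter width, replicate count, and plain gradient-descent fit are illustrative assumptions, not the authors' algorithm.

```python
import numpy as np

rng = np.random.default_rng(1)

def fit_logistic(X, y, lr=0.1, steps=500):
    """Plain gradient-descent logistic regression (with intercept); accepts
    soft targets y in (0,1), which is what the fuzzification produces."""
    Xb = np.c_[np.ones(len(X)), X]
    w = np.zeros(Xb.shape[1])
    for _ in range(steps):
        p = 1 / (1 + np.exp(-Xb @ w))
        w -= lr * Xb.T @ (p - y) / len(y)
    return w

def mc_fuzzy_predict(X, y, X_new, m=50, spread=0.1):
    """Monte Carlo sketch: jitter the binary responses into fuzzy values,
    refit per replicate, and average the predicted probabilities."""
    probs = np.zeros(len(X_new))
    Xb_new = np.c_[np.ones(len(X_new)), X_new]
    for _ in range(m):
        y_fuzzy = np.clip(y + rng.uniform(-spread, spread, size=len(y)),
                          0.01, 0.99)
        w = fit_logistic(X, y_fuzzy)
        probs += 1 / (1 + np.exp(-Xb_new @ w))
    return probs / m   # fuzzy probabilities; classify with a chosen threshold

X = rng.normal(size=(200, 2))
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(float)
p = mc_fuzzy_predict(X, y, np.array([[2.0, 1.0], [-2.0, -1.0]]))
print(p)   # first near 1, second near 0
```

Because the fuzzified targets never reach exactly 0 or 1, the fitted coefficients stay bounded even under complete separation, which hints at why a fuzzy formulation can stabilize the separated case.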

Keywords: fuzzy logistic regression, fuzzy, logistic, machine learning

Procedia PDF Downloads 65
1689 Two-Stage Robust Optimization for Collaborative Distribution Network Design under Uncertainty

Authors: Reza Alikhani

Abstract:

This research focuses on the establishment of horizontal cooperation among companies to enhance their operational efficiency and competitiveness. The study proposes an approach to horizontal collaboration, called coalition configuration, which involves partnering companies sharing distribution centers in a network design problem. The paper investigates which coalition should be formed in each distribution center to minimize the total cost of the network. Moreover, potential uncertainties, such as operational and disruption risks, are considered during the collaborative design phase. To address this problem, a two-stage robust optimization model for collaborative distribution network design under surging demand and facility disruptions is presented, along with a column-and-constraint generation algorithm to obtain exact solutions tailored to the proposed formulation. Extensive numerical experiments are conducted to analyze solutions obtained by the model in various scenarios, including decisions ranging from fully centralized to fully decentralized settings, collaborative versus non-collaborative approaches, and different amounts of uncertainty budgets. The results show that the coalition formation mechanism proposes some solutions that are competitive with the savings of the grand coalition. The research also highlights that collaboration increases network flexibility and resilience while reducing costs associated with demand and capacity uncertainties.

Keywords: logistics, warehouse sharing, robust facility location, collaboration for resilience

Procedia PDF Downloads 66
1688 A Study on Effect of Dynamic Loading Speed on the Fracture Toughness of Equivalent Stress Gradient (ESG) Specimen

Authors: Moon Byung Woo, Seok Chang-Sung, Koo Jae-Mean, Kim Sang-Young, Choi Jae Gu, Huh Nam-Su

Abstract:

Recently, the occurrence of earthquakes has increased sharply, and many casualties have occurred worldwide as a result. In particular, the Fukushima nuclear power plant accident caused by the 2011 earthquake significantly increased public fear and the demand for nuclear power plant safety. Thus, to prevent earthquake accidents at nuclear power plants, it is important to evaluate fracture toughness considering the seismic loading rate. To obtain fracture toughness data for the safety evaluation of a nuclear power plant, it is desirable to perform experiments on real-scale pipes, but such experiments are expensive and difficult to perform. Therefore, many researchers have proposed various test specimens to replicate the fracture toughness of a real-scale pipe. Since such specimens have several problems, the equivalent stress gradient (ESG) specimen has recently been suggested. In this study, in order to consider the effect of dynamic loading speed on fracture toughness, experiments were conducted at five different test speeds using ESG specimens. In addition, after performing the fracture toughness tests under dynamic loading at different speeds using an ESG specimen and a standard specimen, we compared them with the test results under static loading.

Keywords: dynamic loading speed, fracture toughness, load-ratio-method, equivalent stress gradient (ESG) specimen

Procedia PDF Downloads 305
1687 Prediction of Flow Around a NACA 0015 Profile

Authors: Boukhadia Karima

Abstract:

Fluid mechanics is the study of the laws of fluid motion and their interaction with solid bodies. This project illustrates this interaction through in-depth studies validated by experiments in the TE44 wind tunnel, ensuring the efficiency, accuracy, and reliability of the tests on a NACA 0015 profile. A symmetric NACA 0015 wing was placed in a subsonic wind tunnel, and measurements were made of the pressure on the upper and lower surfaces of the wing and of the velocity across the vortex trailing downstream from the wing tip. The aim of this work is to investigate experimentally the pressure distribution over this profile in a free airflow and the aerodynamic forces acting on it. The addition of a rounded lateral edge to the wing tip was found to eliminate the secondary vortex near the wing tip but had little effect on the downstream characteristics of the trailing vortex. The increase in wing lift near the tip due to the presence of the trailing vortex was evident in the surface pressure but was not captured by circulation-box measurements. The circumferential velocity within the vortex was found to reach free-stream values and produce core rotational speeds. Near the wing, the trailing vortex is asymmetric and contains definite zones where the streamwise velocity both exceeds and falls behind the free-stream value. When referenced to the free-stream velocity, the maximum vertical velocity of the vortex depends directly on α and is independent of Re. A numerical study was conducted with the CFD code FLUENT 6.0, and the results are compared with the experimental data.

Keywords: CFD code, NACA Profile, detachment, angle of incidence, wind tunnel

Procedia PDF Downloads 408
1686 Distances over Incomplete Diabetes and Breast Cancer Data Based on Bhattacharyya Distance

Authors: Loai AbdAllah, Mahmoud Kaiyal

Abstract:

Missing values in real-world datasets are a common problem. Many algorithms have been developed to deal with it; most replace the missing values with a fixed value computed from the observed values. In our work, we used a distance function based on the Bhattacharyya distance, which measures the similarity of two probability distributions, to measure the distance between objects with missing values. The proposed distance distinguishes between known and unknown values: the distance between two known values is the Mahalanobis distance, while if one of them is missing, the distance is computed based on the distribution of the known values for the coordinate containing the missing value. This method was integrated with Wikaya, a digital health company developing a platform that helps improve the prevention of chronic diseases such as diabetes and cancer. For Wikaya’s recommendation system to work, distances between users need to be measured; since the collected data contain missing values, a distance function for incomplete user profiles is needed. To evaluate the accuracy of the proposed distance function in reflecting the actual similarity between objects when some of them contain missing values, we integrated it into the framework of the k-nearest-neighbors (kNN) classifier, since its computation is based only on the similarity between objects. To validate this, we ran the algorithm over the diabetes and breast cancer datasets, standard benchmark datasets from the UCI repository. Our experiments show that the kNN classifier using our proposed distance function outperforms kNN using other existing methods.
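The known/unknown case split can be illustrated with a simplified per-coordinate version: known-known pairs contribute a (diagonal) Mahalanobis term, and any pair involving a missing value contributes the expected squared deviation under that coordinate's observed distribution. This is a sketch of the idea, not the paper's exact Bhattacharyya-based construction.

```python
import numpy as np

def build_stats(X):
    """Per-coordinate mean and variance over the observed (non-NaN) values."""
    mu = np.nanmean(X, axis=0)
    var = np.nanvar(X, axis=0) + 1e-12
    return mu, var

def dist(u, v, mu, var):
    """Missing-value-aware distance between two profiles (NaN = missing)."""
    total = 0.0
    for j in range(len(u)):
        a, b = u[j], v[j]
        if not np.isnan(a) and not np.isnan(b):
            total += (a - b) ** 2 / var[j]            # Mahalanobis-style term
        elif np.isnan(a) and np.isnan(b):
            total += 2.0                               # two unknowns: 2 unit variances
        else:
            known = b if np.isnan(a) else a
            # E[(known - X_j)^2] = (known - mu_j)^2 + var_j, normalized
            total += ((known - mu[j]) ** 2 + var[j]) / var[j]
    return np.sqrt(total)

X = np.array([[1.0, 2.0], [1.1, np.nan], [5.0, 6.0]])
mu, var = build_stats(X)
print(dist(X[0], X[1], mu, var) < dist(X[0], X[2], mu, var))  # True
```

Even with its second coordinate missing, the nearby profile stays closer than the complete but distant one, which is the property a kNN classifier needs from such a distance.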

Keywords: missing values, incomplete data, distance, incomplete diabetes data

Procedia PDF Downloads 221
1685 Malachite Ore Treatment with Typical Ammonium Salts and Its Mechanism to Promote the Flotation Performance

Authors: Ayman M. Ibrahim, Jinpeng Cai, Peilun Shen, Dianwen Liu

Abstract:

The differences among ammonium salts in promoting sulfurization, and the effect of their anions on the sulfurization of the malachite surface, were systematically studied. This study takes malachite, a typical copper oxide mineral, as the research object; field emission scanning electron microscopy with energy-dispersive X-ray analysis (FESEM-EDS), X-ray photoelectron spectroscopy (XPS), and other analytical and testing methods, as well as pure-mineral flotation experiments, were carried out to examine the superiority of ammonium salts as sulfurizing reagents for malachite at the microscopic level. Additionally, the promoting effects of ammonium sulfate and ammonium phosphate on the sulfurization of malachite in xanthate flotation were compared systematically in terms of the microstructure of the sulfurized products, their elemental composition, the chemical state of the characteristic elements, and the evolution of surface hydrophobicity. The FESEM and AFM results showed that after pre-treatment with ammonium salts, the sulfurized products formed on the mineral surface adhered more densely, and the radial dimension of the flake-like products was significantly greater. For malachite sulfurization flotation, the effect of ammonium phosphate in promoting sulfurization is weaker than that of ammonium sulfate. The reason may be that hydrolyzing phosphate consumes a substantial quantity of H+ in the solution, which hastens the formation of copper-sulfur products and decreases the adhesion stability of copper-sulfur species on the malachite surface.

Keywords: sulfurization flotation, adsorption characteristics, malachite, hydrophobicity

Procedia PDF Downloads 65
1684 The Noun-Phrase Elements on the Usage of the Zero Article

Authors: Wen Zhen

Abstract:

Compared with content words, function words, especially articles, have been relatively overlooked by English learners. The article system, to a certain extent, becomes an obstacle to mastering English, driven by different elements. Three principal factors relating to the nature of articles can be summarized when accounting for the difficulty of the English article system. Difficulties in the second-language acquisition process make the article system more complex still, for [-ART] learners have to create a new category, causing even proficient non-native speakers to make errors. Studies of the acquisition sequence of English articles show that the zero article is acquired first but with high inaccuracy. The zero article is often overused in the early stages of L2 acquisition. Although learners at the intermediate level move to underusing the zero article, as they realize that it does not cover every case, overproduction of the zero article occurs even among advanced L2 learners. The aim of this study is to investigate the noun-phrase factors which give rise to incorrect usage or overuse of the zero article, thus providing suggestions for L2 English acquisition; it should also enable teachers to carry out effective instruction that activates students' conscious learning. The research question is answered through a corpus-based, data-driven approach that analyzes noun-phrase elements in terms of semantic context and the countability of noun phrases. Based on the analysis of the International Thurber Thesis corpus, the results show that: (1) although the [-definite, -specific] context favored the zero article, both [-definite, +specific] and [+definite, -specific] showed less influence; when we reflect on the frequency order of the zero article, prototypicality plays a vital role in it. (2) The EFL learners in this study have trouble classifying abstract nouns as countable.
Overuse of the zero article arises when learners cannot make clear judgements on countability as the context is altered from (+definite) to (-definite); once a noun is perceived as uncountable, the choice falls back on the zero article. These findings suggest that learners should be engaged in recognizing the countability of new vocabulary, through the explanation of nouns in lexical phrases, and should explore more complex aspects such as analysis dependent on discourse.

Keywords: noun phrase, zero article, corpus, second language acquisition

Procedia PDF Downloads 250
1683 Effect of Environmental Factors on Photoreactivation of Microorganisms under Indoor Conditions

Authors: Shirin Shafaei, James R. Bolton, Mohamed Gamal El Din

Abstract:

Ultraviolet (UV) disinfection damages the DNA or RNA of microorganisms, but many microorganisms can repair this damage after exposure to near-UV or visible wavelengths (310–480 nm) by a mechanism called photoreactivation. Photoreactivation is gaining attention because it can reduce the efficiency of UV disinfection of wastewater several hours after treatment. Because much photoreactivation research has focused on single species, there is a considerable gap in knowledge about complex natural communities of microorganisms and their response to UV treatment. In this research, photoreactivation experiments were carried out on the influent of the UV disinfection unit at a municipal wastewater treatment plant (WWTP) in Edmonton, Alberta, after exposure to a medium-pressure (MP) UV lamp system, to evaluate the effect of environmental factors on the photoreactivation of microorganisms in actual municipal wastewater. The effects of reactivation fluence, temperature, and river water on the photoreactivation of total coliforms were examined under indoor conditions. The results showed that higher effective reactivation fluence values (up to 20 J/cm2) and higher temperatures (up to 25 °C) increased the photoreactivation of total coliforms. However, increasing the percentage of river water in mixtures of effluent and river water decreased the photoreactivation of the mixtures. The results of this research can help the municipal wastewater treatment industry to examine the environmental effects of discharging effluents into receiving waters.
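As a quick sanity check on the fluence figures quoted above, UV (or reactivation) fluence is simply irradiance integrated over exposure time. The sketch below shows the unit conversion; the irradiance and duration are hypothetical examples, not values reported in the abstract.

```python
# Illustrative fluence calculation:
# fluence (J/cm^2) = irradiance (W/cm^2) x exposure time (s).
# The irradiance below is a made-up example, not a figure from the study.

def fluence_j_per_cm2(irradiance_mw_cm2: float, minutes: float) -> float:
    """Convert irradiance (mW/cm^2) and exposure time (min) to fluence (J/cm^2)."""
    return irradiance_mw_cm2 / 1000.0 * minutes * 60.0

# e.g. a hypothetical 5.6 mW/cm^2 lamp for 60 min delivers roughly the
# 20 J/cm^2 maximum reactivation fluence examined in the study
print(round(fluence_j_per_cm2(5.6, 60), 2))  # 20.16
```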

Keywords: photoreactivation, reactivation fluence, river water, temperature, ultraviolet disinfection, wastewater effluent

Procedia PDF Downloads 301
1682 Enhanced Boiling Heat Transfer Using Wettability Patterned Surfaces

Authors: Dong Il Shim, Geehong Choi, Donghwi Lee, Namkyu Lee, Hyung Hee Cho

Abstract:

Effective cooling technology is required to secure thermal stability in systems that generate extreme heat, such as integrated electronic devices and power generation systems. Pool boiling heat transfer is one of the most powerful cooling mechanisms exploiting phase change phenomena. Critical heat flux (CHF) and the heat transfer coefficient (HTC) are the main factors used to evaluate boiling heat transfer performance. CHF is the limit of boiling heat transfer before film boiling, which causes thermal failure. Surface wettability is an important surface characteristic in boiling heat transfer. A hydrophilic surface achieves a higher CHF through the effective supply of working fluid to local hot spots. A hydrophobic surface promotes the onset of nucleate boiling (ONB), enhancing the HTC. In this study, superbiphilic surfaces, which combine superhydrophilic and superhydrophobic regions, are applied in boiling experiments to maximize boiling performance. We conducted pool boiling experiments using DI water at saturation temperature and recorded bubble dynamics using a high-speed camera at 2000 fps. As a result, superbiphilic patterned surfaces promote ONB and enhance both CHF and HTC. This study demonstrates the enhanced boiling performance of superbiphilic surfaces through effective nucleation and separation of the liquid/vapor pathways. We expect that further enhancement of heat transfer could be achieved in future work using optimized patterned surfaces.

Keywords: boiling heat transfer, wettability, critical heat flux, heat transfer coefficient

Procedia PDF Downloads 328
1681 Nanosilver Containing Biodegradable Bionanocomposites for Antimicrobial Application: Design, Preparation and Study

Authors: Nino Kupatadze, Shorena Tskhadadze, Mzevinar Bedinashvili, David Tugushi, Ramaz Katsarava

Abstract:

Surgical device-associated infection and biofilm formation are among the major problems in biomedicine today. The declining protective ability of conventional antimicrobial drugs creates challenges for current antibiotic therapy, the most serious of which is antibiotic resistance. Our strategy for overcoming biofilm formation is to coat devices with a polymeric film containing nanosilver (AgNPs) as a bactericidal agent. Such bionanocomposites are also promising as wound dressing materials. For this purpose, we have developed a new generation of AgNP-containing polymeric composites in which amino acid based biodegradable poly(ester amide)s (PEAs) serve as both matrices and AgNP stabilizers. The AgNPs were formed by photochemical (daylight) reduction of AgNO3 in ethanol solution. The formation of AgNPs was monitored by the brownish-red coloring of the solution and the appearance of an absorption maximum at 420-430 nm in the UV spectrum. Comparative studies of PEAs and polyvinylpyrrolidone (PVP) as particle stabilizers were carried out. It was found that PVP is the better stabilizer in terms of particle yield and stability. Therefore, in subsequent experiments, blends of PEAs and PVP were used as stabilizers for fabricating AgNPs. As expected, PVP increased the stabilizing effect, which was apparent in the UV spectra of the samples after 7 h of daylight irradiation: for pure PVP, λmax = 430 nm, D = 2.03; for pure PEA, λmax = 420 nm, D = 0.65; and for the blend of PVP and PEA, λmax = 435 nm, D = 1.88. Further study of the obtained nanobiocomposites is now in progress.

Keywords: biodegradation, bionanocompositions, polymer, nanosilver

Procedia PDF Downloads 340
1680 Generation of Electro-Encephalography Readiness Potentials by Intention

Authors: Seokbeen Lim, Gilwon Yoon

Abstract:

The readiness potential is a brain activity related to intention; it arises even before the intention becomes conscious. This study was carried out to better understand the generation and mechanism of the readiness potential. An experiment with two subjects was conducted in two ways following the oddball task protocol. First, auditory stimuli were randomly presented to the subjects. A subject was to press the keyboard with the right index finger only upon hearing the target stimulus, not the standard stimulus. Second, the auditory stimuli were again randomly presented and the subjects pressed the keyboard in the same manner, but this time while grasping with the left hand. The readiness potential appeared in both experiments. In the first oddball experiment, the readiness potential was detected only when the target stimulus was presented. In the second oddball experiment, with the left hand grasping, the readiness potential was detected at the presentation of both standard and target stimuli, although the readiness potentials detected with the target stimuli were larger than those with the standard stimuli. We found the interesting phenomenon that the readiness potential could be detected even with the standard stimulus. This indicates that motor-related readiness potentials can be generated by the intention to move alone. These results present a new perspective for psychology and brain engineering, since subconscious brain action may precede conscious recognition of the intention.
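The readiness potential, like other event-related potentials, is typically extracted by averaging many baseline-corrected EEG epochs time-locked to the stimulus or movement onset, so that activity not phase-locked to the event averages out. A minimal sketch of that averaging step, using toy numbers rather than recordings from this study:

```python
# Sketch of event-related potential (ERP) extraction: each epoch is
# baseline-corrected by its pre-stimulus mean, then epochs are averaged
# sample-by-sample. The epochs below are toy data, not study recordings.

def average_erp(epochs, baseline_samples):
    """Baseline-correct each epoch by its pre-stimulus mean, then average."""
    corrected = []
    for epoch in epochs:
        baseline = sum(epoch[:baseline_samples]) / baseline_samples
        corrected.append([v - baseline for v in epoch])
    n = len(corrected)
    return [sum(vals) / n for vals in zip(*corrected)]

# Three toy epochs; the first two samples form the pre-stimulus baseline
epochs = [
    [1.0, 1.0, 3.0, 5.0],
    [2.0, 2.0, 4.0, 6.0],
    [0.0, 0.0, 2.0, 4.0],
]
print(average_erp(epochs, baseline_samples=2))  # [0.0, 0.0, 2.0, 4.0]
```

In practice hundreds of trials per condition are averaged, which is why the abstract's comparison of target versus standard stimuli rests on the relative amplitude of these averages.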

Keywords: readiness potential, auditory stimuli, event-related potential, electroencephalography, oddball task

Procedia PDF Downloads 201
1679 An Improved Discrete Version of Teaching–Learning-Based ‎Optimization for Supply Chain Network Design

Authors: Ehsan Yadegari

Abstract:

While there are several metaheuristic and exact approaches to solving the Supply Chain Network Design (SCND) problem, there remains an unfilled gap in applying the Teaching-Learning-Based Optimization (TLBO) algorithm, which has demonstrated desirable results on complicated combinatorial optimization problems. The present study introduces a Discrete Self-Study TLBO (DSS-TLBO) with a priority-based solution representation that can solve a supply chain network configuration model to lower the total expense of establishing facilities and moving the flow of materials. The network features four layers, namely suppliers, plants, distribution centers (DCs), and customer zones. It is designed to meet customer demand by transporting material between the layers of the network and locating facilities in the most economically advantageous positions. To improve solution quality and increase the speed of TLBO, a distinct operator was introduced that provides self-adaptation (self-study) in the algorithm based on four types of local search. In addition, because TLBO uses a continuous solution representation while the priority-based representation is discrete, a few modifications were added to the algorithm to remove infeasible solutions. As shown by the experimental results, the superiority of DSS-TLBO over pure TLBO, the Genetic Algorithm (GA), and the Firefly Algorithm (FA) was established.
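For readers unfamiliar with TLBO, its two core phases (teacher and learner) can be sketched on a toy continuous minimisation problem as below. This is a generic, textbook-style TLBO on the sphere function, not the discrete priority-based DSS-TLBO of the abstract; the self-study local-search operator and feasibility repair it adds are omitted here.

```python
import random

# Generic TLBO sketch: in the teacher phase every learner moves toward the
# best solution relative to the population mean; in the learner phase each
# learner moves toward (or away from) a random classmate. Moves are accepted
# greedily. Toy objective: the sphere function, optimum at the origin.

def sphere(x):
    return sum(v * v for v in x)

def tlbo(obj, dim=5, pop=20, iters=200, lo=-10.0, hi=10.0, seed=1):
    rng = random.Random(seed)
    X = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(pop)]
    for _ in range(iters):
        # Teacher phase
        mean = [sum(x[d] for x in X) / pop for d in range(dim)]
        teacher = min(X, key=obj)
        for i in range(pop):
            tf = rng.choice((1, 2))  # teaching factor
            cand = [X[i][d] + rng.random() * (teacher[d] - tf * mean[d])
                    for d in range(dim)]
            if obj(cand) < obj(X[i]):
                X[i] = cand
        # Learner phase
        for i in range(pop):
            j = rng.randrange(pop)
            if j == i:
                continue
            sign = 1.0 if obj(X[i]) < obj(X[j]) else -1.0
            cand = [X[i][d] + sign * rng.random() * (X[i][d] - X[j][d])
                    for d in range(dim)]
            if obj(cand) < obj(X[i]):
                X[i] = cand
    return min(X, key=obj)

best = tlbo(sphere)
print(f"best objective: {sphere(best):.3e}")  # converges near the origin
```

The discrete variant of the abstract replaces the continuous position vectors with priority values that are decoded into a network configuration, which is why infeasible decodings must be repaired or removed.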

Keywords: supply chain network design, teaching–learning-based optimization, improved metaheuristics, discrete solution representation

Procedia PDF Downloads 49
1678 Euthanasia with Reference to Defective Newborns: An Analysis

Authors: Nibedita Priyadarsini

Abstract:

Ethics has a wide range of applications and deals mainly with human life and human behavior. All ethical decisions are ultimately concerned with life and death, and both life and death must be considered dignified. Among the many topics of medical ethics concerning life and death, euthanasia is one, and debates over it have continued for a long time. The question of putting an end to someone's life has aroused controversy in the legal sphere as well as the moral sphere; to permit or not to permit has remained an enigma the world over. Modern medicine is transcending limits that cannot be set aside. The morality of allowing people to die without treatment has become more important as methods of treatment have become more sophisticated. Allowing someone to die rests on an essential recognition that there is some point in any terminal illness when further curative treatment has no purpose, and that the patient in such a situation should be allowed to die a natural death in comfort, peace, and dignity, without interference from medical science and technology. But taking a human life is, in a general sense, illogical in itself. It can be said that when we kill someone, we cause the death, whereas if we merely let someone die, we are not responsible for the death; this point is often made in connection with euthanasia cases and is often debated. Euthanasia in the pediatric age group involves important issues that differ from those in adults. The main distinction is that infants, newborns, and young children are not able to decide about their future as adults do. In certain cases, where a child is born with serious deformities and no hope of recovery, doctors may decide not to perform surgery to remove a blockage and to let the baby die.
Our aim in this paper is to examine whether it is ethically justified to withhold treatment from, or to apply euthanasia to, a defective infant: what is to be done with severely defective infants if it is known from the earliest time that they will not survive at all? The paper deals mostly with the ethics of deciding the relevant ethical concerns in the practice of euthanasia with defective newborns. Some cases relating to disabled infants and newborn babies are taken up in order to show what can be done in the critical conditions that patients and family members undergo, and under which circumstances those conditions could be eased, if not all then at least some. The final choice must rest with the benefit of the patient.

Keywords: ethics, medical ethics, euthanasia, defective newborns

Procedia PDF Downloads 201
1677 High Motivational Salient Face Distractors Slowed Target Detection: Evidence from Behavioral Studies

Authors: Rashmi Gupta

Abstract:

Rewarding stimuli capture attention involuntarily as a result of an association process that develops quickly during value learning, referred to as reward- or value-driven attentional capture. It is essential to compare reward processing with punishment processing to get a full picture of value-based modulation of visual attention processing. Hence, the present study manipulated both valence/value (reward as well as punishment) and motivational salience (probability of an outcome: high vs. low) together. A series of experiments was conducted, each with two phases. In phase 1, participants learned to associate specific face stimuli with a high or low probability of winning or losing points. In the second phase, these conditioned stimuli served as distractors or primes in a speeded letter search task. Faces with high versus low outcome probability, regardless of valence, slowed the search for targets (specifically targets in the left visual field), suggesting that the costs to performance on non-emotional cognitive tasks were driven only by the motivational salience (high vs. low) associated with the stimuli rather than their valence (gain vs. loss). It also suggests that the processing of motivationally salient stimuli is right-hemisphere biased. Together, the results of these studies strengthen the notion that our visual attention system is more sensitive to motivational salience than to valence, which is termed here motivation-driven attentional capture.

Keywords: attention, distractors, motivational salience, valence

Procedia PDF Downloads 217
1676 To Upgrade the Quality of Services of a Fashion Designer by Minimizing the Thought Communication Gap, Using Projective Personality Tests

Authors: A. Hira Masood, B. Umer Hameed, C. Ezza Nasir

Abstract:

Contemporary studies support a strong correlation between psychology and design. This study elaborates how different projective personality tests can help a fashion designer judge the needs of clients, so as to offer products that satisfy each client's requests concerning customised clothing. The study will also help the designer identify weaknesses in a client's personality and direct grooming efforts to the required areas, while controlling and directing the organization with regard to quality maintenance. Psychological tests support the choice of design strategies by showing how the right clothing can make a client feel more capable, with enhanced self-esteem and confidence. Several projective personality tests are suggested for evaluating personality traits. The Rorschach Inkblot Test is a projective psychological test comprising 10 inkblots and is synonymous with clinical psychology. The Lüscher Color Diagnostic measures a person's psychophysical state and his or her ability to withstand stress, to perform, and to communicate. The HTP (House-Tree-Person) test is a projective personality test measuring self-perception and attitudes. The TAT is intended to evaluate a person's patterns of thought, attitudes, observational capacity, and emotional responses to ambiguous test materials. No doubt designers already crucially redesign individuals through their attire, but to expose the behavioral mechanisms of the customer, designers should be able to recognize the hidden complexity behind the client by using the above-mentioned methods.
The study finds that design and psychology need to become substantially connected in order to create a new regime of norms for grooming a personality under the attention and services of a fashion designer in terms of clothing. This interactive activity ultimately equips the design team to help customers find a suitable way to satisfy their needs and wishes, to offer clients a reliable relationship and quality management services, and to become more desirable.

Keywords: projective personality tests, customized clothing, Rorschach Inkblot Test, TAT, HTP, Lüscher color diagnostics, quality management services

Procedia PDF Downloads 551
1675 Traffic Prediction with Raw Data Utilization and Context Building

Authors: Zhou Yang, Heli Sun, Jianbin Huang, Jizhong Zhao, Shaojie Qiao

Abstract:

Traffic prediction is essential in a multitude of ways in modern urban life. Earlier work in this domain has chiefly pursued two major focuses: (1) the accurate forecasting of future values in multiple time series and (2) knowledge extraction from spatial-temporal correlations. However, two key considerations for traffic prediction are often missed: the completeness of the raw data and the full context of the prediction timestamp. Concentrating on these two drawbacks of earlier work, we devise an approach that addresses them in a two-phase framework. First, we utilize the raw trajectories to a greater extent by building a VLA table and applying data compression. We obtain intra-trajectory features with graph-based encoding, and inter-trajectory features with a grid-based model and a back-projection technique that restores the surrounding high-resolution spatial-temporal environment. To the best of our knowledge, we are the first to study direct feature extraction from raw trajectories for traffic prediction and to attempt the use of raw data with the least degree of reduction. In the prediction phase, we provide a broader context for the prediction timestamp by taking into account the information around it in the training dataset. Extensive experiments on several well-known datasets have verified the effectiveness of our solution, which combines the strengths of raw trajectory data and prediction context. In terms of performance, our approach surpasses several state-of-the-art methods for traffic prediction.

Keywords: traffic prediction, raw data utilization, context building, data reduction

Procedia PDF Downloads 125
1674 Therapeutic Drug Monitoring by Dried Blood Spot and LC-MS/MS: Novel Application to Carbamazepine and Its Metabolite in Paediatric Population

Authors: Giancarlo La Marca, Engy Shokry, Fabio Villanelli

Abstract:

Epilepsy is one of the most common neurological disorders, with an estimated 50 million people affected worldwide. Children under the age of 15 years represent twenty-five percent of the epilepsy population. For antiepileptic drugs (AEDs), there is a poor correlation between plasma concentration and dose, especially in children; this has been attributed to greater pharmacokinetic variability than in adults. Hence, therapeutic drug monitoring (TDM) is recommended to control toxicity while maintaining drug exposure. Carbamazepine (CBZ) is a first-line AED and the drug of first choice in trigeminal neuralgia. CBZ is metabolised in the liver into carbamazepine-10,11-epoxide (CBZE), its major metabolite, which is equipotent. This creates the need for an assay able to monitor the levels of both CBZ and CBZE. The aim of the present study was to develop and validate an LC-MS/MS method for the simultaneous quantification of CBZ and CBZE in dried blood spots (DBS). The DBS technique overcomes many of the logistical problems, ethical issues, and technical challenges faced by classical plasma sampling. LC-MS/MS has been regarded as superior to immunoassays and HPLC/UV methods owing to its better specificity and sensitivity and its lack of interference or matrix effects. Our method combines the advantages of the DBS technique and LC-MS/MS in clinical practice. Extraction was performed using methanol-water-formic acid (80:20:0.1, v/v/v). Chromatographic elution was achieved using a linear gradient with a mobile phase consisting of acetonitrile-water-0.1% formic acid at a flow rate of 0.50 mL/min. The method was linear over the ranges 1-40 mg/L and 0.25-20 mg/L for CBZ and CBZE, respectively. The limits of quantification were 1.00 mg/L and 0.25 mg/L for CBZ and CBZE, respectively. Intra-day and inter-day assay precisions were less than 6.5% and 11.8%, respectively.
The DBS technique was also evaluated, including the effect of extraction solvent, spot homogeneity, and stability in DBS. Results from a comparison with the plasma assay are also presented. The novelty of the present work lies in being the first to quantify CBZ and its metabolite from a single 3.2 mm DBS disc from one finger-prick sample (3.3-3.4 µL blood) by LC-MS/MS in a 10-min chromatographic run.
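In quantitative LC-MS/MS work of this kind, an unknown concentration is read off a linear calibration curve fitted over the validated range. The sketch below uses made-up calibrator data (analyte peak-area ratios versus concentration) spanning the reported 1-40 mg/L CBZ range; the numbers are illustrative, not data from the study.

```python
# Sketch of linear calibration and back-calculation of an unknown
# concentration. The calibrator ratios are hypothetical toy values.

def fit_line(xs, ys):
    """Ordinary least-squares slope and intercept."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope = sxy / sxx
    return slope, my - slope * mx

# Hypothetical CBZ calibrators spanning the reported 1-40 mg/L range
conc = [1.0, 5.0, 10.0, 20.0, 40.0]          # mg/L
ratio = [0.05, 0.25, 0.50, 1.00, 2.00]       # perfectly linear toy responses

slope, intercept = fit_line(conc, ratio)
unknown_ratio = 0.75
print(round((unknown_ratio - intercept) / slope, 2))  # 15.0 (mg/L)
```

A real validation would of course also weight the regression and check back-calculated accuracy of each calibrator, as the precision figures in the abstract imply.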

Keywords: carbamazepine, carbamazepine-10,11-epoxide, dried blood spots, LC-MS/MS, therapeutic drug monitoring

Procedia PDF Downloads 411
1673 Comparison of Methods for the Detection of Biofilm Formation in Yeast and Lactic Acid Bacteria Species Isolated from Dairy Products

Authors: Goksen Arik, Mihriban Korukluoglu

Abstract:

Lactic acid bacteria (LAB) and some yeast species are common microorganisms found in dairy products, and most of them are responsible for food fermentation. Such cultures are isolated and used as starter cultures in the food industry because they standardise the final product during food processing. The choice of starter culture is the most important step in the production of fermented food. Isolated LAB and yeast cultures that can form a biofilm layer may be preferred as starters in the food industry, since biofilm formation could extend the usable life of microorganisms as starters. On the other hand, biofilm formation is an undesirable property in pathogens, since the biofilm structure makes a microorganism more resistant to stress conditions such as the presence of antibiotics. It is thought that this resistance mechanism could be turned into an advantage by promoting effective microorganisms that are used in the food industry as starter cultures and that also have the potential to stimulate the gastrointestinal system. Development of a biofilm layer is observed in some LAB and yeast strains. This resistance could make LAB and yeast strains dominant in the human gastrointestinal microflora, so that competition against pathogenic microorganisms is achieved more easily. On this basis, 10 LAB and 10 yeast strains were isolated from various dairy products, such as cheese, yoghurt, kefir, and cream. Samples were obtained from farmers' markets and bazaars in Bursa, Turkey. As part of this research, all isolated strains were identified, and their biofilm-forming ability was detected with two different methods and compared. The first goal of this research was to determine whether the isolates have the potential for biofilm production; the second was to compare the validity of the two methods, known as the 'tube method' and the '96-well plate-based method'.
This study may offer insight into biofilm formation and its beneficial properties in LAB and yeast cultures used as starters in the food industry.

Keywords: biofilm, dairy products, lactic acid bacteria, yeast

Procedia PDF Downloads 257
1672 Modern Scotland Yard: Improving Surveillance Policies Using Adversarial Agent-Based Modelling and Reinforcement Learning

Authors: Olaf Visker, Arnout De Vries, Lambert Schomaker

Abstract:

Predictive policing refers to the use of analytical techniques to identify potential criminal activity, and it has been widely implemented by various police departments. Being a relatively new area of research, it has, to the authors' knowledge, no tried-and-true methods, and existing approaches still exhibit a variety of potential problems. One of those problems is closely related to a lack of understanding of how acting on these predictions influences crime itself. The goal of law enforcement is ultimately crime reduction; as such, a policy needs to be established that best facilitates this goal. This research aims to find such a policy by using adversarial agent-based modelling in combination with modern reinforcement learning techniques. We present baseline models for both law enforcement and criminal agents and compare their performance to that of their respective reinforcement-learning models. The experiments show that our smart law enforcement model is capable of reducing crime by making more deliberate choices regarding the locations of potential criminal activity. Furthermore, the smart criminal model exhibits behavior consistent with popular crime theories and outperforms the baseline model in terms of crimes committed and time to capture. It does, however, still suffer from the difficulties of capturing long-term rewards and of learning how to handle multiple opposing goals.
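As an illustration of the reinforcement-learning ingredient, the toy sketch below uses tabular Q-learning to teach a single patrol agent on a one-dimensional strip of locations to move toward a fixed crime hotspot. The adversarial, multi-agent setting of the abstract, in which criminal agents learn as well, is substantially richer than this single-agent example.

```python
import random

# Toy patrol-allocation sketch: one agent on a strip of N_CELLS locations
# earns reward 1 for occupying the hotspot cell, and learns a greedy
# left/right policy by tabular Q-learning. All numbers are illustrative.

N_CELLS, HOTSPOT = 5, 4
ACTIONS = (-1, +1)  # move left / move right

def q_learning(episodes=500, alpha=0.5, gamma=0.9, eps=0.1, seed=0):
    rng = random.Random(seed)
    Q = {(s, a): 0.0 for s in range(N_CELLS) for a in ACTIONS}
    for _ in range(episodes):
        s = rng.randrange(N_CELLS)
        for _ in range(20):
            # epsilon-greedy action selection
            if rng.random() < eps:
                a = rng.choice(ACTIONS)
            else:
                a = max(ACTIONS, key=lambda x: Q[(s, x)])
            s2 = min(max(s + a, 0), N_CELLS - 1)
            r = 1.0 if s2 == HOTSPOT else 0.0  # reward: patrolling the hotspot
            Q[(s, a)] += alpha * (r + gamma * max(Q[(s2, b)] for b in ACTIONS)
                                  - Q[(s, a)])
            s = s2
    return Q

Q = q_learning()
# From every non-hotspot cell the greedy action should point at the hotspot
policy = [max(ACTIONS, key=lambda a: Q[(s, a)]) for s in range(N_CELLS)]
print(policy[:4])  # [1, 1, 1, 1]
```

The adversarial version of the abstract would pit a second learning agent (the criminal) against this one, with each agent's environment dynamics shaped by the other's current policy.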

Keywords: adversarial, agent based modelling, predictive policing, reinforcement learning

Procedia PDF Downloads 145
1671 Effect of Salicylic Acid and Nitrogen Fertilizer on Wheat Growth and Yield

Authors: Omar Ibrahim, Aly A. Gaafar, K. A. Ratib

Abstract:

Two field experiments in micro plots were carried out during the winter seasons of 2012/2013 and 2013/2014 at the Soil Salinity Laboratory, Alexandria, Egypt, to study the effect of three levels of salicylic acid (SA) as a growth regulator (0, 50, 100 ppm) and three rates of nitrogen fertilizer (75, 100, 125 kg N/feddan) on the growth and yield of a spring wheat (Giza 168). The experimental design was a split plot with the main plots in a randomized complete block design (RCBD) and four replicates. The results indicated that increasing nitrogen fertilizer rates had no significant effect on plant height (cm) or grain weight/spike, but had a significant effect on all the other studied characters. On the other hand, increasing salicylic acid rates had no significant effect on any of the studied characters except chlorophyll a, chlorophyll b, number of grains/spike, and grain yield (g/plot). The highest grain yields were obtained with 125 kg N/feddan of nitrogen fertilizer and 100 ppm of salicylic acid. In conclusion, the data indicated that a high grain yield could be obtained by adding 100 kg N/feddan of nitrogen fertilizer and spraying 50 ppm of salicylic acid, with no significant difference from the highest rates. Finally, the interaction had no significant effect on any of the studied characters.

Keywords: growth regulator, nitrogen fertilizer, spring wheat, salicylic acid

Procedia PDF Downloads 113