Search results for: continuous mode changing
1670 Prevalence of Multidrug-resistant Escherichia coli Isolated from Ready to Eat: Crispy Fried Chicken in Jember, Indonesia
Authors: Enny Suswati, Supangat Supangat
Abstract:
Background. Ready-to-eat food products are becoming increasingly popular because of consumers' increasingly busy, competitive, and changing lifestyles. Examples of ready-to-eat foods include crispy fried chicken. Escherichia coli is one of the most important causes of food-borne diseases and the most frequent antibiotic-resistant pathogen globally. This study assessed the prevalence and antibiotic resistance profile of E. coli from ready-to-eat crispy fried chicken in Jember city, Indonesia. Methodology. This cross-sectional study was conducted from November 2020 to April 2021 by collecting 81 crispy fried chicken samples from 27 food stalls in the campus area using a simple random sampling method. Isolation and identification of E. coli were performed by the conventional culture method. An antibiotic susceptibility test was conducted using the Kirby-Bauer disk diffusion method on Mueller–Hinton agar. Result. Out of 81 crispy fried chicken samples, 77 (95.06%) were positive for E. coli. High E. coli drug resistance was observed for ampicillin and amoxicillin (100%), followed by cefixime (98.72%), erythromycin (97.59%), sulfamethoxazole (93.59%), azithromycin (83.33%), cefotaxime (78.28%), chloramphenicol (75.64%), and cefixime (74.36%). On the other hand, the highest susceptibility was observed for ciprofloxacin (64.10%). The multiple antibiotic resistance indexes of the E. coli isolates varied from 0.4 to 1. The predominant antimicrobial resistance profiles of E. coli were CfmCroAmlAmpAzmCtxSxtCE (n=17), CfmCroAmlCipAmpAzmCtxSxtCE (n=16), and CfmAmlAmpAzmCtxSxtCE (n=5), respectively. Multidrug resistance was also found in 76/77 (98.70%) of the isolates. Conclusion. The resistance pattern CfmCroAmlAmpAzmCtxSxtCE was the most common among the E. coli isolates, with 17 showing it. The multiple antibiotic resistance index (MAR index) ranged from 0.4 to 1. Hygienic measures should be rigorously implemented, and monitoring resistance of E. coli is required to reduce the risks related to the emergence of multi-resistant bacteria.
Keywords: antibacterial drug, ready to eat, crispy fried chicken, escherichia coli
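The headline figures above lend themselves to a short worked example. The sketch below is illustrative only: the resistance profile is hypothetical, not taken from the study, and only the prevalence counts (77 of 81) and the MAR definition (antibiotics resisted divided by antibiotics tested) echo the abstract.

```python
# Illustrative sketch (not the authors' code): isolate prevalence and the
# multiple antibiotic resistance (MAR) index for a single isolate.

def prevalence(positive_samples: int, total_samples: int) -> float:
    """Percentage of samples positive for E. coli."""
    return 100.0 * positive_samples / total_samples

def mar_index(resistant: int, tested: int) -> float:
    """Number of antibiotics an isolate resists divided by the number tested."""
    return resistant / tested

print(f"Prevalence: {prevalence(77, 81):.2f}%")  # 95.06%, as reported above

# Hypothetical isolate resistant to 8 of the 10 antibiotics tested
profile = {"AMP": True, "AML": True, "CFM": True, "E": True, "SXT": True,
           "AZM": True, "CTX": True, "C": True, "CIP": False, "CRO": False}
print(f"MAR index: {mar_index(sum(profile.values()), len(profile)):.1f}")  # 0.8
```

Any isolate resistant to at least 40% of the tested antibiotics falls within the reported MAR range of 0.4 to 1.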
Procedia PDF Downloads 111

1669 Brown-Spot Needle Blight: An Emerging Threat Causing Loblolly Pine Needle Defoliation in Alabama, USA
Authors: Debit Datta, Jeffrey J. Coleman, Scott A. Enebak, Lori G. Eckhardt
Abstract:
Loblolly pine (Pinus taeda) is a leading productive timber species in the southeastern USA. Over the past three years, an emerging threat has manifested as successive needle defoliation followed by stunted growth and tree mortality in loblolly pine plantations. Considering its economic significance, it has now become a rising concern among landowners, forest managers, and forest health state cooperators. However, the symptoms of the disease were somewhat confused with root disease(s) and recurrently attributed to invasive Phytophthora species due to the similarity of disease nature and devastation. Therefore, the study investigated the potential causal agent of this disease and characterized the fungi associated with loblolly pine needle defoliation in the southeastern USA. In addition, 70 trees were selected at seven long-term monitoring plots at Chatom, Alabama, to monitor and record the annual disease incidence and severity. Based on colony morphology and ITS-rDNA sequence data, a total of 28 species of fungi representing 17 families have been recovered from diseased loblolly pine needles. The native brown-spot pathogen, Lecanosticta acicola, was the species most frequently recovered from unhealthy loblolly pine needles, in combination with some other common needle cast and rust pathogen(s). Identification was confirmed using morphological similarity and amplification of the translation elongation factor 1-alpha gene region of interest. Tagged trees were consistently found chlorotic and defoliated from 2019 to 2020. The current emergence of the brown-spot pathogen causing loblolly pine mortality necessitates the investigation of the role of changing climatic conditions, which might be associated with increased pathogen pressure on loblolly pines in the southeastern USA.
Keywords: brown-spot needle blight, loblolly pine, needle defoliation, plantation forestry
Procedia PDF Downloads 152

1668 Non-Cognitive Skills Associated with Learning in a Serious Gaming Environment: A Pretest-Posttest Experimental Design
Authors: Tanja Kreitenweis
Abstract:
Lifelong learning is increasingly seen as essential for coping with the rapidly changing work environment. To this end, serious games can provide convenient and straightforward access to complex knowledge for all age groups. However, learning achievements depend largely on a learner’s non-cognitive skill disposition (e.g., motivation, self-belief, playfulness, and openness). With the aim of combining the fields of serious games and non-cognitive skills, this research focuses in particular on the use of a business simulation that conveys change management insights. Business simulations are a subset of serious games and are perceived as a non-traditional learning method. The objectives of this work are versatile: (1) developing a scale that measures learners’ knowledge and skill levels before and after a business simulation is played, (2) investigating the influence of non-cognitive skills on learning in this business simulation environment, and (3) exploring the moderating role of team preference in this type of learning setting. First, expert interviews were conducted to develop an appropriate measure for learners’ skills and knowledge assessment. A pretest-posttest experimental design with German management students was implemented to approach the remaining objectives. Using the newly developed, reliable measure, it was found that students’ skills and knowledge state was higher after the simulation had been played than before. A hierarchical regression analysis revealed two positive predictors for this outcome: motivation and self-esteem. Unexpectedly, playfulness had a negative impact. Team preference strengthened the link between grit and playfulness, respectively, and learners’ skills and knowledge state after completing the business simulation. Overall, the data underlined the potential of business simulations to improve learners’ skills and knowledge state. In addition, motivational factors were found to be predictors of benefitting most from the applied business simulation. Recommendations are provided for how pedagogues can use these findings.
Keywords: business simulations, change management, (experiential) learning, non-cognitive skills, serious games
Procedia PDF Downloads 108

1667 Potential Application of Thyme (Thymus vulgaris L.) Essential Oil as Antibacterial Drug in Aromatherapy
Authors: Ferhat Mohamed Amine, Boukhatem Mohamed Nadjib, Chemat Farid
Abstract:
The Lamiaceae family is widely spread in Algeria. Due to the application of Thymus species growing wild in Algeria as a culinary herb and in folk medicine, the purpose of the present work was to evaluate the antimicrobial activities of their essential oils and relate them to their chemical composition, for further application in the food and pharmaceutical industries as natural valuable products. The Thymus vulgaris L. essential oil (TVEO) was extracted by steam distillation. The chemical composition of the TVEO was determined by gas chromatography. A total of thirteen compounds were identified. Carvacrol (83.8%) was the major component, followed by cymene (8.15%) and terpinene (4.96%). The antibacterial action of the TVEO against 23 clinically isolated bacterial strains was determined by using agar disc diffusion and vapour diffusion methods at different doses. By the disc diffusion method, TVEO showed more potent antimicrobial activity against gram-positive bacteria than against gram-negative strains and antibiotic discs. The diameter of the inhibition zone (DIZ) varied from 25 to 60 mm for S. aureus, B. subtilis, and E. coli. However, the results obtained by the agar diffusion and vapour diffusion methods were different. A significantly higher antibacterial effect was observed in the vapour phase at lower doses. S. aureus and B. subtilis were the most susceptible strains to the oil vapour. Therefore, smaller doses of EO in the vapour phase can be inhibitory to pathogenic bacteria. There is growing evidence that TVEO in the vapour phase is an effective antiseptic system and appears worthy of consideration for practical use in the treatment of human infections or as an air decontaminant in hospitals. TVEO has considerable antibacterial activity deserving further investigation for clinical applications. Also, whilst the mode of action remains mainly undetermined, this experimental approach will need to continue.
Keywords: antimicrobial drugs, carvacrol, disc diffusion, Thymus vulgaris, vapour diffusion
Procedia PDF Downloads 373

1666 Micromechanical Modelling of Ductile Damage with a Cohesive-Volumetric Approach
Authors: Noe Brice Nkoumbou Kaptchouang, Pierre-Guy Vincent, Yann Monerie
Abstract:
The present work addresses the modelling and simulation of crack initiation and propagation in ductile materials that fail by void nucleation, growth, and coalescence. One of the current research frameworks on crack propagation is the use of a cohesive-volumetric approach, where crack growth is modelled as a decohesion of two surfaces in a continuum material. In this framework, the material behavior is characterized by two constitutive relations: the volumetric constitutive law relating stress and strain, and a traction-separation law across a two-dimensional surface embedded in the three-dimensional continuum. Several cohesive models have been proposed for the simulation of crack growth in brittle materials. On the other hand, the application of cohesive models to modelling crack growth in ductile materials is still a relatively open field. One idea developed in the literature is to identify the traction-separation law for ductile material based on the behavior of a continuously-deforming unit cell failing by void growth and coalescence. Following this method, the present study proposes a semi-analytical cohesive model for ductile material based on a micromechanical approach. The strain localization band prior to ductile failure is modelled as a cohesive band, the Gurson-Tvergaard-Needleman (GTN) plasticity model is used to model the behavior of the cohesive band, and a corresponding traction-separation law is derived. The numerical implementation of the model is realized using the non-smooth contact dynamics method (NSCD), where cohesive models are introduced as mixed boundary conditions between volumetric finite elements. The present approach is applied to the simulation of crack growth in nuclear ferritic steel. The model provides an alternative way to simulate crack propagation, combining the numerical efficiency of the cohesive model with a traction-separation law directly derived from a porous continuum model.
Keywords: ductile failure, cohesive model, GTN model, numerical simulation
Procedia PDF Downloads 149

1665 Building Atmospheric Moisture Diagnostics: Environmental Monitoring and Data Collection
Authors: Paula Lopez-Arce, Hector Altamirano, Dimitrios Rovas, James Berry, Bryan Hindle, Steven Hodgson
Abstract:
Efficient mould remediation and accurate diagnostics of the moisture conditions leading to condensation and mould growth in dwellings are largely untapped. A number of factors are contributing to the rising trend of excessive moisture in homes, mainly linked with modern living, increased levels of occupation, and rising fuel costs, as well as making homes more energy efficient. Environmental monitoring by means of data collection through logger sensors and survey forms has been performed in a range of buildings from different UK regions. Air and surface temperature and relative humidity values of residential areas affected by condensation and/or mould issues were recorded. Additional measurements were taken through different trials, changing the type, location, and position of the loggers. In some instances, IR thermal images and ventilation rates have also been acquired. Results have been interpreted together with key environmental parameters by processing and connecting data from loggers and survey questionnaires, both in buildings with and without moisture issues. Monitoring exercises carried out during winter and spring show the importance of developing and following accurate protocols for guidance to obtain consistent, repeatable, and comparable results and to improve the performance of environmental monitoring. A model and a protocol are being developed to build a diagnostic tool with the goal of performing a simple but precise residential atmospheric moisture diagnosis to distinguish the cause entailing condensation and mould generation, i.e., a ventilation, insulation, or heating system issue. This research shows the relevance of monitoring and processing environmental data to assign moisture risk levels and determine the origin of condensation or mould when dealing with a building atmospheric moisture excess.
Keywords: environmental monitoring, atmospheric moisture, protocols, mould
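One common first-pass check on such logger data, offered here only as an illustrative sketch and not as the project's diagnostic model, is to estimate the dew point from air temperature and relative humidity with the Magnus approximation and to flag condensation risk when a measured surface temperature approaches it. The Magnus coefficients, the 1 °C safety margin, and the example readings below are assumptions.

```python
# Sketch: dew-point estimate (Magnus approximation) and a simple condensation flag.
import math

A, B = 17.27, 237.7  # Magnus coefficients (deg C)

def dew_point(air_temp_c: float, rh_percent: float) -> float:
    gamma = math.log(rh_percent / 100.0) + A * air_temp_c / (B + air_temp_c)
    return B * gamma / (A - gamma)

def condensation_risk(surface_temp_c: float, air_temp_c: float, rh_percent: float,
                      margin_c: float = 1.0) -> bool:
    """True if the surface is within `margin_c` of the dew point."""
    return surface_temp_c <= dew_point(air_temp_c, rh_percent) + margin_c

# Example logger reading: 20 deg C air at 75% RH, surface at 16 deg C
print(round(dew_point(20.0, 75.0), 1))      # ~15.4 deg C
print(condensation_risk(16.0, 20.0, 75.0))  # True -> flag this surface for the surveyor
```

Runs of such flags across monitored rooms could then feed the kind of moisture risk levels described above.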
Procedia PDF Downloads 139

1664 Impact of Locally Synthesized Carbon Nanotubes against Some Local Clinical Bacterial Isolates
Authors: Abdul Matin, Muazzama Akhtar, Shahid Nisar, Saddaf Mazzar, Umer Rashid
Abstract:
Antibiotic resistance is an increasing concern worldwide nowadays. Neisseria gonorrhea and Staphylococcus aureus are known to cause major human sexually transmitted and respiratory diseases, respectively. Nanotechnology is an emerging discipline, and its applications in various fields, especially in the medical sciences, are vast. In the present study, we synthesized multi-walled carbon nanotubes (MWNTs) using the acid oxidation method; the solubilized MWNTs had lengths predominantly >500 nm and diameters ranging from 40 to 50 nm. The locally synthesized MWNTs were used against gram-positive and gram-negative bacteria to determine their impact on bacterial growth. Clinical isolates of Neisseria gonorrhea (isolate: 4C-11) and Staphylococcus aureus (isolate: 38541) were obtained from a local hospital and normally cultured in LB broth at 37°C. Both clinical strains can be obtained on request from the University of Gujarat. A spectrophotometric assay was performed to determine the impact of MWNTs on bacterial growth in vitro. To determine the effect of MWNTs on the test organisms, various concentrations of MWNTs were used, and observations were recorded at various time intervals to understand the growth inhibition pattern. Our results demonstrated that MWNTs exhibited toxic effects on Staphylococcus aureus while showing very limited growth inhibition of Neisseria gonorrhea, which suggests the resistance potential of Neisseria against nanoparticles. Our results clearly demonstrate the gradual decrease in bacterial numbers with the passage of time when compared with the control. Maximum bacterial inhibition was observed at the maximum concentration (50 µg/ml). Our future work will include further characterization and the mode of action of our locally synthesized MWNTs. In conclusion, we investigated and reported for the first time the inhibitory potential of locally synthesized MWNTs on local clinical isolates of Staphylococcus aureus and Neisseria gonorrhea.
Keywords: antibacterial activity, multi walled carbon nanotubes, Neisseria gonorrhea, spectrophotometer assay, Staphylococcus aureus
Procedia PDF Downloads 314

1663 Optimal Tamping for Railway Tracks, Reducing Railway Maintenance Expenditures by the Use of Integer Programming
Authors: Rui Li, Min Wen, Kim Bang Salling
Abstract:
For modern railways, maintenance is critical for ensuring safety, train punctuality, and overall capacity utilization. The cost of railway maintenance in Europe is high, on average between 30,000 and 100,000 Euros per kilometer per year. In order to reduce such maintenance expenditures, this paper presents a mixed 0-1 linear mathematical model designed to optimize the predictive railway tamping activities for ballasted track over a planning horizon of three to four years. The objective function is to minimize the actual costs of the tamping machine. The research approach uses a simple dynamic model for modelling the condition-based tamping process and a solution method for finding the optimal condition-based tamping schedule. Seven technical and practical aspects are taken into account to schedule tamping: (1) track degradation of the standard deviation of the longitudinal level over time; (2) track geometrical alignment; (3) track quality thresholds based on the train speed limits; (4) the dependency of the track quality recovery on the track quality after the tamping operation; (5) tamping machine operation practices; (6) tamping budgets; and (7) differentiating the open track from the station sections. A Danish railway track between Odense and Fredericia, 42.6 km in length, is used in the proposed maintenance model for a time period of three and four years. The generated tamping schedule is reasonable and robust. Based on the result from the Danish railway corridor, the total costs can be reduced significantly (by 50%) compared with the previous model, which is based on optimizing the number of tampings. The different maintenance strategies are discussed in the paper. The analysis of the results obtained from the model also shows that a longer period of predictive tamping planning gives a more optimal scheduling of maintenance actions than continuous short-term preventive maintenance, namely yearly condition-based planning.
Keywords: integer programming, railway tamping, predictive maintenance model, preventive condition-based maintenance
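To make the flavour of such a mixed 0-1 formulation concrete, the following is a deliberately small sketch, not the authors' model: binary variables decide whether a track section is tamped in a period, the standard deviation of the longitudinal level grows linearly between tampings, each tamping recovers a fixed amount of quality, and every section must stay under its speed-dependent threshold. All numbers are invented, and the open-source PuLP interface merely stands in for whatever solver the authors used.

```python
# Toy condition-based tamping schedule as a 0-1 mixed-integer linear programme.
from pulp import LpProblem, LpMinimize, LpVariable, lpSum, value

sections = range(3)          # track sections
periods = range(6)           # planning periods
growth = [0.4, 0.6, 0.5]     # growth of longitudinal-level std. dev. per period (mm)
recovery = 1.5               # reduction of std. dev. achieved by one tamping (mm)
sigma0 = [1.0, 1.2, 0.8]     # initial condition (mm)
limit = 2.0                  # quality threshold tied to the line speed (mm)
cost_per_tamping = 1.0

prob = LpProblem("tamping_schedule", LpMinimize)
x = LpVariable.dicts("tamp", (sections, periods), cat="Binary")        # tamp s in t?
sigma = LpVariable.dicts("sigma", (sections, periods), lowBound=0.0)   # condition

prob += lpSum(cost_per_tamping * x[s][t] for s in sections for t in periods)

for s in sections:
    for t in periods:
        prev = sigma0[s] if t == 0 else sigma[s][t - 1]
        # degradation minus the (optional) recovery from tamping in this period
        prob += sigma[s][t] >= prev + growth[s] - recovery * x[s][t]
        prob += sigma[s][t] <= limit   # keep the track within its quality class

prob.solve()
for s in sections:
    print(f"section {s} tamped in periods:", [t for t in periods if value(x[s][t]) > 0.5])
```

Minimising the number (or cost) of tampings subject to such condition constraints is exactly the trade-off the paper's far richer model resolves over a three- to four-year horizon.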
Procedia PDF Downloads 444

1662 Effect of Different Thermomechanical Cycles on Microstructure of AISI 4140 Steel
Authors: L.L. Costa, A. M. G. Brito, S. Khan, L. Schaeffer
Abstract:
The microstructure resulting from the forging process is studied as a function of variables such as temperature, deformation, austenite grain size, and cooling rate. The purpose of this work is to study the thermomechanical behavior of DIN 42CrMo4 (AISI 4140) steel held at temperatures of 900°, 1000°, 1100°, and 1200°C for austenization times of 22, 66, and 200 minutes each and subsequently forged. Some samples were quenched in water in order to study the austenite grain; to investigate the microstructure, instead of quenching the samples after forging, they were cooled down naturally in air. The morphologies and properties, such as hardness, of the materials prepared by these two different routes have been compared. In addition to the forging experiments, a numerical simulation using the finite element model (FEM), microhardness profiles, and metallography images are presented. Forging force vs. position curves have been compared with the metallographic results for each annealing condition. The microstructural phenomena resulting from the hot forming proved that a longer austenization time and a higher temperature decrease the forging force in the curves. The complete recrystallization phenomena (static, dynamic, and metadynamic) were observed at the highest temperature and longest time, i.e., in the samples austenized for 200 minutes at 1200ºC. However, the highest hardness of the quenched samples was obtained when the temperature was 900ºC for 66 minutes. The phases observed in the naturally cooled samples were exclusively ferrite and pearlite, but the continuous cooling diagram indicates the presence of austenite and bainite. The morphology of the phases of the naturally cooled samples has shown that the phase arrangement and the prior austenite grain size are the reasons for the high hardness of the samples obtained at temperatures of 900ºC and 1100ºC with austenization times of 22 and 66 minutes, respectively.
Keywords: austenization time, thermomechanical effects, forging process, steel AISI 4140
Procedia PDF Downloads 145

1661 A Study of the Interactions between the Inter-City Traffic System and the Spatial Structure Evolution in the Yangtze River Delta from Time and Space Dimensions
Authors: Zhang Cong, Cai Runlin, Jia Fengjiao
Abstract:
The evolution of the urban agglomeration spatial structure requires strong support from the inter-city traffic system, and the inter-city traffic system can not only meet the demand of urban agglomeration transportation but also guide economic development. Correctly understanding the relationship between inter-city traffic planning and the urban agglomeration can help the urban agglomeration develop in coordination with the inter-city traffic system. The Yangtze River Delta is one of the most representative urban agglomerations in China, with strong economic vitality, high city levels, a diversified urban spatial form, and improved transport infrastructure. With the promotion of industrial division in the Yangtze River Delta and the regional travel facilitation brought by inter-city traffic, the urban agglomeration is characterized by rapidly increasing inter-city transportation demand, the urbanization of regional traffic, adjacent regional transportation links breaking administrative boundaries, networked channels, and so on. Therefore, the development of the inter-city traffic system presents new trends and challenges. This paper studies the interactions between the inter-city traffic system and regional economic growth, regional factor flows, and the evolution of the regional spatial structure in the Yangtze River Delta from the two dimensions of time and space. On this basis, the adaptability of the inter-city traffic development mode and the urban agglomeration spatial structure is analyzed. First of all, the coordination between urban agglomeration planning and inter-city traffic planning is judged at the planning level. Secondly, the coordination between inter-city traffic elements and the distributions of industries and population is judged from the perspective of space. Finally, the coordination of the cross-regional planning and construction of the inter-city traffic system is judged. The conclusions can provide an empirical reference for inter-city traffic planning in the Yangtze River Delta region and other urban agglomerations, and they are also of great significance for optimizing the allocation of urban agglomerations and the overall operational efficiency.
Keywords: evolution, interaction, inter-city traffic system, spatial structure
Procedia PDF Downloads 311

1660 Size Optimization of Microfluidic Polymerase Chain Reaction Devices Using COMSOL
Authors: Foteini Zagklavara, Peter Jimack, Nikil Kapur, Ozz Querin, Harvey Thompson
Abstract:
The invention and development of Polymerase Chain Reaction (PCR) technology have revolutionised molecular biology and molecular diagnostics. There is an urgent need to optimise the performance of these devices while reducing the total construction and operation costs. The present study proposes a CFD-enabled optimisation methodology for continuous flow (CF) PCR devices with a serpentine-channel structure, which enables the trade-offs between the competing objectives of DNA amplification efficiency and pressure drop to be explored. This is achieved by using a surrogate-enabled optimisation approach accounting for the geometrical features of a CF μPCR device, performing a series of simulations at a relatively small number of Design of Experiments (DoE) points with COMSOL Multiphysics 5.4. The values of the objectives are extracted from the CFD solutions, and response surfaces are created using polyharmonic splines and neural networks. After creating the respective response surfaces, a genetic algorithm and a multi-level coordinate search optimisation function are used to locate the optimum design parameters. Both optimisation methods produced similar results for both the neural network and the polyharmonic spline response surfaces. The results indicate the possibility of improving the DNA efficiency by ∼2% in one PCR cycle when doubling the width of the microchannel to 400 μm while maintaining the height at the value of the original design (50 μm). Moreover, the increase in the width of the serpentine microchannel is combined with a decrease in its total length in order to obtain the same residence times in all the simulations, resulting in a smaller total substrate volume (a 32.94% decrease). A multi-objective optimisation is also performed with the use of a Pareto front plot. Such knowledge will enable designers to maximise the amount of DNA amplified or to minimise the time taken throughout thermal cycling in such devices.
Keywords: PCR, optimisation, microfluidics, COMSOL
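The surrogate-plus-evolutionary-search loop described above can be sketched in a few lines. The following is an assumed illustration rather than the study's actual pipeline: a thin-plate radial basis function (one form of polyharmonic spline) is fitted to a small set of DoE points, and an evolutionary optimiser then searches the surrogate. The "CFD" objective here is a synthetic stand-in for the COMSOL evaluations, and the design-variable bounds are invented.

```python
# Sketch: fit a response surface to DoE points, then optimise over the surrogate.
import numpy as np
from scipy.interpolate import RBFInterpolator
from scipy.optimize import differential_evolution

rng = np.random.default_rng(0)

def cfd_objective(width_um, height_um):
    # Placeholder for a COMSOL evaluation: lower is better
    # (negative amplification efficiency penalised by a pressure-drop term).
    efficiency = -np.exp(-((width_um - 400) / 150) ** 2 - ((height_um - 50) / 30) ** 2)
    pressure_penalty = 0.002 * (width_um / height_um)
    return efficiency + pressure_penalty

# Design of Experiments: a small random sample of the design space
# (a space-filling DoE would be used in practice).
doe = np.column_stack([rng.uniform(100, 600, 25), rng.uniform(20, 120, 25)])
responses = np.array([cfd_objective(w, h) for w, h in doe])

surrogate = RBFInterpolator(doe, responses, kernel="thin_plate_spline")

result = differential_evolution(lambda x: float(surrogate([x])[0]),
                                bounds=[(100, 600), (20, 120)], seed=0)
print("optimal (width, height) in um:", result.x)
```

In the real workflow each DoE row would be a COMSOL run, and a neural network surrogate could be fitted alongside the spline for cross-checking, as the abstract describes.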
Procedia PDF Downloads 161

1659 A Case Study on Performance of Isolated Bridges under Near-Fault Ground Motion
Authors: Daniele Losanno, H. A. Hadad, Giorgio Serino
Abstract:
This paper presents a numerical investigation of the seismic performance of a benchmark bridge with different optimal isolation systems under near-fault ground motion. Usually, very large displacements make seismic isolation an unfeasible solution due to boundary conditions, especially in the case of existing bridges or high-risk seismic regions. Hence, near-fault ground motions are most likely to affect either structures with a long natural period range, like isolated structures, or structures sensitive to velocity content, such as viscously damped structures. The work is aimed at analyzing the seismic performance of a three-span continuous bridge designed with different isolation systems having different levels of damping. The case study was analyzed in different configurations, including: (a) simply supported, (b) isolated with lead rubber bearings (LRBs), (c) isolated with rubber isolators and 10% classical damping (HDLRBs), and (d) isolated with rubber isolators and a 70% supplemental damping ratio. Case (d) represents an alternative control strategy that combines the effect of seismic isolation with additional supplemental damping, trying to take advantage of both solutions. The bridge is modeled in SAP2000 and solved by time-history direct-integration analyses under a set of six recorded near-fault ground motions. In addition to this, a set of analyses under the seismic action provided by the Italian code is also conducted, in order to evaluate the effectiveness of the suggested optimal control strategies under far-field seismic action. Results of the analysis demonstrated that an isolated bridge equipped with HDLRBs and a total equivalent damping ratio of 70% represents a very effective design solution for both mitigation of the displacement demand at the isolation level and base shear reduction in the piers, also in the case of near-fault ground motion.
Keywords: isolated bridges, near-fault motion, seismic response, supplemental damping, optimal design
Procedia PDF Downloads 286

1658 [Keynote Talk]: Knowledge Codification and Innovation Success within Digital Platforms
Authors: Wissal Ben Arfi, Lubica Hikkerova, Jean-Michel Sahut
Abstract:
This study examines interfirm networks in the digital transformation era and, in particular, how tacit knowledge codification affects innovation success within digital platforms. One of the most important features of digital transformation and innovation process outcomes is the emergence of digital platforms, as an interfirm network, at the heart of open innovation. This research aims to illuminate how digital platforms influence inter-organizational innovation through virtual team interactions and knowledge sharing practices within an interfirm network. Consequently, it contributes to the strategic management literature on new product development (NPD), open innovation, industrial management, and the management of its emerging interfirm networks. The empirical findings show, on the one hand, that knowledge conversion may be enhanced, especially by socialization, which seems to be the most important phase as it plays a crucial role in holding the virtual team members together. On the other hand, in the process of socialization, tacit knowledge codification is crucial because it provides the structure needed for the interfirm network actors to interact and act to reach common goals, which favours the emergence of open innovation. Finally, our results offer several conditions, necessary but not always sufficient, for interfirm managers involved in NPD and innovation concerning strategies to increasingly shape interconnected and borderless markets and business collaborations. In the digital transformation era, the need for adaptive and innovative business models as well as new and flexible network forms is becoming more significant than ever. Supported by technological advancements and digital platforms, companies can benefit from increased market opportunities and create new markets for their innovations through alliances and collaborative strategies, as a means of reducing or eliminating uncertain environments or entry barriers. Consequently, an efficient and well-structured interfirm network is essential to create network capabilities, ensure tacit knowledge sharing, enhance organizational learning, and foster open innovation success within digital platforms.
Keywords: interfirm networks, digital platform, virtual teams, open innovation, knowledge sharing
Procedia PDF Downloads 130

1657 Viscoelastic Characterization of Gelatin/Cellulose Nanocrystals Aqueous Bionanocomposites
Authors: Liliane Samara Ferreira Leite, Francys Kley Vieira Moreira, Luiz Henrique Capparelli Mattoso
Abstract:
The increasing environmental concern regarding plastic pollution worldwide has stimulated the development of low-cost biodegradable materials. Proteins are renewable feedstocks that could be used to produce biodegradable plastics. Gelatin, for example, is a cheap film-forming protein extracted from the animal skin and connective tissues of Brazilian livestock residues; thus, it has good potential for low-cost biodegradable plastic production. However, gelatin plastics are limited in terms of mechanical and barrier properties. Cellulose nanocrystals (CNC) are efficient nanofillers that have been used to extend the physical properties of polymers. This work was aimed at evaluating the reinforcing efficiency of CNC in gelatin films. Specifically, we employed continuous casting as the processing method for obtaining the gelatin/CNC bionanocomposites. This required a first rheological study to assess the effect of gelatin-CNC and CNC-CNC interactions on the colloidal state of the aqueous bionanocomposite formulations. CNC were isolated from eucalyptus pulp by sulfuric acid hydrolysis (65 wt%) at 55 °C for 30 min. Gelatin was solubilized in ultra-pure water at 85 °C for 20 min and then mixed with glycerol at 20 wt.% and CNC at 0.5 wt%, 1.0 wt%, and 2.5 wt%. Rotational measurements were performed to determine the linear viscosity (η) of the bionanocomposite solutions, which increased with increasing CNC content. At 2.5 wt% CNC, η increased by 118% with respect to the neat gelatin solution, which was ascribed to percolating CNC network formation. The storage modulus (G’) and loss modulus (G″) further determined by oscillatory tests revealed that a gel-like behavior was dominant in the bionanocomposite solutions (G’ > G″) over a broad range of temperature (20 – 85 °C), particularly at 2.5 wt% CNC. These results confirm effective interactions in the aqueous gelatin-CNC bionanocomposites that could substantially improve the physical properties of the gelatin plastics. Tensile tests are underway to confirm this hypothesis. The authors would like to thank Fapesp (process n 2016/03080-3) for support.
Keywords: bionanocomposites, cellulose nanocrystals, gelatin, viscoelastic characterization
Procedia PDF Downloads 150

1656 Brain Connectome of Glia, Axons, and Neurons: Cognitive Model of Analogy
Authors: Ozgu Hafizoglu
Abstract:
An analogy is an essential tool of human cognition that enables connecting diffuse and diverse systems through physical, behavioral, and principal relations that are essential to learning, discovery, and innovation. The Cognitive Model of Analogy (CMA) leads and creates patterns of pathways to transfer information within and between domains in science, just as happens in the brain. The connectome of the brain shows how the brain operates with mental leaps between domains and mental hops within domains, and how the analogical reasoning mechanism operates. This paper demonstrates the CMA as an evolutionary approach to science, technology, and life. The model puts forward the challenges of deep uncertainty about the future, emphasizing the need for flexibility of the system in order to enable the reasoning methodology to adapt to changing conditions in the new era, especially post-pandemic. In this paper, we will reveal how to draw an analogy to scientific research to discover new systems that reveal the fractal schema of analogical reasoning within and between systems, like within and between brain regions. The distinct phases of the problem-solving process are divided thus: stimulus, encoding, mapping, inference, and response. Based on the brain research so far, the system is shown to be relevant to brain activation in each of these phases, with an emphasis on achieving a better visualization of the brain’s mechanism in the macro context (brain and spinal cord) and the micro context (glia and neurons), relative to the matching conditions of analogical reasoning and relational information, the encoding, mapping, inference, and response processes, and the verification of perceptual responses in four-term analogical reasoning. Finally, we will relate all these terminologies with mental leaps, mental maps, mental hops, and mental loops to make the mental model of CMA clear.
Keywords: analogy, analogical reasoning, brain connectome, cognitive model, neurons and glia, mental leaps, mental hops, mental loops
Procedia PDF Downloads 165

1655 The Effect of Primary Treatment on Histopathological Patterns and Choice of Neck Dissection in Regional Failure of Nasopharyngeal Carcinoma Patients
Authors: Ralene Sim, Stefan Mueller, N. Gopalakrishna Iyer, Ngian Chye Tan, Khee Chee Soo, R. Shetty Mahalakshmi, Hiang Khoon Tan
Abstract:
Background: Regional failure in nasopharyngeal carcinoma (NPC) is managed by salvage treatment in the form of neck dissection. Radical neck dissection (RND) is preferred over modified radical neck dissection (MRND) since it is traditionally believed to offer better long-term disease control. However, with the advent of more advanced imaging modalities like high-resolution Magnetic Resonance Imaging, Computed Tomography, and Positron Emission Tomography-CT scans, earlier detection is achieved. Additionally, concurrent chemotherapy also contributes to reduced tumour burden. Hence, there may be a lesser need for an RND and a greater role for MRND. With this retrospective study, the primary aim is to ascertain whether MRND, as opposed to RND, has similar outcomes and, hence, whether there would be more grounds to offer a less aggressive procedure to achieve lower patient morbidity. Methods: This is a retrospective study of 66 NPC patients treated at Singapore General Hospital between 1994 and 2016 for histologically proven regional recurrence, of whom 41 underwent RND and 25 underwent MRND, based on surgeon preference. The type of ND performed, the primary treatment mode, adjuvant treatment, and pattern of recurrence were reviewed. Overall survival (OS) was calculated using the Kaplan-Meier estimate and compared. Results: Overall, the disease parameters, such as nodal involvement and extranodal extension, were comparable between the two groups. Comparing MRND and RND, the median (IQR) OS is 1.76 (0.58 to 3.49) and 2.41 (0.78 to 4.11), respectively. However, the p-value found is 0.5301 and hence not statistically significant. Conclusion: RND is more aggressive and has been associated with greater morbidity. Hence, with similar outcomes, MRND could be an alternative salvage procedure for regional failure in selected NPC patients, allowing similar salvage rates with lower mortality and morbidity.
Keywords: nasopharyngeal carcinoma, neck dissection, modified neck dissection, radical neck dissection
Procedia PDF Downloads 170

1654 Investigation of Input Energy Efficiency in Corn (KSC704) Farming in Khoy City, Iran
Authors: Nasser Hosseini
Abstract:
The energy cycle is one of the essential issues in agricultural ecosystems all over the world. Corn is one of the important products in Khoy city. Knowing the input energy level and evaluating the output energy of farms is very important for reducing energy use and increasing farm efficiency, particularly if the input energy into farms can be reduced through items such as pesticides, fertilization, tractor energy, and labour force. In addition to improving the net income of the farmers, this would play a significant role in preserving the farm ecosystem from pollution and destructive factors. For this reason, the energy balance sheet of corn farms, as well as the input and output energy in 2012-2013, was researched by distributing a questionnaire among farmers in various villages in Khoy city. Then, the amount of energy input into the farms via the energy-consuming factors mentioned above was computed with regard to specific coefficients. Output energy was computed on the basis of the seed corn yield as well as its chemical composition and content. In this investigation, the level of input energy was evaluated at 10,792,831 kcal per hectare. The greatest part of this energy was due to irrigation, with 5,136,141.8 kcal, followed by nitrate fertilizer energy with 2,509,760 kcal, while the lowest part was due to phosphorus fertilizer. The output energy equaled 36,362,500 kcal, and the energy efficiency on the basis of seed corn yield was estimated as 3.36. Some ways to reduce the energy consumed on the farm, and thereby improve the balance sheet, were identified. They include, to name a few, using alternative farming and potherbs for the biological stabilization of nitrogen, changing the kind of fertilizer, such as urea fertilizer with a sulphur cover, using a new generation of irrigation with water super-absorbent compounds such as coloured hydrogels, and using natural fertilizer.
Keywords: corn (KSC704), output and input, energy efficiency, Khoy city
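The efficiency ratio reported above follows directly from the stated figures; the short sketch below simply reproduces that arithmetic. The indicator definitions are the usual agricultural energy-analysis conventions, and the derived net energy gain and irrigation share are computed here for illustration rather than quoted from the study.

```python
# Sketch: input/output energy indicators from the per-hectare figures in the abstract.
input_energy_kcal = 10_792_831       # total input energy per hectare
output_energy_kcal = 36_362_500      # output energy per hectare (corn yield)
irrigation_kcal = 5_136_141.8
nitrate_fertilizer_kcal = 2_509_760

energy_efficiency = output_energy_kcal / input_energy_kcal    # ~3.37 (reported as 3.36)
net_energy_gain = output_energy_kcal - input_energy_kcal      # kcal/ha
irrigation_share = 100 * irrigation_kcal / input_energy_kcal  # share of inputs, in %

print(f"energy efficiency: {energy_efficiency:.2f}")
print(f"net energy gain: {net_energy_gain:,} kcal/ha")
print(f"irrigation share of inputs: {irrigation_share:.1f}%")
```

The irrigation share of roughly half of all inputs is what motivates the irrigation-focused saving measures listed at the end of the abstract.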
Procedia PDF Downloads 441

1653 The Optimal Utilization of Centrally Located Land: The Case of the Bloemfontein Show Grounds
Authors: D. F. Coetzee, M. M. Campbell
Abstract:
The urban environment is constantly expanding, and the optimal use of centrally located land is important in terms of sustainable development. Bloemfontein has expanded, and this affects land-use functions. The purpose of the study is to examine the possible shift in the location of the Bloemfontein show grounds in order to utilize the space of the grounds more effectively in the context of spatial planning. The research method used is qualitative case study research, with the case study being the Bloemfontein show grounds. The purposive sample consisted of planners who work or consult in the Bloemfontein area and who are registered with the South African Council for Planners (SACPLAN). Interviews consisting of qualitative open-ended questionnaires were used. When considering relocation, the social and economic aspects need to be taken into account. The findings also indicated a majority consensus that the property can be utilized more effectively in terms of mixed land use. The showground development trust compiled a master plan to ensure that the property is used to its full potential without the relocation of the showground function itself. This Master Plan can be seen as the next logical step for the showground property, and it is indeed an attempt to better utilize the land parcel without relocating the show function. The question arises whether the proposed Master Plan is a permanent solution or whether it is merely delaying the relocation of the core showground function to another location. For now, it is a sound solution, making the best of the situation at hand and utilizing the property more effectively. If the show grounds were to be relocated, the researcher proposed a recommendation of mixed-use development, in terms of an expansion of the commercial business/retail function together with a sport and recreation function. The show grounds in Bloemfontein are well positioned to capitalize on and meet the needs of the changing economy, while complementing the future economic growth strategies of the city, if the right plans are in place.
Keywords: centrally located land, spatial planning, show grounds, central business district
Procedia PDF Downloads 414

1652 Price Compensation Mechanism with Unmet Demand for Public-Private Partnership Projects
Abstract:
Public-private partnership (PPP), as an innovative way to provide infrastructure through the private sector, is being widely used throughout the world. Compared with the traditional mode, PPP has emerged largely on the merits of relieving public budget constraints and improving infrastructure supply efficiency by involving private funds. However, PPP projects are characterized by large scale, high investment, long payback periods, and long concession periods. These characteristics make PPP projects full of risks. One of the most important risks faced by the private sector is demand risk, because many factors affect the real demand. If the real demand is far lower than the forecast demand, the private sector will run into serious trouble, because operating revenue is the main means for the private sector to recoup the investment and obtain profit. Therefore, it is important to study how the government compensates the private sector when the demand risk occurs in order to achieve Pareto improvement. This research focuses on the price compensation mechanism, an ex-post compensation mechanism, and analyzes, by mathematical modeling, the impact of the price compensation mechanism on the payoff of the private sector and consumer surplus for PPP toll road projects. The research first investigates whether or not the price compensation mechanism can obtain Pareto improvement and, if so, then explores the boundary conditions for this mechanism. The results show that the price compensation mechanism can realize Pareto improvement under certain conditions. In particular, for the price compensation mechanism to accomplish Pareto improvement, the renegotiation costs of the government and the private sector should be lower than a certain threshold, which is determined by the marginal operating cost and the distortionary cost of the tax. In addition, the compensation percentage should match the price cut of the private investor when demand drops. This research aims to provide theoretical support for the government when determining the compensation scope under the price compensation mechanism. Moreover, some policy implications can also be drawn from the analysis for better risk-sharing and sustainability of PPP projects.
Keywords: infrastructure, price compensation mechanism, public-private partnership, renegotiation
Procedia PDF Downloads 179

1651 Yields and Composition of the Gas, Liquid and Solid Fractions Obtained by Conventional Pyrolysis of Different Lignocellulosic Biomass Residues
Authors: María del Carmen Recio-Ruiz, Ramiro Ruiz-Rosas, Juana María Rosas, José Rodríguez-Mirasol, Tomás Cordero
Abstract:
Nowadays, fossil resources are the main precursors for fuel production. Due to their contribution to the greenhouse effect and their future depletion, there is a constant search for environmentally friendly feedstock alternatives. Biomass residues constitute an interesting replacement for fossil resources because of their zero net CO₂ emissions. One of the main routes to convert biomass into energy and chemicals is pyrolysis. In this work, the conventional pyrolysis of different highly available biomass residues, such as almond shells, hemp hurds, olive stones, and Kraft lignin, was studied. In a typical experiment, the biomass was crushed and loaded into a fixed bed reactor under continuous nitrogen flow. The influence of temperature (400-800 ºC) and heating rate (10 and 20 ºC/min) on the pyrolysis yield and the composition of the different fractions has been studied. In every case, the mass yields revealed that the solid fraction decreased with temperature, while the liquid and gas fractions increased due to depolymerization and cracking reactions at high temperatures. The composition of every pyrolysis fraction was studied in detail. The results showed that the gas fraction was composed mainly of CO and CO₂ when working at low temperatures, and mostly of CH₄ and H₂ at high temperatures. The solid fraction developed an incipient microporosity, with a narrow micropore volume of 0.21 cm³/g. Regarding the liquid fraction, the pyrolysis of almond shells, hemp hurds, and olive stones led mainly to a high content of aliphatic acids and furans, due to the high volatile matter content of these biomasses (>74 wt%), and to phenols to a lesser degree, which were formed due to the degradation of lignin at higher temperatures. However, when Kraft lignin was used as the bio-oil precursor, the presence of phenols was very prominent, and aliphatic compounds were also detected to a lesser extent.
Keywords: Bio-oil, biomass, conventional pyrolysis, lignocellulosic
Procedia PDF Downloads 134

1650 Factors Promoting French-English Tweets in France
Authors: Taoues Hadour
Abstract:
Twitter has become a popular means of communication used in a variety of fields, such as politics, journalism, and academia. This widely used online platform has an impact on the way people express themselves and is changing language usage worldwide at an unprecedented pace. The language used online reflects the linguistic battle that has been going on for several decades in French society. This study enables a deeper understanding of users' linguistic behavior online. The implications are important and allow for a rise in awareness of intercultural and cross-language exchanges. This project investigates the mixing of French-English language usage among French users of Twitter using a topic analysis approach. The analysis draws on Gumperz's theory of conversational switching. In order to collect tweets at a large scale, the data were collected in R using the rtweet package to access and retrieve French tweet data through Twitter’s REST and stream APIs (Application Program Interfaces), using RStudio, the integrated development environment for R. The dataset was filtered manually, and certain repetitions of themes were observed. A total of nine topic categories were identified and analyzed in this study: entertainment, internet/social media, events/community, politics/news, sports, sex/pornography, innovation/technology, fashion/make-up, and business. The study reveals that entertainment is the most frequent topic discussed on Twitter. Entertainment includes movies, music, games, and books. Anglicisms such as trailer, spoil, and live are identified in the data. Change in language usage is inevitable and is a natural result of linguistic interactions. The use of different languages online is just an example of what the real world would look like without linguistic regulations. Social media reveals a multicultural and multilinguistic richness which can deepen and expand our understanding of contemporary human attitudes.
Keywords: code-switching, French, sociolinguistics, Twitter
Procedia PDF Downloads 137

1649 Effect of Temperature and Deformation Mode on Texture Evolution of AA6061
Authors: M. Ghosh, A. Miroux, L. A. I. Kestens
Abstract:
At the molecular or micrometre scale, practically all materials are neither homogeneous nor isotropic. The concept of texture is used to identify the structural features that cause the properties of a material to be anisotropic. For metallic materials, the anisotropy of the mechanical behaviour originates from the crystallographic nature of plastic deformation and is therefore controlled by the crystallographic texture. Anisotropy in mechanical properties often constitutes a disadvantage in the application of materials, as is often illustrated by the earing phenomenon during drawing. However, advantages may also be attained for other properties (e.g., optimization of the magnetic behaviour in a specific direction) by controlling texture through thermo-mechanical processing. Nevertheless, in order to have better control over the final properties, it is essential to relate texture to the materials processing route and subsequently optimise performance. However, to date, few studies have been reported on the evolution of texture in 6061 aluminium alloy during warm processing (from room temperature to 250 ºC). In the present investigation, recrystallized 6061 aluminium alloy samples were subjected to tensile and plane strain compression (PSC) tests at room and warm temperatures. The gradual change of texture following both deformation modes was measured and discussed. Tensile tests demonstrate the mechanism at low strain, while PSC does the same at high strain and eventually simulates the conditions of rolling. The Cube-dominated texture of the initial rolled and recrystallized AA6061 sheets was replaced by the domination of the S and R components after PSC at room temperature; warm temperature (250 ºC), though, did not produce any noticeable deviation from the room temperature observation. It was also noticed that temperature has no significant effect on the evolution of grain morphology during PSC. The band contrast map revealed that, after 30% deformation, the substructure inside the grains is mainly made of a series of parallel bands. A tendency for a decrease of Cube and an increase of Goss was noticed after tensile deformation compared to the as-received material. As with PSC, though, the texture does not change after deformation at warm temperature. The n-fibre was noticed for all three textures from Goss to Cube.
Keywords: AA 6061, deformation, temperature, tensile, PSC, texture
Procedia PDF Downloads 485

1648 Workplace Development Programmes for Small and Medium-Sized Enterprises in Europe and Singapore: A Conceptual Study
Authors: Zhan Jie How
Abstract:
With the heightened awareness of workplace learning and its impact on improving organizational performance and developing employee competence, governments and corporations around the world are forced to intensify their cooperation to establish national workplace development programmes to guide these corporations in fostering engaging and collaborative workplace learning cultures. This conceptual paper aims to conduct a comparative study of existing workplace development programmes for small and medium-sized enterprises (SMEs) in Europe and Singapore, focusing primarily on the Swedish Production Leap, Finnish TEKES Liideri Programme, and Singapore SkillsFuture SME Mentors Programme. The study carries out a systematic review of the three workplace development programmes to examine the roles of external mentors or coaches in influencing the design and implementation of workplace learning strategies and practices in SMEs. Organizational, personal and external factors that promote or inhibit effective workplace mentorship are also scrutinized, culminating in a critical comparison and evaluation of the strengths and weaknesses of the aforementioned programmes. Based on the findings from the review and analyses, a heuristic conceptual framework is developed to illustrate the complex interrelationships among external workplace development programmes, internal learning and development initiatives instituted by the organization’s higher management, and employees' continuous learning activities at the workplace. The framework also includes a set of guiding principles that can be used as the basis for internal mediation between the competing perspectives of mentors and mentees (employers and employees of the organization) regarding workplace learning conditions, practices and their intended impact on the organization. The conceptual study provides a theoretical blueprint for future empirical research on organizational workplace learning and the impact of government-initiated workplace development programmes.
Keywords: employee competence, mentorship, organizational performance, workplace development programme, workplace learning culture
Procedia PDF Downloads 141

1647 Examining Patterns in Ethnoracial Diversity in Los Angeles County Neighborhoods, 2016, Using Geographic Information System Analysis and Entropy Measure of Diversity
Authors: Joseph F. Cabrera, Rachael Dela Cruz
Abstract:
This study specifically examines the patterns that define ethnoracially diverse neighborhoods. Ethnoracial diversity is important as it facilitates cross-racial interactions within neighborhoods, which have been theorized to be associated with such outcomes as intergroup harmony, the reduction of racial and ethnic prejudice and discrimination, and increases in racial tolerance. Los Angeles (LA) is an ideal location to study ethnoracial spatial patterns as it is one of the most ethnoracially diverse cities in the world. A large influx of Latinos, as well as Asians, has contributed to LA’s urban landscape becoming increasingly diverse over several decades. Our dataset contains all census tracts in Los Angeles County in 2016 and incorporates Census and ACS demographic and spatial data. We quantify ethnoracial diversity using a derivative of Simpson’s Diversity Index and utilize this measure to test previous literature that suggests Latinos are one of the key drivers of changing ethnoracial spatial patterns in Los Angeles. Preliminary results suggest that there has been an overall increase in ethnoracial diversity in Los Angeles neighborhoods over the past sixteen years. Patterns associated with this trend include decreases in predominantly white and black neighborhoods, increases in predominantly Latino and Asian neighborhoods, and a general decrease in the white populations of the most diverse neighborhoods. A similar pattern is seen in neighborhoods with large Latino increases: a decrease in the white population, but with increases in the Asian and black populations. We also found support for previous research that suggests increases in Latino and Asian populations act as a buffer, allowing for black population increases without a sizeable decrease in the white population. Future research is needed to understand the underlying causes involved in many of the patterns and trends highlighted in this study.
Keywords: race, race and interaction, racial harmony, social interaction
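For readers unfamiliar with the measures named in the title and abstract, the sketch below illustrates, with hypothetical group counts rather than the study's data, how a Simpson-type diversity index (one minus the sum of squared group shares) and a normalised entropy index are typically computed at the tract level; the specific derivative used by the authors is not reproduced here.

```python
# Sketch: two common tract-level diversity measures from group population counts.
import math

def simpson_diversity(counts):
    """1 - sum of squared group shares (higher = more diverse)."""
    total = sum(counts)
    return 1.0 - sum((c / total) ** 2 for c in counts)

def entropy_diversity(counts):
    """Entropy index normalised by its maximum, so it runs from 0 to 1."""
    total = sum(counts)
    shares = [c / total for c in counts if c > 0]
    return -sum(p * math.log(p) for p in shares) / math.log(len(counts))

# Hypothetical tract: white, black, Latino, Asian populations
tract = [1200, 400, 2500, 900]
print(round(simpson_diversity(tract), 3))   # ~0.654
print(round(entropy_diversity(tract), 3))   # ~0.866
```

Both indices rise as the groups become more evenly represented and fall toward 0 when a single group dominates, which is what makes them convenient for mapping neighborhood diversity in a GIS.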
Procedia PDF Downloads 132

1646 Exercise Intensity Increasing Appetite, Energy Intake, Energy Expenditure, and Fat Oxidation in Sedentary Overweight Individuals
Authors: Ghalia Shamlan, M. Denise Robertson, Adam Collins
Abstract:
Appetite control (i.e., control of energy intake) is important for weight maintenance. Exercise contributes the most variable component of energy expenditure (EE), but its impact goes beyond the energy cost of exercise, including physiological, behavioural, and appetite effects. Exercise is known to acutely influence appetite, but evidence as to the independent effect of intensity is lacking. This study investigated the role of exercise intensity on appetite, energy intake (EI), appetite-related hormones, fat utilisation, and subjective measures of appetite. One hour after a standardised breakfast, 10 sedentary overweight volunteers undertook either 8 repeated 60-second bouts of cycling at 95% VO2max (high intensity) or 30 minutes of continuous cycling, at a fixed cadence, equivalent to 50% of the participant’s VO2max (low intensity), in a randomised crossover design. Glucose, NEFA, and glucagon-like peptide-1 (GLP-1) were measured fasted, postprandially, and pre- and post-exercise. Satiety was assessed subjectively throughout the study using visual analogue scales (VAS). Ad libitum intake of a pasta meal was measured at the end (3 h post-breakfast). Interestingly, there was no significant difference in EE or fat oxidation between HI and LI post-exercise. Also, no significant effect of high intensity (HI) was observed on the ad libitum meal or on 24-h and 48-h EI post-exercise. However, the mean 24-h EI was 3000 kJ lower following HI than low intensity (LI). Moreover, no significant differences in hunger score, glucose, NEFA, or GLP-1 between the two intensities were observed, although NEFA and GLP-1 plasma levels were higher until 30 min post-LI. In conclusion, the similarity of EE and fat oxidation outcomes could give overweight individuals an option to choose between intensities; however, HI could help to reduce EI. There are mechanisms and consequences of exercise in short- and long-term appetite control; however, these mechanisms warrant further explanation. These results support the need for future research into the role of exercise in regulating energy balance, especially for obese people.
Keywords: appetite, exercise, food intake, energy expenditure
Procedia PDF Downloads 5061645 Performance and Nutritional Evaluation of Moringa Leaves Dried in a Solar-Assisted Heat Pump Dryer Integrated with Thermal Energy Storage
Authors: Aldé Belgard Tchicaya Loemba, Baraka Kichonge, Thomas Kivevele, Juma Rajabu Selemani
Abstract:
Plants used for medicinal purposes are extremely perishable, owing to moisture-enhanced enzymatic and microbial activity, climate change, and improper handling and storage. Experiments have shown that drying medicinal plants without affecting the active nutrients, while controlling the moisture content as far as possible, can extend their shelf life. Different traditional and modern drying techniques for preserving medicinal plants have been developed, with some still being improved in Sub-Saharan Africa. However, many of these methods fail to address the most common issues encountered when drying medicinal plants, such as nutrient loss, long drying times, and a limited capacity to dry during the evening or cloudy hours. Heat pump drying is an alternative drying method that results in no nutritional loss. Furthermore, combining a heat pump dryer with a solar energy storage system appears to be a viable option for all-weather drying without affecting the nutritional values of the dried products. In this study, a solar-assisted heat pump dryer integrated with thermal energy storage was developed for drying moringa leaves. The study also discusses the performance analysis of the developed dryer as well as the proximate analysis of the dried moringa leaves. All experiments were conducted from 11 a.m. to 4 p.m. to assess the dryer's performance in “daytime mode”. Experimental results show that the drying time was significantly reduced, and the dryer demonstrated high performance in preserving all of the nutrients. In 5 hours of drying, the moisture content was reduced from 75.7% to 3.3%. The average COP value was 3.36, confirming the dryer's low energy consumption. The findings also revealed that after drying, the content of protein, carbohydrates, fats, fiber, and ash greatly increased.Keywords: heat pump dryer, efficiency, moringa leaves, proximate analysis
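The abstract quotes a moisture reduction from 75.7% to 3.3% and an average COP of 3.36; the sketch below shows how such figures are commonly derived, assuming a wet-basis moisture definition and COP computed as useful heat delivered divided by electrical input. The sample masses and energy values are hypothetical, not the study's measurements.

```python
# Minimal sketch of the dryer performance metrics quoted in the abstract.
# All numerical inputs below are hypothetical; only the formulas are illustrated.

def moisture_content_wet_basis(mass_water, mass_dry_solids):
    """Wet-basis moisture content (%) = water mass / total mass * 100."""
    return 100.0 * mass_water / (mass_water + mass_dry_solids)

def heat_pump_cop(heat_delivered_kj, electrical_input_kj):
    """Coefficient of performance = useful heat delivered / electrical energy consumed."""
    return heat_delivered_kj / electrical_input_kj

# Hypothetical example: 100 g of fresh leaves at 75.7% moisture (wet basis).
fresh_mass = 100.0
dry_solids = fresh_mass * (1 - 0.757)           # ~24.3 g of dry matter
water_left = dry_solids * 0.033 / (1 - 0.033)   # water mass remaining at 3.3% wet basis

print(round(moisture_content_wet_basis(water_left, dry_solids), 1))   # ~3.3
print(round(heat_pump_cop(6720.0, 2000.0), 2))                        # 3.36
```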
Procedia PDF Downloads 821644 Approaches to Reduce the Complexity of Mathematical Models for the Operational Optimization of Large-Scale Virtual Power Plants in Public Energy Supply
Authors: Thomas Weber, Nina Strobel, Thomas Kohne, Eberhard Abele
Abstract:
In the context of the energy transition in Germany, the importance of so-called virtual power plants in the energy supply continues to increase. The progressive dismantling of large power plants and the ongoing construction of many new decentralized plants result in great potential for optimization through synergies between the individual plants. These potentials can be exploited by mathematical optimization algorithms that calculate the optimal operational planning of decentralized power and heat generators and storage systems. This includes linear and mixed-integer linear optimization. In this paper, procedures for reducing the number of decision variables to be calculated are explained and validated. The first approach combines n similar plants of one installation type into a single aggregated unit. This aggregated unit is described by the same constraints and objective function terms as a single plant. This reduces the number of decision variables per time step, and thus the complexity of the problem to be solved, by a factor of n. The exact operating mode of the individual plants can then be calculated in a second optimization in such a way that the output of the individual plants corresponds to the calculated output of the aggregated unit. Another way to reduce the number of decision variables in an optimization problem is to reduce the number of time steps to be calculated. This is useful if a high temporal resolution is not necessary for all time steps. For example, the volatility or the forecast quality of environmental parameters may justify a high or low temporal resolution of the optimization. Both approaches are examined for the resulting calculation time as well as for optimality. Several optimization models for virtual power plants (combined heat and power plants, heat storage, power storage, gas turbine) with different numbers of plants are used as a reference for the investigation of both approaches with regard to calculation duration and optimality.Keywords: CHP, Energy 4.0, energy storage, MILP, optimization, virtual power plant
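As an illustration of the aggregation idea described above, the sketch below assumes n identical plants, each either off or running between a minimum and maximum output: the aggregated unit is scheduled first (the MILP itself is not shown), and a simple second-stage rule then disaggregates its scheduled output back onto individual plants; with n identical plants this leaves one on/off and one output decision per time step instead of n of each. The plant parameters and the equal-share rule are illustrative assumptions, not the paper's actual model.

```python
# Minimal sketch: disaggregating an aggregated unit's schedule onto n identical plants.
# Plant limits and the equal-share rule are illustrative assumptions.
import math

def disaggregate(aggregate_output, n_plants, p_min, p_max):
    """Split one aggregated setpoint equally over the fewest plants that can cover it,
    so that the summed per-plant output matches the aggregated unit's schedule."""
    if aggregate_output <= 0:
        return [0.0] * n_plants
    k = min(n_plants, math.ceil(aggregate_output / p_max))   # number of plants switched on
    share = aggregate_output / k
    assert p_min <= share <= p_max, "setpoint not reachable with identical plants"
    return [share] * k + [0.0] * (n_plants - k)

# Aggregated schedule for 4 identical CHP units (p_min = 2 MW, p_max = 5 MW), 3 time steps.
schedule = [12.0, 4.0, 0.0]
for t, agg in enumerate(schedule):
    plan = disaggregate(agg, n_plants=4, p_min=2.0, p_max=5.0)
    print(t, plan, "sum =", sum(plan))
```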
Procedia PDF Downloads 1781643 Legal Personality and Responsibility of Robots
Authors: Mehrnoosh Abouzari, Shahrokh Sahraei
Abstract:
The arrival of artificial intelligence and smart robots in the modern world places them in charge of precise and high-risk tasks, so carrying out human activities with robots raises questions of criminal and civil responsibility for their acts and behavior. The practical use of smart robots puts them in a unique situation once naturalization occurs and smart robots are identified as members of society. Adopting these new smart citizens creates several legal situations. The first situation concerns the legal responsibility of robots. Recognizing the naturalization of a robot involves some basic rights, so rights that humans hold, such as employment, property, housing, energy use, and other human rights, may be extended to robots. How would these rights be exercised in society, and if problems arise with these rights, how would civil responsibility and punishment be handled? May we count robots as part of the population and include them in social programs? The second situation concerns the criminal responsibility of robots performing important activities in place of humans, which is the very aim of inventing robots that handle work through AI technology; the problem arises when accidents are caused by robots in charge of important activities such as military operations, surgery, transport, judgment, and so on. Moreover, recognizing an independent legal identity for robots through registered ID cards, naturalization, and civil rights creates the same rights and obligations as for humans. Civil responsibility is therefore unavoidable, and if a robot commits a crime, it would bear criminal responsibility and have to be punished. The basic components of criminal responsibility may change in such situations: for example, if criminal responsibility for humans is bound to sanity, maturity, and voluntariness, for robots it may be bound to being intelligent, being well programmed, not being hacked, and so on. It would therefore be irrational to punish robots with imprisonment, execution, or other corporal punishments designed for humans; instead, digital punishments may be devised, such as changing or repairing programs, exchanging some parts of the robot's body, or decommissioning it completely. Finally, the responsibility of the smart robot's creators, programmers, supervisors, the organization that employed the robot, and the government that permitted the use of the robot in important facilities and activities will be analyzed and investigated in this article.Keywords: robot, artificial intelligence, personality, responsibility
Procedia PDF Downloads 1471642 Comparison of Water Equivalent Ratio of Several Dosimetric Materials in Proton Therapy Using Monte Carlo Simulations and Experimental Data
Authors: M. R. Akbari, H. Yousefnia, E. Mirrezaei
Abstract:
Range uncertainties of protons are currently a topic of interest in proton therapy. Two of the parameters that are often used to specify proton range are the water equivalent thickness (WET) and the water equivalent ratio (WER). Since the WER of a specific material is nearly constant at different proton energies, it is the more useful parameter to compare. In this study, WER values were calculated for different proton energies in polymethyl methacrylate (PMMA), polystyrene (PS), and aluminum (Al) using the FLUKA and TRIM codes. The results were compared with analytical and experimental data and with simulated SEICS code data obtained from the literature. In the FLUKA simulation, a cylindrical phantom, 1000 mm in height and 300 mm in diameter, filled with the studied materials was simulated. A mono-energetic proton pencil beam, at incident energies typically applied in proton therapy (50 MeV to 225 MeV), impinges normally on the phantom. In order to obtain the WER values for the considered materials, cylindrical detectors, 1 mm in height and 20 mm in diameter, were also simulated along the beam trajectory in the phantom. In the TRIM calculations, the projectile type, energy, angle of incidence, target material, and target thickness must be defined. The mode of 'detailed calculation with full damage cascades' was selected for proton transport in the target material. The biggest differences in WER values between the codes were 3.19%, 1.9%, and 0.67% for Al, PMMA, and PS, respectively. In Al, the biggest differences between each code and the experimental data were 1.08%, 1.26%, and 2.55% for SEICS, FLUKA, and SRIM, respectively; in PMMA, they were 0.94%, 0.77%, and 0.95%. FLUKA and SEICS had the greatest agreement with the available experimental data in this study (≤0.77% difference in PMMA and ≤1.08% difference in Al, respectively). It is concluded that the FLUKA and TRIM codes are capable of simulating Bragg curves and calculating WER values in the studied materials. They can also predict the Bragg peak location and the range of proton beams with acceptable accuracy.Keywords: water equivalent ratio, dosimetric materials, proton therapy, Monte Carlo simulations
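As a rough illustration of the quantity being compared, WER is commonly approximated as the ratio of the proton range (Bragg peak depth) in water to the range in the material at the same beam energy, averaged over energies since it is nearly energy-independent. The range values below are hypothetical placeholders, not the simulated or experimental data of this study.

```python
# Minimal sketch: water equivalent ratio (WER) from proton ranges, averaged over energies.
# Range values are hypothetical placeholders, not results from the study.

def wer(range_in_water_cm, range_in_material_cm):
    """WER ~ proton range in water / proton range in the material at the same energy."""
    return range_in_water_cm / range_in_material_cm

# Hypothetical Bragg peak depths (cm) at a few beam energies (MeV).
ranges_water = {50: 2.2, 100: 7.7, 150: 15.8, 225: 32.2}
ranges_pmma = {50: 1.9, 100: 6.6, 150: 13.6, 225: 27.7}

per_energy = [wer(ranges_water[e], ranges_pmma[e]) for e in sorted(ranges_water)]
print([round(w, 3) for w in per_energy])                 # nearly constant across energies
print("mean WER:", round(sum(per_energy) / len(per_energy), 3))
```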
Procedia PDF Downloads 3241641 Big Data’s Mechanistic View of Human Behavior May Displace Traditional Library Missions That Empower Users
Authors: Gabriel Gomez
Abstract:
The very concept of information-seeking behavior, and the means by which librarians teach users to gain information, that is, information literacy, are at the heart of how libraries deliver information, but big data will forever change human interaction with information and the way such behavior is both studied and taught. Just as importantly, big data will orient the study of behavior towards commercial ends because of a tendency towards instrumentalist views of human behavior, something one might also call a trend towards behaviorism. This oral presentation seeks to explore how the impact of big data on understandings of human behavior might affect a library information science (LIS) view of human behavior and information literacy, and what this might mean for the social justice aims and concomitant community action normally at the center of librarianship. The methodology employed here is a non-empirical examination of current understandings of LIS with regard to social justice, alongside an examination of the benefits and dangers foreseen with the growth of big data analysis. The rise of big data within the ever-changing information environment encapsulates a shift to a more mechanistic view of human behavior, one that can easily encompass information-seeking behavior and information use. As commercial aims displace the important political and ethical aims that are often central to the missions espoused by libraries and the social sciences, the very altruism and power relations found in LIS are at risk. In this oral presentation, an examination of the social justice impulses of librarians regarding power and information demonstrates how such impulses can be challenged by big data, particularly as librarians understand user behavior and promote information literacy. The creeping behaviorist impulse inherent in the emphasis big data places on specific solutions, that is, answers to questions that ask how, as opposed to larger questions that hint at an understanding of why people learn or use information, threatens library information science ideals. Together with the commercial nature of most big data, this existential threat can harm the social justice nature of librarianship.Keywords: big data, library information science, behaviorism, librarianship
Procedia PDF Downloads 383