Search results for: memory network
794 Deterministic and Stochastic Modeling of a Micro-Grid Management for Optimal Power Self-Consumption
Authors: D. Calogine, O. Chau, S. Dotti, O. Ramiarinjanahary, P. Rasoavonjy, F. Tovondahiniriko
Abstract:
Mafate is a natural cirque in the north-western part of Reunion Island, with neither an electrical grid nor a road network. A micro-grid concept is being tested in this area, combining photovoltaic production with electrochemical batteries in order to meet the local population's demand for self-consumed electricity. This work develops a discrete model as well as a stochastic model in order to reach an optimal equilibrium between production and consumption for a cluster of houses. The management of the energy flows leads to a large linearized programming system over a time interval of interest of 24 hours. The experimental data are the solar production, the stored energy, and the parameters of the different electrical devices and batteries. The unknown variables to evaluate are the consumption of the various electrical services, the energy drawn from and stored in the batteries, and the inhabitants' scheduling wishes. The objective is to fit the solar production to the electrical consumption of the inhabitants, with an optimal use of the energy in the batteries, while satisfying the users' scheduling requirements as far as possible. In the discrete model, the parameters and solutions of the linear programming system are deterministic scalars, whereas in the stochastic approach the data parameters and the linear programming solutions become random variables, whose distributions can be imposed or estimated from samples of real observations or from samples of optimal discrete equilibrium solutions.
Keywords: photovoltaic production, power consumption, battery storage resources, random variables, stochastic modeling, estimation of probability distributions, mixed integer linear programming, smart micro-grid, self-consumption of electricity
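The production/consumption equilibrium described above can be illustrated with a toy linear program (not the authors' full MILP; the horizon, battery size, and deficit figures are invented for illustration): maximize the demand deficit covered by a battery of limited energy, here via `scipy.optimize.linprog`.

```python
import numpy as np
from scipy.optimize import linprog

def dispatch_battery(deficit_kwh, battery_kwh):
    """Toy LP: choose hourly discharges d_t to cover as much of the
    solar/demand deficit as possible within the battery's stored energy."""
    T = len(deficit_kwh)
    c = -np.ones(T)                           # maximize total energy served
    A_ub = np.ones((1, T))                    # sum of discharges <= battery energy
    b_ub = [battery_kwh]
    bounds = [(0.0, d) for d in deficit_kwh]  # cannot serve more than the deficit
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
    return -res.fun                           # kWh of deficit covered

# Hourly deficit (demand minus solar) over a 4-hour toy horizon, 4 kWh battery:
served = dispatch_battery([2.0, 0.0, 3.0, 1.0], battery_kwh=4.0)
```

The authors' system additionally carries battery state-of-charge dynamics and integer scheduling variables, which a full MILP formulation would add as extra constraints.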
Procedia PDF Downloads 110
793 Functionalized Nanoporous Ceramic Membranes for Electrodialysis Treatment of Harsh Wastewater
Authors: Emily Rabe, Stephanie Candelaria, Rachel Malone, Olivia Lenz, Greg Newbloom
Abstract:
Electrodialysis (ED) is a well-developed technology for ion removal in a variety of applications. However, many industries generate harsh wastewater streams that are incompatible with traditional ion exchange membranes. Membrion® has developed novel ceramic-based ion exchange membranes (IEMs) offering several advantages over traditional polymer membranes: high performance at low pH, chemical resistance to oxidizers, and a rigid structure that minimizes swelling. These membranes are synthesized with our patented silane-based sol-gel techniques. The pore size, shape, and network structure are engineered through a molecular self-assembly process in which thermodynamic driving forces are used to direct where and how pores form. Either cationic or anionic groups can be added within the membrane nanopore structure to create cation- and anion-exchange membranes. The ceramic IEMs are produced on a roll-to-roll manufacturing line with low-temperature processing. Membrane performance testing is conducted using in-house permselectivity, area-specific resistance, and ED stack testing setups. Ceramic-based IEMs show performance comparable to traditional IEMs and offer some unique advantages. Long exposure to highly acidic solutions has a negligible impact on ED performance. Additionally, we have observed stable performance in the presence of strong oxidizing agents such as hydrogen peroxide. This stability is expected, as the ceramic backbone of these materials is already in a fully oxidized state. These data suggest that ceramic membranes, made using sol-gel chemistry, could be an ideal solution for acidic and/or oxidizing wastewater streams from processes such as semiconductor manufacturing and mining.
Keywords: ion exchange, membrane, silane chemistry, nanostructure, wastewater
Procedia PDF Downloads 86
792 Detect Critical Thinking Skill in Written Text Analysis: The Use of Artificial Intelligence in Text Analysis vs. ChatGPT
Authors: Lucilla Crosta, Anthony Edwards
Abstract:
Companies and the marketplace nowadays struggle to find employees with skills adequate to the anticipated growth of their businesses. At least half of workers will need to undertake some form of up-skilling in the next five years in order to remain aligned with the demands of the market. To meet these challenges, there is a clear need to explore the potential uses of AI (artificial intelligence) based tools in assessing the transversal skills (critical thinking, communication, and soft skills of different types in general) of workers and adult students, while empowering them to develop those same skills in a reliable, trustworthy way. Companies seek workers with key transversal skills that can make a difference between workers now and in the future. Critical thinking, however, seems to be one of the most important of these skills, bringing unexplored ideas and company growth in business contexts. What employers have been reporting for years now is that this skill is lacking in the majority of workers and adult students, and this is particularly visible through their writing. This paper investigates how critical thinking and communication skills are currently developed in Higher Education environments, at postgraduate level, through the use of AI tools. It analyses the use of a branch of AI, namely Machine Learning with Big Data, and of Neural Network Analysis. It also examines the acquisition of these skills through AI tools and the effects this has on employability. The paper draws on researchers and studies in Higher Education at both the national (Italy and UK) and international level. The issues associated with the development and use of one specific AI tool, Edulai, are examined in detail.
Finally, comparisons are also made between these tools and the more recent phenomenon of ChatGPT, and their respective benefits and drawbacks are analysed.
Keywords: critical thinking, artificial intelligence, higher education, soft skills, ChatGPT
Procedia PDF Downloads 113
791 Application of a Confirmatory Composite Model for Assessing the Extent of Agricultural Digitalization: A Case of Proactive Land Acquisition Strategy (PLAS) Farmers in South Africa
Authors: Mazwane S., Makhura M. N., Ginege A.
Abstract:
Digitalization in South Africa has received considerable attention from policymakers. The South African government's support for the development of the digital economy has been demonstrated through the enactment of various national policies and strategies. This study sought to develop an index of agricultural digitalization by applying confirmatory composite analysis (CCA). A further aim was to determine the factors that affect the development of digitalization on PLAS farms. Data on the indicators of the three dimensions of digitalization were collected from 300 Proactive Land Acquisition Strategy (PLAS) farms in South Africa using semi-structured questionnaires. Confirmatory composite analysis was employed to reduce the items into three digitalization dimensions and, ultimately, a digitalization index. Standardized digitalization index scores were extracted and fitted to a linear regression model to determine the factors affecting digitalization development. The results revealed that the model shows practical validity and can be used to measure digitalization development, as the measures of fit (geodesic distance, standardized root mean square residual, and squared Euclidean distance) were all below their respective 95% quantiles of bootstrap discrepancies (HI95 values). Therefore, digitalization is an emergent variable that can be measured using CCA. The average level of digitalization on PLAS farms was 0.2 and varied significantly across provinces. The factors that significantly influence digitalization development on PLAS land reform farms were age, gender, farm type, network type, and cellular data type. This should enable researchers and policymakers to understand the level of digitalization and patterns of development, as well as correctly attribute digitalization development to the contributing factors.
Keywords: agriculture, digitalization, confirmatory composite model, land reform, proactive land acquisition strategy, South Africa
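The two-stage workflow described (composite index scores, then a linear regression on the standardized index) can be sketched as follows. The indicator weights and covariates below are invented for illustration; they are not the CCA estimates or survey variables from the PLAS data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stage 1: composite digitalization index as a weighted sum of three
# dimension scores (weights are illustrative, not the estimated CCA loadings).
indicators = rng.random((300, 3))            # 300 farms x 3 dimensions
weights = np.array([0.5, 0.3, 0.2])
index = indicators @ weights
index_std = (index - index.mean()) / index.std()   # standardized index scores

# Stage 2: linear regression of the standardized index on farm covariates
# (here an intercept plus two dummy covariates standing in for age, gender, etc.).
X = np.column_stack([np.ones(300), rng.random((300, 2))])
beta, *_ = np.linalg.lstsq(X, index_std, rcond=None)
```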
Procedia PDF Downloads 65
790 An Evolutionary Perspective on the Role of Extrinsic Noise in Filtering Transcript Variability in Small RNA Regulation in Bacteria
Authors: Rinat Arbel-Goren, Joel Stavans
Abstract:
Cell-to-cell variations in transcript or protein abundance, called noise, may give rise to phenotypic variability between isogenic cells, enhancing the probability of survival under stress conditions. These variations may be introduced by post-transcriptional regulatory processes such as the stoichiometric degradation of target transcripts by non-coding small RNAs in bacteria. As a model system, we study the iron homeostasis network in Escherichia coli, in which the RyhB small RNA regulates the expression of various targets. Using fluorescence reporter genes to detect protein levels and single-molecule fluorescence in situ hybridization to monitor transcript levels in individual cells allows us to compare noise at both the transcript and protein levels. The experimental results and computer simulations show that extrinsic noise, acting through a feed-forward loop configuration, buffers the increase in variability introduced at the transcript level by iron deprivation, illuminating the important role that extrinsic noise plays during stress. Surprisingly, extrinsic noise also decouples the fluctuations of two different targets, in spite of RyhB being a common upstream factor degrading both. Thus, phenotypic variability increases under stress conditions through the decoupling of target fluctuations in the same cell rather than through an increase in the noise of each. We also present preliminary results on the adaptation of cells to prolonged iron deprivation, in order to shed light on the evolutionary role of post-transcriptional downregulation by small RNAs.
Keywords: cell-to-cell variability, Escherichia coli, noise, single-molecule fluorescence in situ hybridization (smFISH), transcript
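The intrinsic/extrinsic distinction the abstract relies on is commonly quantified with the standard dual-reporter decomposition of the squared coefficient of variation; the sketch below illustrates that decomposition on made-up expression counts, not the paper's smFISH data.

```python
import numpy as np

def noise_decomposition(c1, c2):
    """Dual-reporter decomposition of total noise (CV^2) into intrinsic
    and extrinsic components; c1 and c2 are expression levels of two
    identically regulated reporters measured in the same cells."""
    c1, c2 = np.asarray(c1, float), np.asarray(c2, float)
    m1, m2 = c1.mean(), c2.mean()
    intrinsic = np.mean((c1 - c2) ** 2) / (2 * m1 * m2)
    extrinsic = (np.mean(c1 * c2) - m1 * m2) / (m1 * m2)
    return intrinsic, extrinsic

# Perfectly correlated reporters: all variability is extrinsic.
eta_int, eta_ext = noise_decomposition([10, 20, 30], [10, 20, 30])
```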
Procedia PDF Downloads 164
789 Human-Machine Cooperation in Facial Comparison Based on Likelihood Scores
Authors: Lanchi Xie, Zhihui Li, Zhigang Li, Guiqiang Wang, Lei Xu, Yuwen Yan
Abstract:
Image-based facial features can be classified into category recognition features and individual recognition features. Current automated face recognition systems extract a feature vector of specified dimensions from a facial image according to their pre-trained neural network. However, to improve the efficiency of parameter calculation, such algorithms generally reduce image detail by pooling. This operation overlooks details of great concern to forensic experts. In our experiment, we adopted a variety of face recognition algorithms based on deep learning and compared a large number of naturally collected face images with the known frontal ID photos of the same persons. Downscaling and manual handling were performed on the testing images. The results supported that facial recognition algorithms based on deep learning detect structural and morphological information and rarely focus on specific markers such as stains and moles. Overall performance, the distributions of genuine and impostor scores, and likelihood ratios were tested to evaluate the accuracy of the biometric systems and the forensic experts. Experiments showed that the biometric systems were skilled at distinguishing category features, while forensic experts were better at discovering the individual features of human faces. In the proposed approach, fusion was performed at the score level. At the specified false accept rate, the framework achieved a lower false reject rate. This paper contributes to improving the interpretability of objective methods of facial comparison and provides a novel method for human-machine collaboration in this field.
Keywords: likelihood ratio, automated facial recognition, facial comparison, biometrics
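Score-level fusion of a system's likelihood ratio with an examiner's can be illustrated by summing log likelihood ratios under an independence assumption; the numbers below are invented, and the paper's actual fusion scheme may weight the two sources differently.

```python
def fuse_log_lrs(log_lrs):
    """Naive score-level fusion: sum log10 likelihood ratios, assuming the
    automated system and the examiner provide independent evidence."""
    return sum(log_lrs)

system_llr = 2.0    # automated system: LR = 100 in favour of same source
examiner_llr = 1.0  # forensic examiner: LR = 10 in favour of same source
fused_llr = fuse_log_lrs([system_llr, examiner_llr])
fused_lr = 10 ** fused_llr   # combined likelihood ratio
```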
Procedia PDF Downloads 130
788 Adapting Tools for Text Monitoring and for Scenario Analysis Related to the Field of Social Disasters
Authors: Svetlana Cojocaru, Mircea Petic, Inga Titchiev
Abstract:
Humanity is confronted more and more often with different social disasters, which in turn can generate new accidents and catastrophes. To mitigate their consequences, it is important to obtain the earliest possible signals about events that are occurring or may occur, and to prepare the corresponding scenarios that could be applied. Our research is focused on solving two problems in this domain: identifying signals that an accident has occurred or may occur, and mitigating some of the consequences of disasters. To solve the first problem, methods for selecting and processing texts from the Internet were developed. Information in Romanian is of special interest to us. In order to obtain the mentioned tools, several steps were followed, divided into a preparatory stage and a processing stage. Throughout the first stage, we manually collected over 724 news articles, constituting more than 150 thousand words, and classified them into 10 categories of social disasters. Using this information, a controlled vocabulary of more than 300 keywords was elaborated, which will help in the classification and identification of texts related to the field of social disasters. To solve the second problem, the formalism of Petri nets was used. We deal with the problem of evacuating inhabitants in due time. Analysis methods such as the reachability or coverability tree and the invariants technique are used to determine dynamic properties of the modeled systems. To perform a case study of the properties of the evacuation system extended by adding time, the analysis modules of PIPE, such as Generalized Stochastic Petri Net (GSPN) Analysis, Simulation, State Space Analysis, and Invariant Analysis, were used. These modules allowed us to obtain the average number of persons situated in the rooms, as well as other quantitative properties and characteristics of the system's dynamics.
Keywords: lexicon of disasters, modelling, Petri nets, text annotation, social disasters
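The reachability analysis mentioned above can be sketched for a tiny evacuation net; the structure below (two persons moving room → corridor → outside) is invented for illustration and is far simpler than the timed GSPN models analysed in PIPE.

```python
from collections import deque

# Transitions as (consume, produce) token dictionaries over places.
transitions = [
    ({"room": 1}, {"corridor": 1}),     # leave the room
    ({"corridor": 1}, {"outside": 1}),  # exit the building
]

def fire(marking, consume, produce):
    """Fire a transition if enabled; return the new marking, else None."""
    if any(marking.get(p, 0) < n for p, n in consume.items()):
        return None
    m = dict(marking)
    for p, n in consume.items():
        m[p] -= n
    for p, n in produce.items():
        m[p] = m.get(p, 0) + n
    return m

def reachable(initial):
    """Breadth-first enumeration of the reachability set."""
    seen = {tuple(sorted(initial.items()))}
    queue = deque([initial])
    while queue:
        m = queue.popleft()
        for consume, produce in transitions:
            nxt = fire(m, consume, produce)
            if nxt is not None:
                key = tuple(sorted(nxt.items()))
                if key not in seen:
                    seen.add(key)
                    queue.append(nxt)
    return seen

markings = reachable({"room": 2, "corridor": 0, "outside": 0})
```

The coverability tree and invariant techniques the abstract cites extend this enumeration to unbounded nets and to structural properties that hold in every reachable marking.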
Procedia PDF Downloads 197
787 GNSS-Aided Photogrammetry for Digital Mapping
Authors: Muhammad Usman Akram
Abstract:
This research work is based on GNSS-aided photogrammetry for digital mapping. It focuses on the topographic survey of an area or site to be used in future planning and development (P&D), or for further examination, exploration, research, and inspection. Surveying and mapping in hard-to-access and hazardous areas are very difficult using traditional techniques and methodologies; these are also time consuming, labor intensive, and less precise, with limited data. In comparison, the advanced techniques save manpower and provide more precise output with a wide variety of data sets. In this experiment, an aerial photogrammetry technique is used in which a UAV flies over an area, captures geocoded images, and produces a three-dimensional (3-D) model. The UAV operates on a user-specified path or area with various parameters: flight altitude, ground sampling distance (GSD), image overlap, camera angle, etc. For ground control, a network of points on the ground is observed as ground control points (GCPs) using the Differential Global Positioning System (DGPS) in PPK or RTK mode. Furthermore, the raw data collected by the UAV and DGPS are processed in digital image processing programs and computer-aided design software, from which we obtain, as output, a dense point cloud, a digital elevation model (DEM), and an orthophoto. The imagery is converted into geospatial data by digitizing over the orthophoto, and the DEM is further converted into a digital terrain model (DTM) for contour generation or a digital surface. As a result, we obtain a digital map of the surveyed area. In conclusion, the processed data were compared with exact measurements taken on site. The error is accepted if it does not breach the survey accuracy limits set by the concerned institutions.
Keywords: photogrammetry, post-processed kinematic, real-time kinematic, manual data inquiry
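The ground sampling distance listed among the flight parameters follows directly from camera geometry; the sensor figures below are illustrative (roughly a 1-inch UAV camera), not the equipment actually flown in this study.

```python
def ground_sampling_distance(sensor_width_m, image_width_px, focal_length_m, altitude_m):
    """GSD: ground distance covered by one image pixel at nadir."""
    return (sensor_width_m * altitude_m) / (focal_length_m * image_width_px)

# Assumed camera: 13.2 mm sensor width, 5472 px across, 8.8 mm lens, flown at 100 m.
gsd = ground_sampling_distance(13.2e-3, 5472, 8.8e-3, 100.0)  # metres per pixel (~2.7 cm)
```

Flight planning typically inverts this relation: a target GSD fixes the flight altitude for a given camera.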
Procedia PDF Downloads 33
786 Investigation of Different Machine Learning Algorithms in Large-Scale Land Cover Mapping within the Google Earth Engine
Authors: Amin Naboureh, Ainong Li, Jinhu Bian, Guangbin Lei, Hamid Ebrahimy
Abstract:
Large-scale land cover mapping has become a new challenge in the land change and remote sensing fields because it involves large volumes of data. Moreover, selecting the right classification method is quite difficult, especially when there are different types of landscapes in the study area. This paper compares the performance of different machine learning (ML) algorithms for generating a land cover map of the China-Central Asia-West Asia Corridor, considered one of the main parts of the Belt and Road Initiative (BRI) project. The cloud-based Google Earth Engine (GEE) platform was used to generate a land cover map of the study area from Landsat-8 images (2017) by applying three frequently used ML algorithms: random forest (RF), support vector machine (SVM), and artificial neural network (ANN). The selected ML algorithms were trained and tested using reference data obtained from the MODIS yearly land cover product and very high-resolution satellite images. The findings of the study illustrated that, among the three algorithms, RF, with 91% overall accuracy, had the best result in producing a land cover map for the corridor, whereas ANN showed the worst result, with 85% overall accuracy. The strong performance of GEE in applying different ML algorithms and handling huge volumes of remotely sensed data in the present study shows that it could also help researchers generate reliable long-term land cover change maps. The findings of this research are of great importance for decision-makers and the BRI's authorities in strategic land use planning.
Keywords: land cover, Google Earth Engine, machine learning, remote sensing
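Overall accuracy, the metric used above to rank RF, SVM, and ANN, is simply the trace of the confusion matrix divided by its total; the two-class matrix below is invented for illustration, not taken from the corridor results.

```python
import numpy as np

def overall_accuracy(confusion):
    """Fraction of reference pixels the classifier labels correctly.
    Rows: reference class, columns: predicted class."""
    confusion = np.asarray(confusion)
    return confusion.trace() / confusion.sum()

# Toy two-class confusion matrix (e.g. vegetation vs. bare land):
oa = overall_accuracy([[90, 10],
                       [5, 95]])
```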
Procedia PDF Downloads 113
785 Impact of Charging PHEV at Different Penetration Levels on Power System Network
Authors: M. R. Ahmad, I. Musirin, M. M. Othman, N. A. Rahmat
Abstract:
The Plug-in Hybrid Electric Vehicle (PHEV) has gained immense popularity in recent years. PHEVs offer numerous advantages over conventional internal-combustion engine (ICE) vehicles. Millions of PHEVs were estimated to be on the road in the USA by 2020. Uncoordinated PHEV charging is believed to cause severe impacts on the power grid, i.e., feeder, line, and transformer overloads, and voltage drops. Nevertheless, the improper PHEV data models used in such studies may render their findings inappropriate. Although smart charging has become more attractive to researchers in recent years, its implementation is not yet attainable on the street, owing to its requirements for physical infrastructure readiness and technological advancement. As a first step, it is best to study the impact of charging PHEVs based on real vehicle travel data from the National Household Travel Survey (NHTS) and at present charging rates. Due to the current lack of on-street charging stations, charging PHEVs at home is the best option and has been considered in this work. This paper proposes a technique that comprehensively presents the impact of charging PHEVs on power system networks, considering large numbers of PHEV samples with their travel data patterns. A Vehicle Charging Load Profile (VCLP) is developed and implemented in the IEEE 30-bus test system, which represents a portion of the American Electric Power System (Midwestern US). A normalization technique is used to match real-time loads at all buses. Results from the study indicate that charging PHEVs using opportunity charging will have significant impacts on power system networks, especially when larger battery capacities (kWh) are used and at higher penetration levels.
Keywords: plug-in hybrid electric vehicle, transportation electrification, impact of charging PHEV, electricity demand profile, load profile
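A vehicle charging load profile of the kind described can be assembled by stacking each vehicle's charging window onto a 24-hour grid. The charger rating and arrival times below are invented, not drawn from the NHTS sample or the paper's VCLP.

```python
import numpy as np

def charging_load_profile(arrival_hours, durations_h, charger_kw=3.3):
    """Aggregate uncoordinated home-charging load (kW) over a 24-hour day."""
    profile = np.zeros(24)
    for arrival, duration in zip(arrival_hours, durations_h):
        for h in range(arrival, arrival + duration):
            profile[h % 24] += charger_kw  # wrap charging windows past midnight
    return profile

# Two PHEVs plugging in after the evening commute, 2 hours each at 3.3 kW:
profile = charging_load_profile([18, 19], [2, 2])
peak_kw = profile.max()
```

In the paper's setting this profile would then be normalized against the IEEE 30-bus base loads before running the power-flow study.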
Procedia PDF Downloads 288
784 Toxic Masculinity as Dictatorship: Gender and Power Struggles in Tomás Eloy Martínez´s Novels
Authors: Mariya Dzhyoyeva
Abstract:
In the present paper, I examine manifestations of toxic masculinity in the novels by Tomás Eloy Martínez, a post-Boom author, journalist, literary critic, and one of the representatives of the Argentine writing diaspora. I focus on the analysis of Martínez´s characters that display hypermasculine traits to define the relationship between toxic masculinity and power, including the power of authorship and violence as they are represented in his novels. The analysis reveals a complex network in which gender, power, and violence are intertwined and influence and modify each other. As the author exposes toxic masculine behaviors that generate violence, he looks to undermine them. Departing from M. Kimmel´s idea of masculinity as homophobia, I examine how Martínez “outs” his characters by incorporating into the narrative some secret, privileged sources that provide alternative accounts of their otherwise hypermasculine lives. These background stories expose their “weaknesses,” both physical and mental, and thereby feminize them in their own eyes. In a similar way, the toxic masculinity of the fictional male author that wields his power by abusing the written word as he abuses the female character in the story is exposed as a complex of insecurities accumulated by the character due to his childhood trauma. The artistic technique that Martínez uses to condemn the authoritarian male behavior is accessing his subjectivity and subverting it through a multiplicity of identities. Martínez takes over the character’s “I” and turns it into a host of pronouns with a constantly shifting point of reference that distorts not only the notions of gender but also the very notion of identity. In doing so, he takes the character´s affirmation of masculinity to the limit where the very idea of it becomes unsustainable. 
Viewed in the context of Martínez´s own exilic story, the condemnation of toxic masculine power turns into the condemnation of dictatorship and authoritarianism.
Keywords: gender, masculinity, toxic masculinity, authoritarianism, Argentine literature, Martínez
Procedia PDF Downloads 71
783 Comparison of Two Neural Networks to Model Margarine Age and Predict Shelf-Life Using MATLAB
Authors: Phakamani Xaba, Robert Huberts, Bilainu Oboirien
Abstract:
The present study aimed to develop and compare two neural-network-based predictive models to predict the shelf-life/product age of South African margarine, using free fatty acid (FFA) content, water droplet size (D3.3), water droplet distribution (e-sigma), moisture content, peroxide value (PV), anisidine value (AnV), and total oxidation (totox) value as input variables to the model. Brick margarine products of varying ages, from fresh (week 0) to week 47, were sourced. The products, which had been stored at 10 and 25 °C, were characterized. JMP and MATLAB models to predict shelf-life/margarine age were developed, and their performances were compared. The key performance indicators used to evaluate the models were the correlation coefficient (CC), root mean square error (RMSE), and mean absolute percentage error (MAPE) relative to the actual data. The MATLAB-developed model showed better performance on all three indicators: its correlation coefficient was 99.86% versus 99.74% for the JMP model, its RMSE was 0.720 compared to 1.005, and its MAPE was 7.4% compared to 8.571%. The MATLAB model was selected as the most accurate, and the number of hidden neurons/nodes was then optimized to develop a single predictive model. The optimized MATLAB model with 10 neurons showed better performance than the models with 1 and 5 hidden neurons. The developed models can be used by margarine manufacturers, food research institutions, researchers, etc., to predict shelf-life/margarine product age, optimize the addition of antioxidants, extend the shelf-life of products, and proactively troubleshoot problems related to changes that have an impact on the shelf-life of margarine, without conducting expensive trials.
Keywords: margarine shelf-life, predictive modelling, neural networks, oil oxidation
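The three performance indicators used to compare the JMP and MATLAB models can be computed as follows; the toy shelf-life numbers are illustrative, and Python stands in here for the MATLAB code.

```python
import numpy as np

def cc_rmse_mape(actual, predicted):
    """Correlation coefficient (%), root mean square error, and mean
    absolute percentage error (%) of predictions vs. actual values."""
    actual, predicted = np.asarray(actual, float), np.asarray(predicted, float)
    cc = np.corrcoef(actual, predicted)[0, 1] * 100
    rmse = np.sqrt(np.mean((actual - predicted) ** 2))
    mape = np.mean(np.abs((actual - predicted) / actual)) * 100
    return cc, rmse, mape

# Toy margarine ages in weeks vs. a model's predictions:
cc, rmse, mape = cc_rmse_mape([10, 20, 30], [12, 18, 33])
```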
Procedia PDF Downloads 200
782 Efficient Compact Micro Dielectric Barrier Discharge (DBD) Plasma Reactor for Ozone Generation for Industrial Application in Liquid and Gas Phase Systems
Authors: D. Kuvshinov, A. Siswanto, J. Lozano-Parada, W. Zimmerman
Abstract:
Ozone is well known as a powerful, fast-reacting oxidant. Ozone-based processes leave no by-products, as non-reacted ozone reverts to the original oxygen molecule. The application of ozone is therefore widely accepted as one of the main directions for the development of sustainable and clean technologies. A number of technologies require ozone to be delivered to specific points of a production network or reactor construction. Due to space constraints, the high reactivity, and the short lifetime of ozone, the use of ozone generators, even at bench-top scale, is practically limited. This calls for the development of a mini/micro-scale ozone generator that can be directly incorporated into production units. Our report presents a feasibility study of a new micro-scale reactor for ozone generation (MROG). Data on MROG calibration and indigo decomposition at different operating conditions are presented. At the selected operating conditions, with a residence time of 0.25 s, the process of ozone generation is not limited by reaction rate, and the amount of ozone produced is a function of the power applied. It was shown that the MROG is capable of producing ozone at voltages starting from 3.5 kV, with an ozone concentration of 5.28E-6 mol/L at 5 kV. This is in line with data from a numerical investigation of the MROG. It was shown that, compared to a conventional ozone generator, the MROG has lower power consumption at low voltages and atmospheric pressure. The MROG construction makes it applicable to submerged and dry systems. With its robust, compact design, the MROG can be used as an incorporated unit in production lines of high complexity.
Keywords: dielectric barrier discharge (DBD), micro reactor, ozone, plasma
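The reported concentration of 5.28E-6 mol/L can be put into more common industrial units using the molar mass of ozone (48 g/mol):

```python
# Convert the reported ozone concentration (at 5 kV) to mg per cubic metre.
MOLAR_MASS_O3 = 48.0            # g/mol (3 x 16 for O3)
c_mol_per_l = 5.28e-6           # mol/L, reported value
c_mg_per_m3 = c_mol_per_l * MOLAR_MASS_O3 * 1e3 * 1e3  # g -> mg, then L -> m^3
```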
Procedia PDF Downloads 338
781 Hyperparameter Optimization of Deep Convolutional Neural Networks for Pavement Distress Classification
Authors: Oumaima Khlifati, Khadija Baba
Abstract:
Pavement distress is the main factor responsible for the deterioration of road structure durability, vehicle damage, and reduced driver comfort. Transportation agencies spend a high proportion of their funds on pavement monitoring and maintenance. The auscultation of pavement distress has been based on manual surveys, which are extremely time consuming, labor intensive, and require domain expertise. Automatic distress detection is therefore needed to reduce the cost of manual inspection and avoid more serious damage by implementing the appropriate remediation actions at the right time. Inspired by recent deep learning applications, this paper proposes an algorithm for automatic road distress detection and classification using a deep convolutional neural network (DCNN). In this study, the types of pavement distress are classified as transverse or longitudinal cracking, alligator cracking, pothole, and intact pavement. The dataset used in this work is composed of public asphalt pavement images. In order to learn the structure of the different types of distress, the DCNN models are trained and tested on a multi-label classification task. In addition, to obtain the highest accuracy for our model, we adjust the structural optimization hyperparameters, such as the number of convolution and max pooling layers, the number and size of filters, the loss function, the activation functions, and the optimizer, as well as the fine-tuning hyperparameters, including batch size and learning rate. The optimization of the model is executed by checking all feasible combinations and selecting the best-performing one. After the model is optimized, its performance metrics are calculated, describing the training and validation accuracies, precision, recall, and F1 score.
Keywords: pavement distress, hyperparameters, automatic classification, deep learning
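The exhaustive check of all feasible hyperparameter combinations can be sketched with `itertools.product`; the grid and the scoring function below are dummies standing in for the paper's actual DCNN training and validation loop.

```python
import itertools

# Illustrative grid, not the paper's search space.
grid = {
    "learning_rate": [0.1, 0.01, 0.001],
    "batch_size": [16, 32, 64],
    "n_filters": [32, 64],
}

def validation_score(params):
    """Stand-in for training the DCNN with these hyperparameters and
    measuring validation accuracy (a dummy, deterministic score)."""
    return -abs(params["learning_rate"] - 0.01) - abs(params["batch_size"] - 32) / 100

names = list(grid)
best = max(
    (dict(zip(names, values)) for values in itertools.product(*grid.values())),
    key=validation_score,
)
```

In practice each call to the scoring function would train a full network, so the number of combinations is what drives the cost of this exhaustive strategy.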
Procedia PDF Downloads 94
780 A Critical Study on Unprecedented Employment Discrimination and Growth of Contractual Labour Engaged by Rail Industry in India
Authors: Munmunlisa Mohanty, K. D. Raju
Abstract:
The rail industry, one of the model employers in India, has its own national legislation (the Railways Act, 1989) to regulate its vast employment structure, which functions across the country. Indian Railways is not only the premier transport industry of the country; it is Asia's most extensive rail network organisation and the world's second-largest industry functioning under one management. With the growth of the globalization of industrial products, the scope of employment discrimination is no longer confined to the gender aspect only; instead, it extends to the unregularized classification of the labour force in the various industrial establishments in India. The Indian rail industry has inadvertently enhanced such discriminatory employment trends by engaging contractual labour in an unprecedented manner. This engagement of contractual labour dissolves the core employer-employee relationship between rail management and the contractual labour employed through the contractor. The employment trend reduces the cost of production and supervision, discourages contractual labour from forming unions, and reduces its collective bargaining capacity. The primary intention of this paper, then, is to highlight the increasing scope of discriminatory employment for contractual labour engaged by Indian Railways. The paper critically analyses the diminishing employment prospects faced by contractual labour engaged by Indian Railways and demands urgent attention to the probable scope of employment discrimination against such labour. The researchers used a doctrinal methodology in which primary materials (the Railways Act, the Contract Labour Act, and the Occupational Safety, Health and Working Conditions Code, 2020) and secondary data (the CAG Report 2018, Railways employment regulation rules, ILO reports, etc.) are used.
Keywords: anti-employment discrimination, CAG Report, contractual labour, Indian Railways, principal employer
Procedia PDF Downloads 172
779 Nondestructive Acoustic Microcharacterisation of Gamma Irradiation Effects on Sodium Oxide Borate Glass X2Na2O-X2B2O3 by Acoustic Signature
Authors: Ibrahim Al-Suraihy, Abdellaziz Doghmane, Zahia Hadjoub
Abstract:
In this work, we discuss the elastic properties of non-irradiated and irradiated sodium borate glasses X2Na2O-X2B2O3 with 0 ≤ x ≤ 27 (mol %), using acoustic microscopy to measure Rayleigh and longitudinal wave velocities at microscopic resolution. The acoustic material signatures were first measured, from which the characteristic surface velocities were determined. Longitudinal and shear ultrasonic velocities were measured in sodium borate glass samples of different compositions before and after irradiation with γ-rays. Results showed that the effect of increasing sodium oxide content on the ultrasonic velocity appeared more clearly than that of γ-radiation. It was found that as the Na2O content increases, longitudinal velocities vary from 3832 to 5636 m/s in the irradiated samples and from 4010 to 5836 m/s in the highly irradiated samples (10 doses), whereas shear velocities vary from 2223 to 3269 m/s in the irradiated samples and from 2326 m/s at low radiation to 3385 m/s in the highly irradiated samples (10 doses). The effect of increasing sodium oxide content on ultrasonic velocity was very clear. The increase in velocity was attributed to the gradual increase in the rigidity of the glass, and hence the strengthening of the network, due to the gradual change of boron atoms from three-fold to four-fold coordination with oxygen atoms. The ultrasonic velocity data of the glass samples were used to determine the elastic moduli. It was found that ultrasonic velocity, elastic modulus, and microhardness increase with increasing sodium oxide content and increasing γ-radiation dose.
Keywords: mechanical properties of X2Na2O-X2B2O3, acoustic signature, SAW velocities, additives, gamma-radiation dose
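The step from measured ultrasonic velocities to elastic moduli uses the standard isotropic relations; the velocities below are mid-range values from the abstract, and the density is an assumed typical value for sodium borate glass, not a measured one.

```python
def elastic_moduli(v_long, v_shear, density):
    """Isotropic elastic constants from ultrasonic velocities (SI units)."""
    G = density * v_shear ** 2     # shear modulus, Pa
    L = density * v_long ** 2      # longitudinal modulus, Pa
    E = G * (3 * v_long ** 2 - 4 * v_shear ** 2) / (v_long ** 2 - v_shear ** 2)
    return E, G, L                 # Young's, shear, longitudinal moduli

# Mid-range velocities with an assumed density of 2400 kg/m^3:
E, G, L = elastic_moduli(v_long=5000.0, v_shear=3000.0, density=2400.0)
```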
Procedia PDF Downloads 397
778 Incorporation of Growth Factors onto Hydrogels via Peptide Mediated Binding for Development of Vascular Networks
Authors: Katie Kilgour, Brendan Turner, Carly Catella, Michael Daniele, Stefano Menegatti
Abstract:
In vivo, the extracellular matrix (ECM) provides biochemical and mechanical properties that are instructional to resident cells to form complex tissues with characteristics to develop and support vascular networks. In vitro, the development of vascular networks can be guided by biochemical patterning of substrates via spatial distribution and display of peptides and growth factors to prompt cell adhesion, differentiation, and proliferation. We have developed a technique utilizing peptide ligands that specifically bind vascular endothelial growth factor (VEGF), erythropoietin (EPO), or angiopoietin-1 (ANG1) to spatiotemporally distribute growth factors to cells. This allows for the controlled release of each growth factor, ultimately enhancing the formation of a vascular network. Our engineered tissue constructs (ETCs) are fabricated out of gelatin methacryloyl (GelMA), which is an ideal substrate for tailored stiffness and bio-functionality, and covalently patterned with growth factor specific peptides. These peptides mimic growth factor receptors, facilitating the non-covalent binding of the growth factors to the ETC, allowing for facile uptake by the cells. We have demonstrated in the absence of cells the binding affinity of VEGF, EPO, and ANG1 to their respective peptides and the ability for each to be patterned onto a GelMA substrate. The ability to organize growth factors on an ETC provides different functionality to develop organized vascular networks. Our results demonstrated a method to incorporate biochemical cues into ETCs that enable spatial and temporal control of growth factors. Future efforts will investigate the cellular response by evaluating gene expression, quantifying angiogenic activity, and measuring the speed of growth factor consumption.Keywords: growth factor, hydrogel, peptide, angiogenesis, vascular, patterning
Procedia PDF Downloads 165
777 Re-Integrating Historic Lakes into the City Fabric in the Case of Vandiyur Lake, Madurai
Authors: Soumya Pugal
Abstract:
The traditional lake system of an ancient town is a network of water-holding blue spaces, built more than 2000 years ago by the rulers of ancient cities and maintained for centuries by the original communities. These blue spaces form a micro-watershed wherein each individual tank has its own catchment, tank bed area, and command area. The lakes are connected by a common sluice from the upstream tank, thereby feeding the downstream tank. The lakes used to be of socio-economic significance, but the rapid growth of the city, as well as changes in the systems of ownership of the lakes, have turned them into the backyard of urban development. Madurai is one such historic city facing the challenge of balancing the social, ecological, and economic requirements of the people with respect to the traditional lake system. To find a solution to the problems caused by the neglect of a city's vital ecological systems, the theory of transformative placemaking through water-sensitive urban design has been explored. This approach re-invents the relationship between people and urban lakes to suit modern aspirations while respecting the environment. The thesis aims to develop strategies to guide development along the major urban lake of Vandiyur, equipping the lake to meet the growing recreational requirements of the megacity and giving a renewed connection between people and water. The intent of the design is to understand the ecological and social structures of the lake and find ways to use the lake to produce social cohesion within the community and to balance the city's economic and ecological requirements through transformative placemaking and water-sensitive urban design.Keywords: urban lakes, urban blue spaces, placemaking, revitalisation of lakes, urban cohesion
Procedia PDF Downloads 76
776 The Use of Unmanned Aerial System (UAS) in Improving the Measurement System on the Example of Textile Heaps
Authors: Arkadiusz Zurek
Abstract:
The potential of using drones is visible in many areas of logistics, especially for monitoring and controlling many processes. The technologies implemented in the last decade open new possibilities for companies that until now had not even considered them, such as warehouse inventories. Unmanned aerial vehicles are no longer seen as a revolutionary tool for Industry 4.0, but rather as tools in the daily work of factories and logistics operators. The research problem is to develop a method for measuring, by drone, the weight of goods in a selected link of the clothing supply chain. The purpose of this article, however, is to analyze the causes of errors in traditional measurements and then to identify adverse events related to the use of drones for the inventory of a heap of textiles intended for production purposes. On this basis, it will be possible to develop guidelines to eliminate the causes of these events in the drone-based measurement process. In a real environment, work was carried out to determine the volume and weight of textiles, including, among other steps, weighing a textile sample to determine the average density of the assortment, establishing a local geodetic network, terrestrial laser scanning, and a photogrammetric flight using an unmanned aerial vehicle. From the analysis of the measurement data obtained in the facility, the volume and weight of the assortment and the accuracy of their determination were established.
This article presents how such heaps are currently measured and what adverse events occur, describes the photogrammetric techniques so far applied by outdoor drones (for the inventory of wind farms or station construction), and compares them with the measurement of the aforementioned textile heap inside a large-format facility.Keywords: drones, unmanned aerial system, UAS, indoor system, security, process automation, cost optimization, photogrammetry, risk elimination, industry 4.0
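The volume-times-density step described above can be sketched as follows; the heap volume, sample mass, and sample volume are hypothetical illustrations, not figures from the study.

```python
# Sketch of the heap-weight estimate: the heap volume comes from a
# photogrammetric/laser-scan surface model, the average bulk density from
# a weighed textile sample. All numbers are hypothetical illustrations.

def heap_weight_kg(heap_volume_m3, sample_mass_kg, sample_volume_m3):
    density = sample_mass_kg / sample_volume_m3   # average bulk density, kg/m^3
    return heap_volume_m3 * density

# e.g. a 120 m^3 heap, with a 0.05 m^3 sample weighing 11 kg
weight = heap_weight_kg(120.0, 11.0, 0.05)
```

The accuracy of the weight then depends on both the volume determination and how representative the density sample is, which is where the adverse events discussed above enter.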
Procedia PDF Downloads 87
775 Design of a Standard Weather Data Acquisition Device for the Federal University of Technology, Akure Nigeria
Authors: Isaac Kayode Ogunlade
Abstract:
Data acquisition (DAQ) is the process by which physical phenomena from the real world are transformed into electrical signals that are measured and converted into a digital format for processing, analysis, and storage by a computer. The DAQ is designed using a PIC18F4550 microcontroller, communicating with a Personal Computer (PC) through USB (Universal Serial Bus). The research deployed knowledge of data acquisition systems and embedded systems to develop a weather data acquisition device using an LM35 sensor to measure weather parameters, and used artificial intelligence (an Artificial Neural Network, ANN) and a statistical approach (Autoregressive Integrated Moving Average, ARIMA) to predict precipitation (rainfall). The device was placed beside a standard device in the Department of Meteorology, Federal University of Technology, Akure (FUTA) for performance evaluation. Both devices (standard and designed) were subjected to 180 days under the same atmospheric conditions for data mining (temperature, relative humidity, and pressure). The acquired data were trained in the MATLAB R2012b environment using ANN and ARIMA to predict precipitation (rainfall). Root Mean Square Error (RMSE), Mean Absolute Error (MAE), the coefficient of determination (R2), and Mean Percentage Error (MPE) were deployed as standard evaluation metrics to assess the performance of the models in predicting precipitation. The results from the working of the developed device show that the device has an efficiency of 96% and is also compatible with Personal Computers (PCs) and laptops. The simulation results for the acquired data show that the ANN model's precipitation (rainfall) prediction for two months (May and June 2017) revealed a disparity error of 1.59%, while ARIMA's was 2.63%. The device will be useful in research, practical laboratories, and industrial environments.Keywords: data acquisition system, design device, weather development, predict precipitation and (FUTA) standard device
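The four evaluation metrics named above (RMSE, MAE, R2, MPE) can be sketched as plain functions; the observed and predicted rainfall values below are hypothetical, not data from the FUTA campaign.

```python
# Standard error metrics for comparing predicted vs. observed rainfall.
# The sample series are hypothetical illustrations.
import math

def rmse(y, yhat):
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(y, yhat)) / len(y))

def mae(y, yhat):
    return sum(abs(a - b) for a, b in zip(y, yhat)) / len(y)

def r2(y, yhat):
    mean = sum(y) / len(y)
    ss_res = sum((a - b) ** 2 for a, b in zip(y, yhat))
    ss_tot = sum((a - mean) ** 2 for a in y)
    return 1 - ss_res / ss_tot

def mpe(y, yhat):
    # signed percentage error; positive means under-prediction on average
    return 100 * sum((a - b) / a for a, b in zip(y, yhat)) / len(y)

observed  = [12.0, 8.0, 15.0, 10.0]   # hypothetical rainfall (mm)
predicted = [11.5, 8.5, 14.0, 10.5]
```

Applying the same four functions to the ANN and ARIMA outputs gives the kind of model comparison reported in the abstract.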
Procedia PDF Downloads 93
774 Understanding the Semantic Network of Tourism Studies in Taiwan by Using Bibliometrics Analysis
Authors: Chun-Min Lin, Yuh-Jen Wu, Ching-Ting Chung
Abstract:
The formulation of tourism policies requires objective academic research and evidence as support, especially research from local academia. Taiwan is a small island, and its economic growth relies heavily on tourism revenue. The Taiwanese government has been devoted to the promotion of the tourism industry over the past few decades. Scientific research outcomes by Taiwanese scholars may and will help lay the foundations for drafting future tourism policy by the government. In this study, a total of 120 full journal articles published between 2008 and 2016 in the Journal of Tourism and Leisure Studies (JTSL) were examined to explore the trend of scientific tourism research in Taiwan. JTSL is one of the most important Taiwanese journals in the tourism discipline; it focuses on tourism-related issues and uses traditional Chinese as the publication language. The method of co-word analysis from bibliometrics was employed for semantic analysis in this study. When analyzing Chinese words and phrases, word segmentation is a crucial step. It must be carried out initially and precisely in order to obtain meaningful words or word chunks for further frequency calculation. A word segmentation system based on an N-gram algorithm was developed in this study to conduct semantic analysis, and the 100 groups of meaningful phrases with the highest recurrence rates were located. Subsequently, co-word analysis was employed for semantic classification. The results showed that the themes of tourism research in Taiwan in recent years cover tourism education, environmental protection, hotel management, information technology, and senior tourism. The results give insight into the related issues and serve as a reference for tourism-related policy making and follow-up research.Keywords: bibliometrics, co-word analysis, word segmentation, tourism research, policy
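A minimal sketch of the N-gram step described above: slide a fixed-size window over a string of Chinese characters and count recurrent chunks. The toy titles and recurrence threshold are illustrative assumptions, not data from the JTSL corpus.

```python
# N-gram segmentation sketch for unsegmented Chinese text: enumerate all
# character windows of length n and keep the chunks that recur often.
# Sample "titles" and the threshold are hypothetical.
from collections import Counter

def ngrams(text, n):
    return [text[i:i + n] for i in range(len(text) - n + 1)]

def frequent_chunks(texts, n, min_count):
    counts = Counter()
    for t in texts:
        counts.update(ngrams(t, n))
    return [(g, c) for g, c in counts.most_common() if c >= min_count]

docs = ["觀光教育研究", "觀光政策研究"]   # toy titles
top = frequent_chunks(docs, 2, 2)        # bigrams seen at least twice
```

Ranking these recurrent chunks across the corpus yields the kind of high-frequency phrase list the study feeds into co-word analysis.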
Procedia PDF Downloads 229
773 Impact of Urbanization on Natural Drainage Pattern in District of Larkana, Sindh Pakistan
Authors: Sumaira Zafar, Arjumand Zaidi
Abstract:
During past few years, several floods have adversely affected the areas along lower Indus River. Besides other climate related anomalies, rapidly increasing urbanization and blockage of natural drains due to siltation or encroachments are two other critical causes that may be responsible for these disasters. Due to flat topography of river Indus plains and blockage of natural waterways, drainage of storm water takes time adversely affecting the crop health and soil properties of the area. Government of Sindh is taking a keen interest in revival of natural drainage network in the province and has initiated this work under Sindh Irrigation and Drainage Authority. In this paper, geospatial techniques are used to analyze landuse/land-cover changes of Larkana district over the past three decades (1980-present) and their impact on natural drainage system. Satellite derived Digital Elevation Model (DEM) and topographic sheets (recent and 1950) are used to delineate natural drainage pattern of the district. The urban landuse map developed in this study is further overlaid on drainage line layer to identify the critical areas where the natural floodwater flows are being inhibited by urbanization. Rainfall and flow data are utilized to identify areas of heavy flow, whereas, satellite data including Landsat 7 and Google Earth are used to map previous floods extent and landuse/cover of the study area. Alternatives to natural drainage systems are also suggested wherever possible. The output maps of natural drainage pattern can be used to develop a decision support system for urban planners, Sindh development authorities and flood mitigation and management agencies.Keywords: geospatial techniques, satellite data, natural drainage, flood, urbanization
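Delineating drainage lines from a DEM, as described above, is commonly sketched with flow-routing rules such as D8, where each cell drains to its steepest-descent neighbour. The toy elevation grid below illustrates the idea only; it is not data from the Larkana study.

```python
# D8 flow-direction sketch: each DEM cell routes flow to the neighbour
# with the steepest downhill drop (diagonal distance = sqrt(2)).
# The 3x3 elevation grid is a hypothetical illustration.

def d8_downstream(dem, r, c):
    """Return the (row, col) of the steepest-descent neighbour, or None for a sink."""
    best, best_drop = None, 0.0
    for dr in (-1, 0, 1):
        for dc in (-1, 0, 1):
            if dr == 0 and dc == 0:
                continue
            rr, cc = r + dr, c + dc
            if 0 <= rr < len(dem) and 0 <= cc < len(dem[0]):
                dist = (dr * dr + dc * dc) ** 0.5
                drop = (dem[r][c] - dem[rr][cc]) / dist
                if drop > best_drop:
                    best, best_drop = (rr, cc), drop
    return best

dem = [[9.0, 8.0, 7.0],
       [8.0, 6.0, 5.0],
       [7.0, 5.0, 3.0]]
```

Chaining `d8_downstream` calls from every cell traces the flow paths whose accumulation defines the natural drainage network on the DEM.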
Procedia PDF Downloads 510
772 Synthesis and Properties of Oxidized Corn Starch Based Wood Adhesive
Authors: Salise Oktay, Nilgun Kizilcan, Basak Bengu
Abstract:
At present, formaldehyde-based adhesives such as urea-formaldehyde (UF), melamine-formaldehyde (MF), melamine-urea-formaldehyde (MUF), etc. are mostly used in the wood-based panel industry because of their high reactivity, chemical versatility, and economic competitiveness. However, formaldehyde-based wood adhesives are produced from non-renewable resources, and formaldehyde is classified as a probable human carcinogen (Group B1) by the U.S. Environmental Protection Agency (EPA). Therefore, there has been growing interest in the development of environment-friendly, economically competitive, bio-based wood adhesives that meet the requirements of the wood-based panel industry. In this study, an oxidized starch-urea wood adhesive was synthesized as a formaldehyde-free adhesive. In this scope, firstly, acid hydrolysis of corn starch was conducted, and the acid-thinned corn starch was then oxidized using hydrogen peroxide as the oxidizer and CuSO₄ as the catalyst. Secondly, the polycondensation reaction between the oxidized starch and urea was conducted. Finally, nano-TiO₂ was added to the reaction system to strengthen the adhesive network. Solid content, viscosity, and gel time analyses of the prepared adhesive were performed to evaluate its processability. FTIR, DSC, TGA, and SEM characterization techniques were used to investigate the chemical structure, thermal properties, and morphology of the adhesive, respectively. Rheological analysis of the adhesive was also performed. In order to evaluate the quality of the oxidized corn starch-urea adhesive, particleboards were produced at laboratory scale, and mechanical and physical properties of the boards were investigated, such as internal bond, modulus of rupture, modulus of elasticity, and formaldehyde emission.
The obtained results revealed that the oxidized starch-urea adhesive was synthesized successfully and that, with some further development, it can be a good candidate for use in the wood-based panel industry.Keywords: nano-TiO₂, corn starch, formaldehyde emission, wood adhesives
Procedia PDF Downloads 151
771 Theoretical Discussion on the Classification of Risks in Supply Chain Management
Authors: Liane Marcia Freitas Silva, Fernando Augusto Silva Marins, Maria Silene Alexandre Leite
Abstract:
The adoption of a network structure, as in supply chains, increases the dependence between companies and, in consequence, their vulnerability. Environmental disasters, sociopolitical and economic events, and the dynamics of supply chains elevate the uncertainty of their operation, favoring the occurrence of events that can disrupt operations and generate other undesired consequences. Thus, supply chains are exposed to various risks that can influence the profitability of the companies involved, and several previous studies have proposed risk classification models in order to categorize risks and manage them. The objective of this paper is to analyze and discuss thirty of these risk classification models by means of a theoretical survey. The research method adopted for the analysis and discussion includes three phases: the identification of the types of risks proposed in each of the thirty models, the grouping of risks with equivalent concepts associated with their definitions, and the analysis of these risk groups, evaluating their similarities and differences. After these analyses, it was possible to conclude that there are, in fact, more than thirty risk types identified in the supply chain literature, but some of them are identical even though distinct terms are used to characterize them, because researchers adopt different criteria for risk classification. In short, it is observed that some types of risks are identified as risk sources for supply chains, such as demand risk, environmental risk, and safety risk. On the other hand, other types of risks are identified by the consequences that they can generate for supply chains, such as reputation risk, asset depreciation risk, and competitive risk. These results are a consequence of the disagreements among researchers on risk classification, mainly about what constitutes a risk event and what constitutes a consequence of a risk occurrence.
An additional study is in development to clarify how risks are generated and which characteristics of the components of a supply chain lead to the occurrence of risk.Keywords: risks classification, survey, supply chain management, theoretical discussion
Procedia PDF Downloads 634
770 Roundabout Implementation Analyses Based on Traffic Microsimulation Model
Authors: Sanja Šurdonja, Aleksandra Deluka-Tibljaš, Mirna Klobučar, Irena Ištoka Otković
Abstract:
Roundabouts are a common choice in the case of reconstruction of an intersection, whether it is to improve the capacity of the intersection or traffic safety, especially in urban conditions. The regulation for the design of roundabouts is often related to driving culture, the tradition of using this type of intersection, etc. Individual values in the regulation are usually recommended in a wide range (this is the case in Croatian regulation), and the final design of a roundabout largely depends on the designer's experience and his/her choice of design elements. Therefore, before-after analyses are a good way to monitor the performance of roundabouts and possibly improve the recommendations of the regulation. This paper presents a comprehensive before-after analysis of a roundabout on the country road network near Rijeka, Croatia. The analysis is based on a thorough collection of traffic data (operating speeds and traffic load) and design elements data, both before and after the reconstruction into a roundabout. At the chosen location, the roundabout solution aimed to improve capacity and traffic safety. Therefore, the paper analyzed the collected data to see if the roundabout achieved the expected effect. A traffic microsimulation model (VISSIM) of the roundabout was created based on the real collected data, and the influence of the increase of traffic load and different traffic structures, as well as of the selected design elements on the capacity of the roundabout, were analyzed. Also, through the analysis of operating speeds and potential conflicts by application of the Surrogate Safety Assessment Model (SSAM), the traffic safety effect of the roundabout was analyzed. The results of this research show the practical value of before-after analysis as an indicator of roundabout effectiveness at a specific location. 
The application of a microsimulation model provides a practical method for analyzing intersection functionality from a capacity and safety perspective in present and changed traffic and design conditions.Keywords: before-after analysis, operating speed, capacity, design
Procedia PDF Downloads 24
769 LTE Performance Analysis in the City of Bogota Northern Zone for Two Different Mobile Broadband Operators over Qualipoc
Authors: Víctor D. Rodríguez, Edith P. Estupiñán, Juan C. Martínez
Abstract:
The evolution of mobile broadband technologies has increased users' download rates for current services. The evaluation of technical parameters at the link level is of vital importance to validate the quality and veracity of the connection, thus avoiding large losses of data, time, and productivity. Some of these failures may occur between the eNodeB (Evolved Node B) and the user equipment (UE), so the link between the end device and the base station can be observed. LTE (Long Term Evolution) is considered one of the IP-oriented mobile broadband technologies that work stably for data and for VoIP (Voice over IP) on devices that support it. This research presents a technical analysis of the connection and channeling processes between the UE and the eNodeB using the TAC (Tracking Area Code) variables, together with an analysis of performance variables (throughput, Signal to Interference and Noise Ratio (SINR)). Three measurement scenarios were proposed in the city of Bogotá using QualiPoc, where two operators were evaluated (Operator 1 and Operator 2). Once the data were obtained, an analysis of the variables was performed, determining that the data obtained in the transmission modes vary depending on the BLER (Block Error Rate), throughput, and SNR (Signal-to-Noise Ratio) parameters. For both operators, differences in transmission modes were detected, and this is reflected in the quality of the signal. In addition, because the operators work at different frequencies, it can be seen that Operator 1, despite having spectrum in Band 7 (2600 MHz) like Operator 2, is reassigning traffic to a lower band, AWS (1700 MHz); the difference in signal quality relative to the data connection established by Operator 2, and the difference in the transmission modes determined by the eNodeB for Operator 1, are remarkable.Keywords: BLER, LTE, network, qualipoc, SNR
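As a rough link-level sanity check, the throughput and SINR variables analyzed above can be related through the Shannon capacity bound. The bandwidth and SINR below are illustrative values, not measurements from the Bogotá campaign.

```python
# Shannon-capacity bound relating SINR (in dB) to an upper limit on
# achievable throughput for a given channel bandwidth. Values are
# hypothetical illustrations, not measured figures.
import math

def shannon_mbps(bandwidth_hz, sinr_db):
    sinr_linear = 10 ** (sinr_db / 10)          # dB -> linear power ratio
    return bandwidth_hz * math.log2(1 + sinr_linear) / 1e6

# e.g. a 20 MHz LTE channel at 20 dB SINR
cap = shannon_mbps(20e6, 20.0)
```

Measured throughputs well below this bound at a given SINR point to losses elsewhere in the link, such as high BLER or conservative transmission-mode selection by the eNodeB.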
Procedia PDF Downloads 116
768 Flexible PVC Based Nanocomposites With the Incorporation of Electric and Magnetic Nanofillers for the Shielding Against EMI and Thermal Imaging Signals
Authors: H. M. Fayzan Shakir, Khadija Zubair, Tingkai Zhao
Abstract:
Electromagnetic (EM) waves are now used widely: cell phone signals, WiFi signals, wireless telecommunications, etc. all use EM waves, which in turn create EM pollution. EM pollution can have serious effects on both human health and nearby electronic devices. EM waves have electric and magnetic components that disturb the flow of charged particles in both the human nervous system and electronic devices. The shielding of both humans and electronic devices is a prime concern today. EM waves can cause headaches, anxiety, depression and suicidal tendencies, nausea, fatigue, and loss of libido in humans, and malfunctioning in electronic devices. Polyaniline (PANI) and polypyrrole (PPY) were successfully synthesized by chemical polymerization using ammonium persulfate and DBSNa, respectively, as oxidants. Barium ferrites (BaFe) were also prepared using the co-precipitation method and calcined at 1050 °C for 8 h. Nanocomposite thin films with various combinations and compositions of polyvinyl chloride (PVC), PANI, PPY, and BaFe were prepared. X-ray diffraction was first used to confirm the successful fabrication of all nanofillers, a particle size analyzer to measure their exact size, and scanning electron microscopy to examine their shape. According to electromagnetic interference shielding theory, electrical conductivity is the prime property required for EMI shielding, so the 4-probe technique was used to evaluate the DC conductivity of all samples. Samples with high concentrations of PPY and PANI exhibit remarkably increased electrical conductivity due to the formation of an interconnected network structure inside the PVC matrix, which is also confirmed by SEM analysis. Less than 1% transmission was observed across the whole NIR region (700 nm to 2500 nm). Also, an EMI shielding effectiveness below -80 dB (i.e., more than 80 dB of attenuation) was observed in the microwave region (0.1 GHz to 20 GHz).Keywords: nanocomposites, polymers, EMI shielding, thermal imaging
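Shielding effectiveness in dB is simply the log-ratio of incident to transmitted power; the sketch below illustrates the conversion with hypothetical power levels, not the measured data of the study.

```python
# EMI shielding effectiveness as a power ratio in dB.
# The incident/transmitted power values are hypothetical illustrations.
import math

def shielding_effectiveness_db(p_incident, p_transmitted):
    """SE in dB; 80 dB means the transmitted power is 10^-8 of the incident power."""
    return 10 * math.log10(p_incident / p_transmitted)

# e.g. 1 W incident, 10 microwatts transmitted
se = shielding_effectiveness_db(1.0, 1e-5)
```

On this scale, the >80 dB attenuation reported above corresponds to less than one hundred-millionth of the incident microwave power getting through the film.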
Procedia PDF Downloads 108
767 Evaluation of Surface Roughness Condition Using App Roadroid
Authors: Diego de Almeida Pereira
Abstract:
The roughness index of a road is considered the most important parameter of pavement quality, as it is closely related to the comfort and safety of road users. This condition can be established by means of a functional evaluation of pavement surface deviations, measured by the International Roughness Index (IRI), an index that emerged from the international evaluation of pavements coordinated by the World Bank and whose limit value for the acceptance of roads in Brazil is currently 2.7 m/km. This work makes use of the e.IRI parameter, obtained with the Roadroid app for Android smartphones. This application was chosen for its practical user interaction, its own cloud data storage, and the support given to universities around the world. Data were collected for six months, once each month. The studies began in March 2018, the rainy season, which worsens road conditions and also offered the opportunity to follow the damage and the quality of the interventions performed. About 350 kilometers of sections of four federal highways were analyzed (BR-020, BR-040, BR-060, and BR-070), which connect the Federal District (where Brasília is located) and its surroundings; they were chosen for their economic and touristic importance, two being under federal administration and two under private concession. Like much of the road network, the analyzed stretches are surfaced with Hot Mix Asphalt (HMA). Thus, the present research develops a contrastive discussion between the comfort and safety conditions of the roads under private concession, on which users pay a fee to the concessionaires to travel on a road that meets minimum usage requirements, and the quality of service offered on the roads under Federal Government jurisdiction.
Finally, data collected by the National Department of Transport Infrastructure (DNIT) by means of a laser profilometer are contrasted with data obtained by Roadroid, checking the app's applicability, practicality, and cost-effectiveness, considering its limitations.Keywords: roadroid, international roughness index, Brazilian roads, pavement
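The profilometer-versus-Roadroid comparison and the 2.7 m/km acceptance limit described above can be sketched as follows; the per-section IRI values are hypothetical illustrations, not data from the surveyed highways.

```python
# Sketch of the comparison: Roadroid e.IRI readings vs. reference
# profilometer IRI per road section, plus the Brazilian acceptance limit.
# All section values are hypothetical.

ACCEPTANCE_LIMIT = 2.7  # m/km, road-acceptance limit cited in the text

def mean_abs_diff(reference, app):
    return sum(abs(r - a) for r, a in zip(reference, app)) / len(reference)

def sections_over_limit(iri_values):
    return [i for i, v in enumerate(iri_values) if v > ACCEPTANCE_LIMIT]

profilometer = [2.1, 2.6, 3.0, 2.4]   # hypothetical IRI per section (m/km)
roadroid     = [2.3, 2.5, 3.2, 2.2]
bias = mean_abs_diff(profilometer, roadroid)   # average device disagreement
bad  = sections_over_limit(roadroid)           # sections failing the limit
```

A small mean absolute difference between the two series would support using the app as a low-cost proxy for the laser profilometer.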
Procedia PDF Downloads 86
766 Characterization and Correlation of Neurodegeneration and Biological Markers of Model Mice with Traumatic Brain Injury and Alzheimer's Disease
Authors: J. DeBoard, R. Dietrich, J. Hughes, K. Yurko, G. Harms
Abstract:
Alzheimer’s disease (AD) is a predominant type of dementia and is likely a major cause of neural network impairment. The pathogenesis of this neurodegenerative disorder has yet to be fully elucidated. There are currently no known cures for the disease, and the best hope is to detect it early enough to impede its progress. Beyond age and genetics, another prevalent risk factor for AD might be traumatic brain injury (TBI), which has similar neurodegenerative hallmarks. Our research focuses on obtaining information and methods to predict when neurodegenerative effects might occur at a clinical level by observing events at a cellular and molecular level in model mice. First, we introduce our evidence that brain damage can be observed via brain imaging prior to the noticeable loss of neuromuscular control in AD model mice. We then show our evidence that some blood biomarkers might be early predictors of AD in the same model mice. Thus, we were interested to see whether we might be able to predict which mice would show long-term neurodegenerative effects due to differing degrees of TBI, and what level of TBI causes further damage and earlier death in the AD model mice. After applying TBIs via an apparatus designed to induce extremely mild to mild TBIs, wild-type (WT) mice and AD model mice were tested for cognition, neuromuscular control, olfactory ability, blood biomarkers, and brain imaging. Experiments are still in progress, and more results are therefore forthcoming. Preliminary data suggest that neuromotor control and olfactory function diminish for both AD and WT mice after the administration of five consecutive mild TBIs. Seizure activity also increases significantly for both AD and WT mice after the five-TBI treatment.
If future data supports these findings, important implications about the effect of TBI on those at risk for AD might be possible.Keywords: Alzheimer's disease, blood biomarker, neurodegeneration, neuromuscular control, olfaction, traumatic brain injury
Procedia PDF Downloads 141
765 A Critical Reflection of Ableist Methodologies: Approaching Interviews and Go-Along Interviews
Authors: Hana Porkertová, Pavel Doboš
Abstract:
Based on a research project studying the experience of visually disabled people with urban space in the Czech Republic, the conference contribution discusses the limits of social-science methodologies used in sociology and human geography. It draws on actor-network theory, assuming that science does not describe reality but produces it. Methodology connects theory, research questions, ways to answer them (methods), and results. A research design utilizing ableist methodologies can produce ableist realities. Therefore, it was necessary to adjust the methods so that they could mediate blind experience to the scientific community without reproducing ableism. The researchers faced multiple challenges, ranging from questionable validity to how to research experience that differs from that of the researchers who are able-bodied. Finding a suitable theory that could be used as an analytical tool that would demonstrate space and blind experience as multiple, dynamic, and mutually constructed was the first step that could offer a range of potentially productive methods and research questions, as well as bring critically reflected results. Poststructural theory, mainly Deleuze-Guattarian philosophy, was chosen, and two methods were used: interviews and go-along interviews that had to be adjusted to be able to explore blind experience. In spite of a thorough preparation of these methods, new difficulties kept emerging, which exposed the ableist character of scientific knowledge. From the beginning of data collecting, there was an agreement to work in teams with slightly different roles of each of the researchers, which was significant especially during go-along interviews. In some cases, the anticipations of the researchers and participants differed, which led to unexpected and potentially dangerous situations. These were not caused only by the differences between scientific and lay communities but also between able-bodied and disabled people. 
Researchers were sometimes assigned to the assistants’ roles, and this new position – doing research together – required further negotiations, which also opened various ethical questions.Keywords: ableist methodology, blind experience, go-along interviews, research ethics, scientific knowledge
Procedia PDF Downloads 166