Search results for: parent sensitivity
601 Numerical Aeroacoustics Investigation of Eroded and Coated Leading Edge of NACA 64-618 Airfoil
Authors: Zeinab Gharibi, B. Stoevesandt, J. Peinke
Abstract:
Long-term surface erosion of wind turbine blades, especially at the leading edge, impairs aerodynamic performance and therefore lowers the efficiency of the blades, mostly in the high-speed rotor tip regions. Blade protection provides significant improvements in annual energy production, reduces costly downtime, and protects the integrity of the blades. However, this protection still influences the aerodynamic behavior and the broadband noise caused by the interaction between the impinging turbulence and the blade's leading edge. This paper presents an extensive numerical aeroacoustics approach by investigating the sound power spectra of the eroded and coated NACA 64-618 wind turbine airfoil and evaluates the aeroacoustic improvements after the protection procedure. Using computational fluid dynamics (CFD), different quasi-2D numerical grids were implemented, and special attention was paid to the refinement of the boundary layers. The noise sources were captured and decoupled from acoustic propagation via the derived formulation of Curle's analogy implemented in OpenFOAM. The noise spectra were then compared for clean, coated and eroded profiles in the range of chord-based Reynolds numbers (1.6e6 ≤ Re ≤ 11.5e6). The angle of attack was zero in all cases. Verifications were conducted for the clean profile using available experimental data. Sensitivity studies for the far field were performed at different observer positions. Furthermore, beamforming studies were carried out simulating an Archimedean spiral microphone array for far-field noise directivity patterns. Comparing the noise spectra of the coated and eroded geometries, results show that coating clearly improves the aerodynamic and acoustic performance of the eroded airfoil.
Keywords: computational fluid dynamics, computational aeroacoustics, leading edge, OpenFOAM
Procedia PDF Downloads 223
600 Storage System Validation Study for Raw Cocoa Beans Using Minitab® 17 and R (R-3.3.1)
Authors: Anthony Oppong Kyekyeku, Sussana Antwi-Boasiako, Emmanuel De-Graft Johnson Owusu Ansah
Abstract:
In this observational study, the performance of a known conventional storage system was tested and evaluated for fitness for its intended purpose. The system has a scope extended to the storage of dry cocoa beans. System sensitivity, reproducibility and uncertainties are not known in detail. This study discusses the system performance in the context of existing literature on factors that influence the quality of cocoa beans during storage. Controlled conditions were defined precisely for the system to give a reliable baseline within specific established procedures. Minitab® 17 and R statistical software (R-3.3.1) were used for the statistical analyses. The approach to the storage system testing was to observe and compare, through laboratory test methods, the quality of the cocoa bean samples before and after storage. The samples were kept in Kilner jars, and the temperature of the storage environment was controlled and monitored over a period of 408 days. Standard test methods used in the international trade of cocoa, such as the cut test analysis, moisture determination with the Aqua boy KAM III model and bean count determination, were used for quality assessment. The data analysis treated the entire population as a sample in order to establish a reliable baseline for the data collected. The study found a statistically significant mean value at the 95% Confidence Interval (CI) for the performance data analysed before and after storage for all variables observed. Correlational graphs showed a strong positive correlation for all variables investigated, with the exception of All Other Defects (AOD). The weak relationship between the before and after data for AOD had an explained variability of 51.8%, with the unexplained variability attributable to the uncontrolled condition of hidden infestation before storage. The current study concluded with a high-performance criterion for the storage system.
Keywords: benchmarking performance data, cocoa beans, hidden infestation, storage system validation
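The before/after comparison described above can also be reproduced outside Minitab or R; the following Python sketch is an assumption (the study itself used Minitab® 17 and R-3.3.1) and uses hypothetical values to illustrate the paired comparison and correlation analysis applied to each quality variable.

```python
# Sketch of the before/after storage comparison described above.
# Hypothetical measurements; the study's own data are not reproduced here.
import numpy as np
from scipy import stats

before = np.array([92.0, 89.5, 94.0, 91.0, 90.5])  # e.g. bean count per 100 g before storage
after = np.array([90.5, 88.0, 93.5, 90.0, 89.0])   # same samples after 408 days

# Paired t-test at the 95% confidence level (alpha = 0.05)
t_stat, p_value = stats.ttest_rel(before, after)

# Pearson correlation between before and after values; r**2 is the explained variability
r, _ = stats.pearsonr(before, after)

print(f"paired t = {t_stat:.3f}, p = {p_value:.3f}")
print(f"r = {r:.3f}, explained variability = {100 * r**2:.1f}%")
```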
Procedia PDF Downloads 174
599 Implications of Optimisation Algorithm on the Forecast Performance of Artificial Neural Network for Streamflow Modelling
Authors: Martins Y. Otache, John J. Musa, Abayomi I. Kuti, Mustapha Mohammed
Abstract:
The performance of an artificial neural network (ANN) is contingent on a host of factors, for instance, the network optimisation scheme. In view of this, the study examined the general implications of the ANN training optimisation algorithm on its forecast performance. To this end, the Bayesian regularisation (Br), Levenberg-Marquardt (LM), and adaptive-learning gradient descent with momentum (GDM) algorithms were employed under different ANN structural configurations: (1) single-hidden-layer and (2) double-hidden-layer feedforward back-propagation networks. Results obtained revealed generally that the GDM optimisation algorithm, with its adaptive learning capability, used a relatively shorter time in both training and validation phases compared to the LM and Br algorithms, though learning may not be consummated; this held in all instances, including the prediction of extreme flow conditions for 1-day and 5-day ahead forecasts. In specific statistical terms, average model performance efficiency using the coefficient of efficiency (CE) statistic was Br: 98%, 94%; LM: 98%, 95%; and GDM: 96%, 96%, respectively, for the training and validation phases. However, on the basis of relative error distribution statistics (MAE, MAPE, and MSRE), GDM performed better than the others overall. Based on the findings, it is imperative to state that the adoption of ANN for real-time forecasting should employ training algorithms that do not carry the computational overhead of LM, which requires computation of the Hessian matrix and protracted time and is sensitive to initial conditions; to this end, Br and other forms of gradient descent with momentum should be adopted, considering overall time expenditure and quality of the forecast as well as mitigation of network overfitting. On the whole, it is recommended that evaluation should consider the implications of (i) data quality and quantity and (ii) transfer functions on the overall network forecast performance.
Keywords: streamflow, neural network, optimisation, algorithm
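For reference, the evaluation statistics named above can be computed from observed and simulated flows as in the sketch below; the arrays are hypothetical, and the exact MSRE definition used by the authors is assumed here to be the mean squared relative error.

```python
# Forecast evaluation metrics named above: coefficient of efficiency (CE),
# MAE, MAPE and MSRE. Observed/simulated flows are illustrative only.
import numpy as np

def evaluate(observed, simulated):
    obs, sim = np.asarray(observed, float), np.asarray(simulated, float)
    residual = obs - sim
    ce = 1.0 - np.sum(residual**2) / np.sum((obs - obs.mean())**2)  # Nash-Sutcliffe form of CE
    mae = np.mean(np.abs(residual))
    mape = 100.0 * np.mean(np.abs(residual / obs))
    msre = np.mean((residual / obs)**2)  # assumed definition of MSRE
    return {"CE": ce, "MAE": mae, "MAPE": mape, "MSRE": msre}

observed = [12.4, 15.1, 9.8, 22.6, 18.3]   # streamflow, e.g. m^3/s
simulated = [11.9, 15.8, 10.2, 21.0, 17.9]
print(evaluate(observed, simulated))
```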
Procedia PDF Downloads 152
598 Evaluating Habitat Manipulation as a Strategy for Rodent Control in Agricultural Ecosystems of Pothwar Region, Pakistan
Authors: Nadeem Munawar, Tariq Mahmood
Abstract:
Habitat manipulation is an important technique that can be used for controlling rodent damage in agricultural ecosystems. It involves the intentional manipulation of vegetation cover in adjacent habitats around the active burrows of rodents to reduce shelter and food availability and to increase predation pressure. The current study was conducted in the Pothwar Plateau during the respective non-crop periods of wheat-groundnut (post-harvest and un-ploughed/non-crop fallow lands) with the aim of assessing the impact of a reduction in the vegetation height of adjacent habitats (field borders) on rodent richness and abundance. The study area was divided into two sites, viz. treated and non-treated. At the treated sites, habitat manipulation was carried out by removing crop cache and non-crop vegetation over 10 cm in height to a distance of approximately 20 m from the fields. The trapping sessions carried out at the treated and non-treated sites adjacent to wheat-groundnut fields differed significantly from each other (F 2, 6 = 13.2, P = 0.001), revealing that the maximum number of rodents was captured from non-treated sites. There was a significant difference in the overall abundance of rodents (P < 0.05) between crop stages and between treatments in both crops. The manipulation effect on crop damage and yield was significant, resulting in a reduction of damage within the associated croplands (P < 0.05). The outcomes of this study indicated a significant reduction in the rodent population at treated sites due to changes in vegetation height and cover, which affect important components, i.e., food, shelter and movements, and increase risk sensitivity in their feeding behavior; therefore, rodents were unable to reach levels at which they cause significant crop damage. This method is recommended as a cost-effective and easily applied control measure.
Keywords: agricultural ecosystems, crop damage, habitat manipulation, rodents, trapping
Procedia PDF Downloads 165
597 Sensing of Cancer DNA Using Resonance Frequency
Authors: Sungsoo Na, Chanho Park
Abstract:
Lung cancer is one of the most common severe diseases leading to human death. Lung cancer can be divided into two types, small-cell lung cancer (SCLC) and non-small-cell lung cancer (NSCLC), and about 80% of lung cancers are NSCLC. Several studies have investigated the correlation between the epidermal growth factor receptor (EGFR) and NSCLCs. Therefore, EGFR inhibitor drugs such as gefitinib and erlotinib have been used as lung cancer treatments. However, the treatments showed low response rates (10–20%) in clinical trials due to EGFR mutations that cause drug resistance. Patients with resistance to EGFR inhibitor drugs are usually positive for KRAS mutations. Therefore, assessment of EGFR and KRAS mutations is essential for targeted therapies of NSCLC patients. In order to overcome the limitation of conventional therapies, overall EGFR and KRAS mutations have to be monitored. In this work, only the detection of EGFR is presented. A variety of techniques have been presented for the detection of EGFR mutations. The standard detection method for EGFR mutations in ctDNA relies on real-time polymerase chain reaction (PCR). The real-time PCR method provides highly sensitive detection performance. However, as the amplification steps increase, cost and complexity increase as well. Other types of technology, such as BEAMing, next-generation sequencing (NGS), electrochemical sensors and silicon nanowire field-effect transistors, have been presented. However, those technologies have limitations of low sensitivity, high cost and complexity of data analysis. In this report, we propose a label-free and highly sensitive detection method for lung cancer using a quartz crystal microbalance based platform. The proposed platform is able to sense lung cancer mutant DNA with a limit of detection of 1 nM.
Keywords: cancer DNA, resonance frequency, quartz crystal microbalance, lung cancer
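For orientation, the mass-to-frequency relation that underlies quartz crystal microbalance sensing is commonly expressed by the Sauerbrey equation; the form below is the standard textbook expression and is quoted here as background only, since the abstract does not state which model the authors applied:

\[ \Delta f = -\frac{2 f_0^{2}}{A\sqrt{\rho_q \mu_q}}\,\Delta m \]

where \(f_0\) is the fundamental resonance frequency of the crystal, \(A\) the electrode area, \(\rho_q\) and \(\mu_q\) the density and shear modulus of quartz, and \(\Delta m\) the adsorbed mass (here, hybridized mutant DNA).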
Procedia PDF Downloads 233
596 A Computational Approach for the Prediction of Relevant Olfactory Receptors in Insects
Authors: Zaide Montes Ortiz, Jorge Alberto Molina, Alejandro Reyes
Abstract:
Insects are extremely successful organisms. A sophisticated olfactory system is in part responsible for their survival and reproduction. The detection of volatile organic compounds can positively or negatively affect many behaviors in insects. Compounds such as carbon dioxide (CO2), ammonium, indole, and lactic acid are essential for many species of mosquitoes, like Anopheles gambiae, in order to locate vertebrate hosts. For instance, in A. gambiae, the olfactory receptor AgOR2 is strongly activated by indole, which accounts for almost 30% of human sweat. On the other hand, in some insects of agricultural importance, the detection and identification of pheromone receptors (PRs) in lepidopteran species have become a promising field for integrated pest management. For example, with the disruption of the pheromone receptor BmOR1, mediated by transcription activator-like effector nucleases (TALENs), the sensitivity to bombykol was completely removed, affecting the pheromone-source searching behavior in male moths. Thus, the detection and identification of olfactory receptors in the genomes of insects are fundamental to improving our understanding of ecological interactions and to providing alternatives in integrated pest and vector management. Hence, the objective of this study is to propose a bioinformatic workflow to enhance the detection and identification of potential olfactory receptors in the genomes of relevant insects. Applying hidden Markov models (HMMs) and different computational tools, potential candidates for pheromone receptors in Tuta absoluta were obtained, as well as potential carbon dioxide receptors in Rhodnius prolixus, the main vector of Chagas disease. This study showed the validity of a bioinformatic workflow with the potential to improve the identification of certain olfactory receptors in different orders of insects.
Keywords: bioinformatic workflow, insects, olfactory receptors, protein prediction
Procedia PDF Downloads 149
595 Luminescent Functionalized Graphene Oxide Based Sensitive Detection of Deadly Explosive TNP
Authors: Diptiman Dinda, Shyamal Kumar Saha
Abstract:
In the 21st century, sensitive and selective detection of trace amounts of explosives has become a serious problem. Generally, nitro compounds and their derivatives are used worldwide to prepare different explosives. Recently, TNP (2,4,6-trinitrophenol) has become the most commonly used constituent for preparing powerful explosives all over the world. It is even more powerful than TNT or RDX. As explosives are electron deficient in nature, it is very difficult to detect one separately from a mixture. Again, due to its tremendous water solubility, detection of TNP in the presence of other explosives in water is very challenging. Simple instrumentation, cost-effectiveness, speed and high sensitivity make fluorescence-based optical sensing a grand success compared to other techniques. Graphene oxide (GO), with a large number of epoxy groups, incorporates localized non-radiative electron-hole centres on its surface and therefore gives very weak fluorescence. In this work, GO is functionalized with 2,6-diaminopyridine to remove those epoxy groups through an SN2 reaction. This turns GO into a bright blue luminescent fluorophore (DAP/rGO), which shows an intense PL spectrum at ∼384 nm when excited at a 309 nm wavelength. We have also characterized the material by FTIR, XPS, UV, XRD and Raman measurements. Using this as the fluorophore, a large fluorescence quenching (96%) is observed after the addition of only 200 µL of 1 mM TNP in water solution. Other nitro explosives give very moderate PL quenching compared to TNP. Such high selectivity is related to the operation of a FRET mechanism from the fluorophore to TNP during the PL quenching experiment. TCSPC measurements also reveal that the lifetime of DAP/rGO drastically decreases from 3.7 to 1.9 ns after the addition of TNP. Our material is also quite sensitive to TNP down to the 125 ppb level. Finally, we believe that this graphene-based luminescent material will emerge as a new class of sensing material to detect trace amounts of explosives from aqueous solution.
Keywords: graphene, functionalization, fluorescence quenching, FRET, nitroexplosive detection
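Two standard relations lie behind the quenching and lifetime figures quoted above and are reproduced here as a reading aid; the abstract does not state the exact analysis, so this is background rather than the authors' method. The quenching fraction is \((F_0 - F)/F_0\), reported as 96%, and an energy-transfer (FRET-type) efficiency can be estimated from the TCSPC lifetimes as

\[ E = 1 - \frac{\tau_{DA}}{\tau_{D}} \approx 1 - \frac{1.9\ \text{ns}}{3.7\ \text{ns}} \approx 0.49 \]

where \(\tau_D\) and \(\tau_{DA}\) are the fluorophore lifetimes in the absence and presence of the quencher, respectively.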
Procedia PDF Downloads 440
594 Nanomaterial Based Electrochemical Sensors for Endocrine Disrupting Compounds
Authors: Gaurav Bhanjana, Ganga Ram Chaudhary, Sandeep Kumar, Neeraj Dilbaghi
Abstract:
The main sources of endocrine disrupting compounds in the ecosystem are hormones, pesticides, phthalates, flame retardants, dioxins, personal-care products, coplanar polychlorinated biphenyls (PCBs), bisphenol A, and parabens. These endocrine disrupting compounds are responsible for learning disabilities, brain development problems, deformations of the body, cancer, reproductive abnormalities in females and decreased sperm count in human males. Although the discharge of these chemical compounds into the environment cannot be stopped, their amount can be reduced through proper evaluation and detection techniques. The available techniques for the determination of these endocrine disrupting compounds mainly include high performance liquid chromatography (HPLC), mass spectrometry (MS) and gas chromatography-mass spectrometry (GC–MS). These techniques are accurate and reliable but have certain limitations, such as the need for skilled personnel, long analysis times, interference and the requirement of pretreatment steps. Moreover, these techniques are laboratory-bound, and samples are required in large amounts for analysis. In view of the above facts, new methods for the detection of endocrine disrupting compounds should be devised that promise high specificity, ultra-sensitivity, cost-effectiveness, efficiency and easy-to-operate procedures. Nowadays, electrochemical sensors/biosensors modified with nanomaterials are gaining high attention among researchers. The biological element present in such systems makes the developed sensors selective towards the analyte of interest. Nanomaterials provide a large surface area, high electron communication features, enhanced catalytic activity and possibilities for chemical modification. In most cases, nanomaterials also serve as an electron mediator or electrocatalyst for some analytes.
Keywords: electrochemical, endocrine disruptors, microscopy, nanoparticles, sensors
Procedia PDF Downloads 273
593 Meet Automotive Software Safety and Security Standards Expectations More Quickly
Authors: Jean-François Pouilly
Abstract:
This study addresses the growing complexity of embedded systems and the critical need for secure, reliable software. Traditional cybersecurity testing methods, often conducted late in the development cycle, struggle to keep pace. This talk explores how formal methods, integrated with advanced analysis tools, empower C/C++ developers to: 1) Proactively address vulnerabilities and bugs, using formal methods and abstract interpretation techniques to identify potential weaknesses early in the development process, reducing the reliance on penetration and fuzz testing in later stages. 2) Streamline development by focusing on bugs that matter; with close to no false positives and flaws caught earlier, the need for rework and retesting is minimized, leading to faster development cycles, improved efficiency and cost savings. 3) Enhance software dependability; combining static analysis by abstract interpretation, with full context sensitivity and hardware memory awareness, allows for a more comprehensive understanding of potential vulnerabilities, leading to more dependable and secure software. This approach aligns with industry best practices (ISO 26262 and ISO 21434) and empowers C/C++ developers to deliver robust, secure embedded systems that meet the demands of today's and tomorrow's applications. We will illustrate this approach with the TrustInSoft analyzer to show how it accelerates verification for complex cases, reduces user fatigue, and improves developer efficiency, cost-effectiveness, and software cybersecurity. In summary, integrating formal methods and sound analyzers enhances software reliability and cybersecurity, streamlining development in an increasingly complex environment.
Keywords: safety, cybersecurity, ISO 26262, ISO 21434, formal methods
Procedia PDF Downloads 19
592 Quantitative Evaluation of Supported Catalysts Key Properties from Electron Tomography Studies: Assessing Accuracy Using Material-Realistic 3D-Models
Authors: Ainouna Bouziane
Abstract:
The ability of electron tomography to recover the 3D structure of catalysts, with spatial resolution at the subnanometer scale, has been widely explored and reviewed in the last decades. A variety of experimental techniques, based either on Transmission Electron Microscopy (TEM) or Scanning Transmission Electron Microscopy (STEM), have been used to reveal different features of nanostructured catalysts in 3D, but High Angle Annular Dark Field imaging in STEM mode (HAADF-STEM) stands out as the most frequently used, given its chemical sensitivity and avoidance of imaging artifacts related to diffraction phenomena when dealing with crystalline materials. In this regard, our group has developed a methodology that combines image denoising by undecimated wavelet transforms (UWT) with automated, advanced segmentation procedures and parameter selection methods using CS-TVM (compressed sensing total variation minimization) algorithms to reveal more reliable quantitative information from 3D characterization studies. However, evaluating the accuracy of the magnitudes estimated from the segmented volumes is also an important issue that has not been properly addressed yet, because a perfectly known reference is needed. The problem becomes particularly complicated in the case of multicomponent material systems. To tackle this key question, we have developed a methodology that incorporates volume reconstruction/segmentation methods. In particular, we have established an approach to evaluate, in quantitative terms, the accuracy of TVM reconstructions, which considers the influence of relevant experimental parameters such as the range of tilt angles, image noise level or object orientation. The approach is based on the analysis of material-realistic 3D phantoms, which include the most relevant features of the system under analysis.
Keywords: electron tomography, supported catalysts, nanometrology, error assessment
Procedia PDF Downloads 88
591 Domain-Specific Deep Neural Network Model for Classification of Abnormalities on Chest Radiographs
Authors: Nkechinyere Joy Olawuyi, Babajide Samuel Afolabi, Bola Ibitoye
Abstract:
This study collected a preprocessed dataset of chest radiographs and formulated a deep neural network model for detecting abnormalities. It also evaluated the performance of the formulated model and implemented a prototype of it. This was with the view to developing a deep neural network model to automatically classify abnormalities in chest radiographs. In order to achieve the overall purpose of this research, a large set of chest x-ray images was sourced and collected from the CheXpert dataset, an online repository of annotated chest radiographs compiled by the Machine Learning Research Group, Stanford University. The chest radiographs were preprocessed into a format that can be fed into a deep neural network. The preprocessing techniques used were standardization and normalization. The classification problem was formulated as a multi-label binary classification model, which used a convolutional neural network architecture to decide whether an abnormality was present or not in the chest radiographs. The classification model was evaluated using specificity, sensitivity, and Area Under Curve (AUC) score as the parameters. A prototype of the classification model was implemented using the Keras open-source deep learning framework in the Python programming language. The AUC ROC curve of the model was able to classify Atelectasis, Support devices, Pleural effusion, Pneumonia, a normal CXR (no finding), Pneumothorax, and Consolidation. However, Lung opacity and Cardiomegaly had a probability of less than 0.5 and thus were classified as absent. Precision, recall, and F1 score values were 0.78; this implies that the numbers of false positives and false negatives are the same, revealing some measure of label imbalance in the dataset. The study concluded that the developed model is sufficient to classify abnormalities present in chest radiographs as present or absent.
Keywords: transfer learning, convolutional neural network, radiograph, classification, multi-label
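A minimal sketch of the kind of multi-label binary classifier described above is given below, using the Keras framework mentioned in the abstract; the architecture, image size and label count are illustrative assumptions, not the authors' actual configuration.

```python
# Minimal multi-label CNN sketch in Keras (illustrative, not the study's exact model).
import tensorflow as tf

NUM_LABELS = 9  # e.g. atelectasis, support devices, pleural effusion, pneumonia, ...

model = tf.keras.Sequential([
    # Preprocessed (standardized/normalized) grayscale radiograph; size is assumed.
    tf.keras.layers.Conv2D(32, 3, activation="relu", input_shape=(224, 224, 1)),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Conv2D(64, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(128, activation="relu"),
    # Sigmoid outputs: each label is an independent present/absent decision,
    # thresholded at 0.5 as described in the abstract.
    tf.keras.layers.Dense(NUM_LABELS, activation="sigmoid"),
])

model.compile(
    optimizer="adam",
    loss="binary_crossentropy",  # multi-label binary formulation
    metrics=[tf.keras.metrics.AUC(multi_label=True, num_labels=NUM_LABELS)],
)
model.summary()
```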
Procedia PDF Downloads 129
590 Mondoc: Informal Lightweight Ontology for Faceted Semantic Classification of Hypernymy
Authors: M. Regina Carreira-Lopez
Abstract:
Lightweight ontologies seek to make concrete the union relationships between a parent node and a secondary node, also called a "child node". This logic relation (L) can be formally defined as a triple ontological relation (LO) equivalent to LO in ⟨LN, LE, LC⟩, where LN represents a finite set of nodes (N); LE is a set of entities (E), each of which represents a relationship between nodes to form a rooted tree of ⟨LN, LE⟩; and LC is a finite set of concepts (C), encoded in a formal language (FL). Mondoc enables more refined searches on semantic and classified facets for retrieving specialized knowledge about Atlantic migrations, from the Declaration of Independence of the United States of America (1776) to the end of the Spanish Civil War (1939). The model looks to increase documentary relevance by applying an inverse frequency of co-occurrent hypernymy phenomena to a concrete dataset of textual corpora, with the RMySQL package. Mondoc profiles archival utilities implementing SQL programming code and allows data export to XML schemas, for achieving semantic and faceted analysis of speech by analyzing keywords in context (KWIC). The methodology applies random and unrestricted sampling techniques with RMySQL to verify the resonance phenomena of inverse documentary relevance between the numbers of co-occurrences of the same term (t) in more than two documents of a set of texts (D). Secondly, the research also evidences that co-associations between (t) and their corresponding synonyms and antonyms (synsets) are also inverse. The results from grouping facets or polysemic words with synsets in more than two textual corpora within their syntagmatic context (nouns, verbs, adjectives, etc.) indicate how to proceed with semantic indexing of hypernymy phenomena for subject-heading lists and for authority lists for documentary and archival purposes. Mondoc contributes to the development of web directories and seems to achieve a proper and more selective search of e-documents (classification ontology). It can also foster online catalogue production for semantic authorities, or concepts, through XML schemas, because its applications could be used for implementing data models, by a prior adaptation of the base ontology to structured meta-languages, such as OWL and RDF (descriptive ontology). Mondoc serves the classification of concepts and applies a semantic indexing approach on facets. It enables information retrieval, as well as quantitative and qualitative data interpretation. The model reproduces a tuple ⟨LN, LE, LT, LCF L, BKF⟩ where LN is a set of entities that connect with other nodes to form a rooted tree in ⟨LN, LE⟩, LT specifies a set of terms, and LCF acts as a finite set of concepts, encoded in a formal language, L. Mondoc only resolves partial problems of linguistic ambiguity (in the case of synonymy and antonymy), but neither the pragmatic dimension of natural language nor the cognitive perspective is addressed. To achieve this goal, forthcoming programming developments should target oriented meta-languages with structured documents in XML.
Keywords: hypernymy, information retrieval, lightweight ontology, resonance
Procedia PDF Downloads 125
589 An Automated Stock Investment System Using Machine Learning Techniques: An Application in Australia
Authors: Carol Anne Hargreaves
Abstract:
A key issue in stock investment is how to select representative features for stock selection. The objective of this paper is firstly to determine whether an automated stock investment system, using machine learning techniques, may be used to identify a portfolio of growth stocks that are highly likely to provide returns better than the stock market index. The second objective is to identify the technical features that best characterize whether a stock’s price is likely to go up, and to identify the most important factors and their contribution to predicting the likelihood of the stock price going up. Unsupervised machine learning techniques, such as cluster analysis, were applied to the stock data to identify a cluster of stocks that was likely to go up in price – portfolio 1. Next, the principal component analysis technique was used to select stocks that were rated high on component one and component two – portfolio 2. Thirdly, a supervised machine learning technique, the logistic regression method, was used to select stocks with a high probability of their price going up – portfolio 3. The predictive models were validated with metrics such as sensitivity (recall), specificity and overall accuracy for all models. All accuracy measures were above 70%. All portfolios outperformed the market by more than eight times. The top three stocks were selected for each of the three stock portfolios and traded in the market for one month. After one month, the return for each stock portfolio was computed and compared with the stock market index returns. The returns were 23.87% for the principal component analysis stock portfolio, 11.65% for the logistic regression portfolio and 8.88% for the K-means cluster portfolio, while the stock market performance was 0.38%. This study confirms that an automated stock investment system using machine learning techniques can identify top performing stock portfolios that outperform the stock market.
Keywords: machine learning, stock market trading, logistic regression, cluster analysis, factor analysis, decision trees, neural networks, automated stock investment system
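The three portfolio-selection steps described above (clustering, principal components, logistic regression) can be sketched with standard machine learning tooling as follows; the feature names, data and thresholds are placeholders, not the study's actual variables.

```python
# Sketch of the three selection steps described above (illustrative only).
import numpy as np
import pandas as pd
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans
from sklearn.decomposition import PCA
from sklearn.linear_model import LogisticRegression

# Hypothetical technical features per stock; 'up' = 1 if the price rose in the next period.
rng = np.random.default_rng(0)
stocks = pd.DataFrame({
    "momentum": rng.normal(size=200),
    "volatility": np.abs(rng.normal(size=200)),
    "volume_change": rng.normal(size=200),
    "up": rng.integers(0, 2, 200),
})
X = StandardScaler().fit_transform(stocks[["momentum", "volatility", "volume_change"]])

# Portfolio 1: unsupervised clustering to find a cluster of stocks likely to rise.
clusters = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)

# Portfolio 2: rank stocks on the first two principal components.
scores = PCA(n_components=2).fit_transform(X)

# Portfolio 3: supervised probability that the price goes up.
prob_up = LogisticRegression().fit(X, stocks["up"]).predict_proba(X)[:, 1]
portfolio3 = stocks.index[prob_up > 0.7]  # high-probability stocks (threshold assumed)
print(len(portfolio3), "stocks selected for portfolio 3")
```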
Procedia PDF Downloads 157
588 Analysis of Two-Echelon Supply Chain with Perishable Items under Stochastic Demand
Authors: Saeed Poormoaied
Abstract:
Perishability and developing an intelligent control policy for perishable items are major concerns of marketing managers in a supply chain. In this study, we address a two-echelon supply chain problem for perishable items with a single vendor and a single buyer. The buyer adopts an age-based continuous review policy, which works by taking both the stock level and the aging process of items into account. The vendor works under the warehouse framework, where its lot size is determined with respect to the batch size of the buyer. The model holds for a positive and fixed lead time for the buyer and zero lead time for the vendor. The demand follows a Poisson process, and any unmet demand is lost. We provide exact analytic expressions for the operational characteristics of the system by using the renewal reward theorem. Items have a fixed lifetime, after which they become unusable and are disposed of from the buyer's system. The age of items starts when they are unpacked and ready for consumption at the buyer's site. When items are held by the vendor, there is no aging process, which results in no perishing at the vendor's site. The model is developed under the centralized framework, which takes the expected profit of both vendor and buyer into consideration. The goal is to determine the optimal policy parameters under the service level constraint at the retailer's site. A sensitivity analysis is performed to investigate the effect of the key input parameters on the expected profit and order quantity in the supply chain. The efficiency of the proposed age-based policy is also evaluated through a numerical study. Our results show that when the unit perishing cost is negligible, a significant cost saving is achieved.
Keywords: two-echelon supply chain, perishable items, age-based policy, renewal reward theorem
Procedia PDF Downloads 144
587 Nanowire Sensor Based on Novel Impedance Spectroscopy Approach
Authors: Valeriy M. Kondratev, Ekaterina A. Vyacheslavova, Talgat Shugabaev, Alexander S. Gudovskikh, Alexey D. Bolshakov
Abstract:
Modern sensorics imposes strict requirements on biosensor characteristics, especially technological feasibility and selectivity. There is growing interest in the analysis of human health biological markers, which indirectly testify to pathological processes in the body. Such markers are acids and alkalis produced by the human body, in particular ammonia and hydrochloric acid, which are found in human sweat, blood, and urine, as well as in gastric juice. Biosensors based on modern nanomaterials, especially low-dimensional ones, can be used for the detection of these markers. Most classical adsorption sensors based on metal and silicon oxides are considered non-selective, because they identically change their electrical resistance (or impedance) under the adsorption of different target analytes. This work demonstrates a feasible frequency-resistive method of electrical impedance spectroscopy data analysis. The approach allows selectivity to be obtained in adsorption sensors of the resistive type. The potential of the method is demonstrated by analysis of the impedance spectra of silicon nanowires in the presence of NH3 and HCl vapors with concentrations of about 125 mmol/L (2 ppm) and water vapor. We demonstrate the possibility of unambiguously distinguishing the sensory signals from NH3 and HCl adsorption. Moreover, the method is found to be applicable for analysis of the composition of a mixture of ammonia and hydrochloric acid vapors without water cross-sensitivity. The presented silicon sensor can be used to detect diseases of the gastrointestinal tract through the qualitative and quantitative detection of ammonia and hydrochloric acid content in biological samples. The method of data analysis can be directly transferred to other nanomaterials to analyze their applicability in the field of biosensing.
Keywords: electrical impedance spectroscopy, spectroscopy data analysis, selective adsorption sensor, nanotechnology
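As background for the impedance analysis mentioned above, the complex impedance of a simple parallel resistor-capacitor element, often used as a first-order model of a resistive adsorption sensor, can be computed across frequency as in the sketch below; this is a generic textbook model with assumed component values, not the authors' specific frequency-resistive method.

```python
# Generic parallel-RC impedance spectrum, Z(w) = R / (1 + j*w*R*C); illustrative values only.
import numpy as np

R, C = 1.0e6, 1.0e-10                 # 1 MOhm sensor resistance, 100 pF capacitance (assumed)
freq = np.logspace(1, 6, 50)          # 10 Hz to 1 MHz
omega = 2 * np.pi * freq
Z = R / (1 + 1j * omega * R * C)

# The low-frequency plateau of |Z| approaches R, the quantity that shifts on analyte adsorption.
print(f"|Z| at {freq[0]:.0f} Hz: {abs(Z[0]):.3e} Ohm (approx. R = {R:.0e} Ohm)")
```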
Procedia PDF Downloads 114
586 Infrared Photodetectors Based on Nanowire Arrays: Towards Far Infrared Region
Authors: Mohammad Karimi, Magnus Heurlin, Lars Samuelson, Magnus Borgstrom, Hakan Pettersson
Abstract:
Nanowire semiconductors are promising candidates for optoelectronic applications such as solar cells, photodetectors and lasers due to their quasi-1D geometry and large surface-to-volume ratio. The functional wavelength range of NW-based detectors is typically limited to the visible/near-infrared region. In this work, we present the electrical and optical properties of IR photodetectors based on large, square-millimeter ensembles (>1 million) of vertically processed semiconductor heterostructure nanowires (NWs) grown on InP substrates, which operate at longer wavelengths. InP NWs comprising single or multiple (20) InAs/InAsP QDiscs axially embedded in an n-i-n geometry have been grown on InP substrates using metal organic vapor phase epitaxy (MOVPE). The NWs are contacted in the vertical direction by atomic layer deposition (ALD) of 50 nm SiO2 as an insulating layer, followed by sputtering of indium tin oxide (ITO) and evaporation of Ti and Au as the top contact layer. In order to extend the sensitivity range to the mid-wavelength and long-wavelength regions, the intersubband transition within the conduction band of the InAsP QDiscs is suggested. We present first experimental indications of intersubband photocurrent in NW geometry and discuss important design parameters for the realization of intersubband detectors. Key advantages of the proposed design include a large degree of freedom in the choice of material compositions, possible enhanced optical resonance effects due to periodically ordered NW arrays, and compatibility with silicon substrates. We believe that the proposed detector design offers a route towards monolithic integration of compact and sensitive III-V NW long-wavelength detectors with Si technology.
Keywords: intersubband photodetector, infrared, nanowire, quantum disc
Procedia PDF Downloads 386
585 Insulin Resistance in Patients with Chronic Hepatitis C Virus Infection: Upper Egypt Experience
Authors: Ali Kassem
Abstract:
Background: In the last few years, factors such as insulin resistance (IR) and hepatic steatosis have been linked to the progression of hepatic fibrosis. Patients with chronic liver disease, and cirrhosis in particular, are known to be prone to IR. However, chronic HCV (hepatitis C) infection may induce IR, regardless of the presence of liver cirrhosis. Our aims are to study insulin resistance (IR), assessed by HOMA-IR (Homeostatic Model Assessment of Insulin Resistance), as a possible risk factor for disease progression in cirrhotic patients and to evaluate the role of IR in hepatic fibrosis progression. The correlations of HOMA-IR values with laboratory, virological and histopathological parameters of chronic HCV are also examined. Methods: The study included 50 people divided into 30 adult chronic hepatitis C patients, diagnosed by PCR (polymerase chain reaction) within the previous 6 months, and 20 healthy controls. The functional and morphological status of the liver was evaluated by ultrasonography and laboratory investigations, including liver function tests, and by liver biopsy. Fasting blood glucose and fasting insulin levels were measured, and body mass index and insulin resistance were calculated. Patients having HOMA-IR > 2.5 were labeled as insulin resistant. Results: Chronic hepatitis C patients with IR showed significantly higher mean values of BMI (body mass index) and fasting insulin than those without IR (P < 0.000). Patients with IR were more likely to have steatosis (p = 0.006) and higher necroinflammatory activity (p = 0.05). No significant differences were found between the two groups regarding hepatic fibrosis. Conclusion: HOMA-IR measurement could represent a novel marker to identify cirrhotic patients at greater risk for the progression of liver disease. As IR is a potentially modifiable risk factor, these findings may have important prognostic and therapeutic implications. Assessment of IR by HOMA-IR and improving insulin sensitivity are recommended in patients with HCV and related chronic liver disease.
Keywords: hepatic fibrosis, hepatitis C virus infection, hepatic steatosis, insulin resistance
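For readers unfamiliar with the index, HOMA-IR is conventionally computed from fasting glucose and fasting insulin as sketched below; the 2.5 cut-off is the one used in this study, while the example values are hypothetical.

```python
# HOMA-IR as conventionally defined: fasting glucose (mmol/L) x fasting insulin (uU/mL) / 22.5.
def homa_ir(fasting_glucose_mmol_l: float, fasting_insulin_uU_ml: float) -> float:
    return (fasting_glucose_mmol_l * fasting_insulin_uU_ml) / 22.5

# Example (hypothetical patient): glucose 5.5 mmol/L, insulin 12 uU/mL -> HOMA-IR ~ 2.93
value = homa_ir(5.5, 12.0)
insulin_resistant = value > 2.5  # threshold used in the study to label a patient insulin resistant
print(f"HOMA-IR = {value:.2f}, insulin resistant: {insulin_resistant}")
```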
Procedia PDF Downloads 154
584 Uterine Cervical Cancer; Early Treatment Assessment with T2- And Diffusion-Weighted MRI
Authors: Susanne Fridsten, Kristina Hellman, Anders Sundin, Lennart Blomqvist
Abstract:
Background: Patients diagnosed with locally advanced cervical carcinoma are treated with definitive concomitant chemo-radiotherapy. Treatment failure occurs in 30-50% of patients, with a very poor prognosis. The treatment is standardized, with a risk of both over- and undertreatment. Consequently, there is a great need for biomarkers able to predict therapy outcome to allow for individualized treatment. Aim: To explore the role of T2- and diffusion-weighted magnetic resonance imaging (MRI) for early prediction of therapy outcome and the optimal time point for assessment. Methods: A pilot study including 15 patients with cervical carcinoma stage IIB-IIIB (FIGO 2009) undergoing definitive chemoradiotherapy. All patients underwent MRI four times: at baseline, 3 weeks, 5 weeks, and 12 weeks after treatment started. Tumour size, size change (∆size), visibility on diffusion-weighted imaging (DWI), apparent diffusion coefficient (ADC) and change of ADC (∆ADC) at the different time points were recorded. Results: 7/15 patients relapsed during the study period, referred to as "poor prognosis" (PP), and the remaining eight patients are referred to as "good prognosis" (GP). The tumour size was larger at all time points for PP than for GP. The ∆size between any of the four time points was the same for PP and GP patients. The sensitivity and specificity for predicting the prognostic group based on a remaining tumour on DWI were highest at 5 weeks, at 83% (5/6) and 63% (5/8), respectively. The combination of tumour size at baseline and remaining tumour on DWI at 5 weeks reached an area under the curve (AUC) of 0.83 in ROC analysis. After 12 weeks, no remaining tumour was seen on DWI among patients with GP, as opposed to 2/7 PP patients. Adding ADC to the tumour size measurements did not improve the predictive value at any time point. Conclusion: A large tumour at baseline MRI combined with a remaining tumour on DWI at 5 weeks predicted a poor prognosis.
Keywords: chemoradiotherapy, diffusion-weighted imaging, magnetic resonance imaging, uterine cervical carcinoma
Procedia PDF Downloads 143
583 Permeable Bio-Reactive Barriers to Tackle Petroleum Hydrocarbon Contamination in the Sub-Antarctic
Authors: Benjamin L. Freidman, Sally L. Gras, Ian Snape, Geoff W. Stevens, Kathryn A. Mumford
Abstract:
Increasing transportation and storage of petroleum hydrocarbons in Antarctic and sub-Antarctic regions have resulted in frequent accidental spills. Migrating petroleum hydrocarbon spills can have a significant impact on terrestrial and marine ecosystems in cold regions, as harsh environmental conditions result in heightened sensitivity to pollution. This migration of contaminants has led to the development of Permeable Reactive Barriers (PRBs) for application in cold regions. PRBs are one of the most practical technologies for on-site or in-situ groundwater remediation in cold regions due to their minimal energy, monitoring and maintenance requirements. The Main Power House site has been used as a fuel storage and power generation area for the Macquarie Island research station since at least 1960. Soil analysis at the site has revealed Total Petroleum Hydrocarbon (TPH) (C9-C28) concentrations as high as 19,000 mg/kg soil. Groundwater TPH concentrations at this site can exceed 350 mg/L. Ongoing migration of petroleum hydrocarbons into the neighbouring marine ecosystem resulted in the installation of a ‘funnel and gate’ PRB in November 2014. The ‘funnel and gate’ design successfully intercepted contaminated groundwater, and analysis of TPH retention and biodegradation on PRB media is currently underway. Installation of the PRB facilitates research aimed at better understanding the contribution of particle-attached biofilms to the remediation of groundwater systems. Bench-scale PRB system analysis at The University of Melbourne is currently examining the role biofilms play in petroleum hydrocarbon degradation and how controlled-release nutrient media can heighten the metabolic activity of biofilms in cold regions in the presence of low temperatures and low-nutrient groundwater.
Keywords: groundwater, petroleum, Macquarie island, funnel and gate
Procedia PDF Downloads 358
582 Evaluation of the Heating Capability and in vitro Hemolysis of Nanosized MgxMn1-xFe2O4 (x = 0.3 and 0.4) Ferrites Prepared by Sol-gel Method
Authors: Laura Elena De León Prado, Dora Alicia Cortés Hernández, Javier Sánchez
Abstract:
Among the different cancer treatments that are currently used, hyperthermia has promising potential due to the multiple benefits obtained with this technique. In general terms, hyperthermia is a method that takes advantage of the sensitivity of cancer cells to heat in order to damage or destroy them. Among the different ways of supplying heat to cancer cells and achieving their destruction or damage, the use of magnetic nanoparticles has attracted attention due to the capability of these particles to generate heat under the influence of an external magnetic field. In addition, these nanoparticles have a high surface area and sizes similar to or even smaller than biological entities, which allows them to approach and interact with a specific region of interest. The most commonly used magnetic nanoparticles for hyperthermia treatment are those based on iron oxides, mainly magnetite and maghemite, due to their biocompatibility, good magnetic properties and chemical stability. However, in order to fulfill more efficiently the requirements demanded by magnetic hyperthermia treatment, investigations have used ferrites that incorporate different metallic ions, such as Mg, Mn, Co, Ca, Ni, Cu, Li, Gd, etc., in their structure. This paper reports the synthesis of nanosized MgxMn1-xFe2O4 (x = 0.3 and 0.4) ferrites by the sol-gel method and their evaluation in terms of heating capability and in vitro hemolysis, in order to determine the potential use of these nanoparticles as thermoseeds for the treatment of cancer by magnetic hyperthermia. It was possible to obtain ferrites with nanometric sizes, a single crystalline phase with an inverse spinel structure, and behavior close to that of superparamagnetic materials. Additionally, at concentrations of 10 mg of magnetic material per mL of water, it was possible to reach a temperature of approximately 45°C, which is within the range of temperatures used for the treatment of hyperthermia. The results of the in vitro hemolysis assay showed that, at the concentrations tested, these nanoparticles are non-hemolytic, as their percentage of hemolysis is close to zero. Therefore, these materials can be used as thermoseeds for the treatment of cancer by magnetic hyperthermia.
Keywords: ferrites, heating capability, hemolysis, nanoparticles, sol-gel
Procedia PDF Downloads 343
581 Combining in vitro Protein Expression with AlphaLISA Technology to Study Protein-Protein Interaction
Authors: Shayli Varasteh Moradi, Wayne A. Johnston, Dejan Gagoski, Kirill Alexandrov
Abstract:
The demand for rapid and more efficient techniques to identify protein-protein interactions, particularly in the areas of therapeutics and diagnostics development, is growing. The method described here is a rapid in vitro protein-protein interaction analysis approach based on AlphaLISA technology combined with the Leishmania tarentolae cell-free protein production (LTE) system. Cell-free protein synthesis allows the rapid production of recombinant proteins in a multiplexed format. Among available in vitro expression systems, LTE offers several advantages over other eukaryotic cell-free systems. It is based on a fast-growing fermentable organism that is inexpensive in cultivation and lysate production. The high integrity of proteins produced in this system and the ability to co-express multiple proteins make it a desirable method for screening protein interactions. Following the translation of protein pairs in the LTE system, the physical interaction between the proteins of interest is analysed by the AlphaLISA assay. The assay is performed using the unpurified in vitro translation reaction and can therefore be readily multiplexed. This approach can be used in various research applications such as epitope mapping, antigen-antibody analysis and protein interaction network mapping. The intra-viral protein interaction network of Zika virus was studied using the developed technique. The viral proteins were co-expressed pairwise in LTE, and all possible interactions among viral proteins were tested using AlphaLISA. The assay resulted in the identification of 54 intra-viral protein-protein interactions, of which 19 binary interactions were found to be novel. The presented technique provides a powerful tool for rapid analysis of protein-protein interactions with high sensitivity and throughput.
Keywords: AlphaLISA technology, cell-free protein expression, epitope mapping, Leishmania tarentolae, protein-protein interaction
Procedia PDF Downloads 237
580 An Empirical Study on the Impact of Peace in Tourists' Country of Origin on Their Travel Behavior
Authors: Claudia Seabra, Elisabeth Kastenholz, José Luís Abrantes, Manuel Reis
Abstract:
In a world of increasing mobility and global risks, terrorism has, in a perverse way, capitalized on contemporary society’s growing interest in travel to explore a world whose national boundaries and distances have decreased. Terrorists have identified the modern tourist flows originating from the economically more developed countries as new appealing targets so as to: i) call attention to the causes they defend and ii) destroy a country’s foundations of tourism, with the final aim of disrupting the economic and, consequently, social fabric of the affected countries. The present study analyses sensitivity towards risk and travel behaviors in international travel amongst a sample of 600 international tourists from 49 countries travelling by air. Specifically, the sample was segmented according to the Global Peace Index. This index defines country profiles regarding levels of peace. The indicators used are established over three broad themes: i) ongoing domestic and international conflict; ii) societal safety and security; and iii) militarisation. Tourists were segmented, according to their country of origin, into different levels of peacefulness. Several facets of travel behavior were evaluated, namely motivations, attitude towards trip planning, quality perception and perceived value of the trip. Factors related to risk perception were also evaluated, specifically terrorism risk perception during the trip, the sensation of unsafety, as well as the importance attributed to safety in travel. The results contribute to our understanding of the role of previous exposure to the lack of peace and safety at home in international tourists' behaviors, which is further discussed in terms of tourism management and marketing implications that should particularly interest tourism services and destinations more affected by terrorism, war, political turmoil, crime and other safety risks.
Keywords: terrorism, tourism, safety, risk perception
Procedia PDF Downloads 441
579 Cicadas: A Clinician-assisted, Closed-loop Technology, Mobile App for Adolescents with Autism Spectrum Disorders
Authors: Bruno Biagianti, Angela Tseng, Kathy Wannaviroj, Allison Corlett, Megan DuBois, Kyu Lee, Suma Jacob
Abstract:
Background: ASD is characterized by pervasive Sensory Processing Abnormalities (SPA) and social cognitive deficits that persist throughout the course of the illness and have been linked to functional abnormalities in specific neural systems that underlie the perception, processing, and representation of sensory information. SPA and social cognitive deficits are associated with difficulties in interpersonal relationships, poor development of social skills, reduced social interactions and lower academic performance. Importantly, they can hamper the effects of established evidence-based psychological treatments, including PEERS (Program for the Education and Enrichment of Relationship Skills), a parent/caregiver-assisted, 16-week social skills intervention, which nonetheless requires a functional brain capable of assimilating and retaining information and skills. As a matter of fact, some adolescents benefit from PEERS more than others, calling for strategies to increase treatment response rates. Objective: We will present interim data on CICADAS (Care Improving Cognition for ADolescents on the Autism Spectrum), a clinician-assisted, closed-loop technology mobile application for adolescents with ASD. Via ten mobile assessments, CICADAS captures data on sensory processing abnormalities and associated cognitive deficits. These data populate a machine learning algorithm that tailors the delivery of ten neuroplasticity-based social cognitive training (NB-SCT) exercises targeting sensory processing abnormalities. Methods: In collaboration with the Autism Spectrum and Neurodevelopmental Disorders Clinic at the University of Minnesota, we conducted a fully remote, three-arm, randomized crossover trial with adolescents with ASD to document the acceptability of CICADAS and evaluate its potential as a stand-alone treatment or as a treatment enhancer of PEERS. Twenty-four adolescents with ASD (ages 11-18) were initially randomized to 16 weeks of PEERS + CICADAS (Arm A) vs. 16 weeks of PEERS + computer games (Arm B) vs. 16 weeks of CICADAS alone (Arm C). After 16 weeks, the full battery of assessments was remotely administered. Results: We have evaluated the acceptability of CICADAS by examining adherence rates, engagement patterns, and exit survey data. We found that: 1) CICADAS is able to serve as a treatment enhancer for PEERS, inducing greater improvements in sensory processing, cognition, symptom reduction, social skills and behaviors, as well as quality of life, compared to computer games; 2) the concurrent delivery of PEERS and CICADAS induces greater improvements in study outcomes compared to CICADAS only. Conclusion: While preliminary, our results indicate that the individualized assessment and treatment approach designed in CICADAS seems effective in inducing adaptive long-term learning about social-emotional events. CICADAS-induced enhancement of processing and cognition facilitates the application of PEERS skills in the environment of adolescents with ASD, thus improving their real-world functioning.
Keywords: ASD, social skills, cognitive training, mobile app
Procedia PDF Downloads 214
578 Biospiral-Detect to Distinguish PrP Multimers from Monomers
Authors: Gulyas Erzsebet
Abstract:
The multimerisation of proteins is a common feature of many cellular processes; however, it can also impair protein functions and/or be associated with the occurrence of diseases. Thus, the development of a research tool monitoring the appearance/presence of multimeric protein forms is of great importance for a variety of research fields. Such a tool is potentially applicable in the ante-mortem diagnosis of certain conformational diseases, such as transmissible spongiform encephalopathies (TSE) and Alzheimer’s disease. These conditions are accompanied by the appearance of aggregated protein multimers, present in low concentrations in various tissues. This detection is particularly relevant for TSE, where the handling of tissues derived from affected individuals and of meat products from infected animals has become an enormous health concern. Here we demonstrate the potential of such a multimer detection approach in TSE by developing a facile assay. The Biospiral-Detect system resembles a traditional sandwich ELISA, except that the capturing antibody attached to a solid surface and the detecting antibody are directed against the same or overlapping epitopes. As a consequence, the capturing antibody shields the epitope on the captured monomer from reacting with the detecting antibody; therefore, monomers are not detected. Thus, the multimer detection system (MDS) is capable of detecting only protein multimers, with high specificity. We also developed an alternative system in which RNA aptamers were employed instead of monoclonal antibodies. In order to minimize degradation, the 3' and 5' ends of the aptamer contained deoxyribonucleotides and phosphorothioate linkages. When the monoclonal antibody-based system was compared with the aptamer-based one, the former proved to be superior. Thus, all subsequent experiments were conducted employing the Biospiral-Detect modified sandwich ELISA kit. Our approach showed an order of magnitude higher sensitivity toward multimers than monomers, suggesting that this approach may become a valuable diagnostic tool for conformational diseases that are accompanied by multimerization.
Keywords: diagnosis, ELISA, Prion, TSE
Procedia PDF Downloads 251
577 Effect of Endurance Exercise Training on Blood Pressure in Elderly Female Patients with Hypertension
Authors: Elham Ahmadi
Abstract:
This study was conducted with the aim of investigating the effect of moderate physical activity (60% of maximal heart rate, MHR) on blood pressure in elderly females with hypertension. Hypertension is considered a risk factor for cardiovascular disease that is modifiable through physical activity. The purpose and significance of this study were to investigate the role of exercise as an alternative therapy, since some patients exhibit sensitivity/intolerance to some drugs. Initially, 65 hypertensive females (average age = 49.7 years; systolic blood pressure, SBP > 140 mmHg and/or diastolic blood pressure, DBP > 85 mmHg) and 25 hypertensive females as a control group (average age = 50.3 years; SBP > 140 mmHg and/or DBP > 85 mmHg) were selected. The subjects were divided based on their age, duration of disease, physical activity, and drug consumption. Then, blood pressure and heart rate (HR) were measured in all of the patients using a sphygmomanometer (pre-test). The exercise sessions consisted of warm-up, aerobic activity, and cooling down (total duration of 20 minutes for the first session up to 55 minutes in the last session). At the end of the 12th session (mid-test) and the final, 24th session (post-test), blood pressure was measured again. The control group did not exercise during the study. The results were analyzed using a t-test. Our results indicated that moderate physical activity was effective in lowering blood pressure by 6.4/5.6 mmHg for SBP and 2.4/4.3 mmHg for DBP in hypertensive patients, irrespective of age, duration of disease, and drug consumption (P < .005). The control group showed no changes in BP. Physical activity programs of moderate intensity (approximately 60% MHR), three days per week, can be used not only as a preventive measure for diastolic hypertension (DBP > 90 mmHg) but also as an alternative to drug therapy in the treatment of hypertension.
Keywords: endurance exercise, elderly female, hypertension, physical activity
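As a point of reference for the 60% MHR intensity used above, a common estimate of maximal heart rate is 220 minus age; the short calculation below is an illustration only, since the abstract does not state which MHR formula was applied.

```python
# Illustrative target heart rate at 60% of estimated MHR (MHR ~ 220 - age is a common estimate).
age = 50                      # close to the study's average subject age
mhr = 220 - age               # estimated maximal heart rate, beats per minute
target_hr = 0.60 * mhr        # moderate-intensity target corresponding to 60% MHR
print(f"Estimated MHR = {mhr} bpm, 60% target = {target_hr:.0f} bpm")
```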
Procedia PDF Downloads 97576 Human–Wildlife Conflicts in Selected Areas of Azad Jammu and Kashmir, Pakistan
Authors: Nausheen Irshad
Abstract:
Human-wildlife conflict (HWC) exists in both developed and developing countries, though it is more serious in developing nations. Knowledge of species ecology and species sensitivity to anthropogenic pressures is an important prerequisite for conservation and management. Therefore, three districts (Poonch, Bagh, and Muzaffarabad) of Azad Jammu and Kashmir were selected to document wildlife hunting practices from January 2015 to November 2018. The study area was thoroughly explored to recover dead animals. Moreover, the local community was surveyed (questionnaire survey) to ascertain the motives for killing. The results showed that HWC mainly arises from the feeding habits of wild animals: some are frugivorous (small Indian civet and small Kashmir flying squirrel) and damage cultivated fruit trees, while the Indian crested porcupine and wild boar act as serious crop pests. Predation on domestic animals (common leopard) and poultry (Asiatic jackal and red fox) was also reported as a factor of conflict. Hence, numerous wild animals and birds (N = 120) were found killed by local residents in revenge. Despite their protected status in Pakistan, the killed mammals included critically endangered (Panthera pardus) and near threatened (Viverricula indica) species. The important birds included critically endangered (Falco peregrinus) and endangered (Lophura leucomelanos) species. Mammals were primarily killed due to HWC (60%), followed by recreation (20%) and trade (15%), whereas the foremost hunting reasons for birds were recreation (50%), food (25%) and trade (25%). The drastic hunting/killing of these species needs immediate attention; this unwarranted killing must be stopped forthwith, otherwise these animals will become extinct.Keywords: Azad Jammu and Kashmir, anthropogenic pressures, endangered species, human-wildlife conflicts
Procedia PDF Downloads 163575 Born in Limbo, Living in Limbo and Probably Will Die in Limbo
Authors: Betty Chiyangwa
Abstract:
The subject of second-generation migrant youth is under-researched in the context of South Africa; thus, their opinions and views have been marginalised in social science research. This paper addresses that gap by exploring the complexities of second-generation Mozambican migrant youth's lived experiences, focusing on how they construct their identities and develop a sense of belonging in post-apartheid South Africa, specifically in Bushbuckridge. Bushbuckridge was among the earliest districts to accommodate Mozambican refugees in South Africa in the 1970s and remains associated with large numbers of Mozambicans. Drawing on Crenshaw's (1989) intersectionality approach, the study contributes to knowledge on South-to-South migration by demonstrating how this approach is operationalised to understand the complex lived experiences of a disadvantaged group in life and, possibly, in death. In conceptualising the notion of identity among second-generation migrant youth, this paper explores the history and present of first- and second-generation Mozambican migrants in South Africa to reveal how being born to migrant parents and raised in a host country poses life-long complications for one's identity and sense of belonging. In the quest to form their identities and construct a sense of belonging, migrant youth employ precarious means to navigate the terrain. This is a case study informed by semi-structured interviews and narrative data gathered from 22 second-generation Mozambican migrant youth between 18 and 34 years of age who were born to at least one Mozambican parent living in Bushbuckridge and were raised in South Africa. The views of two key informants from the South African Department of Home Affairs and the local tribal authority provided additional perspectives on second-generation migrant youth's lived experiences in Bushbuckridge. The data were analysed thematically and narratively through Braun and Clarke's (2012) six-step framework for qualitative analysis. In exploring the interdependency and interconnectedness of social categories and social systems in Bushbuckridge, the findings revealed that participants' experiences of identity formation and development of a sense of belonging were marginalised in complex, intersectional and precarious ways, as they constantly (re)negotiated daily experiences largely shaped by their paradoxical migrant status in the host country. The study found that, in the quest for belonging, migrant youth were not a perfectly integrated category; rather, their identity and sense of belonging in South Africa evolved from the almost daily lived experience of making a living. The majority of them shared feelings of having lived in limbo since childhood and a fear of possibly dying in limbo, with no clear (solid) sense of belonging to either South Africa or Mozambique. The study concludes that there is a strong association between feelings of identity, sense of belonging and levels of social integration. It recommends the development and adoption of a comprehensive, multilayered model for understanding second-generation migrant youth identity and belonging in South Africa, one that encourages collaborative effort among individual migrant youth, their family members, neighbours, society, and regional and national institutional structures for migrants, in order to enhance and harness their capabilities and improve their wellbeing in South Africa.Keywords: Bushbuckridge, limbo, Mozambican migrants, second-generation
Procedia PDF Downloads 70574 An Exploration of Special Education Teachers’ Practices in a Preschool Intellectual Disability Centre in Saudi Arabia
Authors: Faris Algahtani
Abstract:
Background: In Saudi Arabia, it is essential to know what practices are employed and considered effective by special education teachers working with preschool children with intellectual disabilities, as a prerequisite for identifying areas for improvement. Preschool provision for these children is expanding through a network of Intellectual Disability Centres (IDCs) while, in primary schools, a policy of inclusion is pursued and, in mainstream preschools, pilots have been aimed at enhancing learning in readiness for primary schooling. This potentially widens the attainment gap between preschool children with and without intellectual disabilities and influences the scope for improvement. Goal: The aim of the study was to explore special education teachers' practices, and their perceptions of those practices, for preschool children with intellectual disabilities in Saudi Arabia. Method: A qualitative interpretive approach was adopted in order to gain a detailed understanding of how special education teachers in an IDC operate in the classroom. Fifteen semi-structured interviews were conducted with experienced and qualified teachers. Data were analysed using thematic analysis, based on themes identified from the literature review together with new themes emerging from the data. Findings: American methods strongly influenced teaching practices, in particular TEACCH (Treatment and Education of Autistic and related Communication-handicapped Children), which emphasises structure, schedules and specific methods of teaching tasks and skills, and ABA (Applied Behaviour Analysis), which aims to improve behaviours and skills by concentrating on detailed breakdown and teaching of task components and rewarding desired behaviours with positive reinforcement. The Islamic concept of education strongly influenced which teaching techniques were used and considered effective, and how they were applied. Tensions were identified between the Islamic approach to disability, which accepts differences between human beings as created by Allah in order for people to learn to help and love each other, and the continuing stigmatisation of disability in many Arabic cultures, which means that parents who bring their children to an IDC often hope and expect that their children will be ‘cured’. Teaching methods were geared to reducing behavioural problems and social deficits rather than to developing the potential of the individual child, although some teachers recognised the child's need for greater freedom. Relationships with parents could in many instances be improved. Teachers considered both initial teacher education and professional development to be inadequate for their needs and the needs of the children they teach. This can be partly attributed to the separation of the training and development of special education teachers from that of general teachers. Conclusion: Based on the findings, teachers' practices could be improved by the inclusion of general teaching strategies, parent-teacher relationships and practical teaching experience in both initial teacher education and professional development. Coaching and mentoring support from carefully chosen special education teachers could assist the process, as could the presence of a second teacher or teaching assistant in the classroom.Keywords: special education, intellectual disabilities, early intervention, early childhood
Procedia PDF Downloads 138573 Prospects of Milk Protein as a Potential Alternative of Natural Antibiotic
Authors: Syeda Fahria Hoque Mimmi
Abstract:
Many new and promising treatments for reducing or diminishing the adverse effects of microorganisms are being discovered day by day. At the same time, the dairy industry is accelerating the economic wheel of Bangladesh. Considering these facts, the present experiment was designed to isolate milk proteins as a step toward developing natural antibiotics from milk. Lactoferrin, an iron-binding glycoprotein with multifunctional properties, is crucial to strengthening the immune system and is also useful for commercial applications. The protein's iron-binding capacity makes it advantageous for immune system modulation and for activity against different bacterial strains. For this purpose, 4 raw and 17 commercially available milk samples were collected from different farms and stores in Bangladesh (Dhaka, Chittagong, and Cox's Bazar). Protein quantification by nanodrop technology confirmed that the raw milk samples contained higher amounts of protein than the commercial ones. All the samples were tested for antimicrobial activity against 18 pathogens, and the raw milk samples showed a higher percentage of antibacterial activity. In addition, SDS-PAGE (sodium dodecyl sulfate–polyacrylamide gel electrophoresis) was performed to identify lactoferrin in the milk samples. Lactoferrin was detected in 9 samples, of which 4 were raw milk samples. Interestingly, Streptococcus pyogenes, Klebsiella pneumoniae, Bacillus cereus, Pseudomonas aeruginosa, Vibrio cholerae, Staphylococcus aureus, and enterotoxigenic E. coli displayed significant sensitivity to lactoferrin obtained from raw milk. Only Bacillus cereus, Pseudomonas aeruginosa, Streptococcus pneumoniae, Enterococcus faecalis, and ETEC (enterotoxigenic Escherichia coli) were susceptible to lactoferrin obtained from the commercial samples. This study suggests that lactoferrin might be used as a potential alternative to antibiotics for many diseases and could also be used to reduce microbial deterioration in the food and feed industry.Keywords: alternative of antibiotics, commercially available milk, lactoferrin, nanodrop technology, pathogens, raw milk
Procedia PDF Downloads 180572 Field Study on Thermal Performance of a Green Office in Bangkok, Thailand: A Possibility of Increasing Temperature Set-Points
Authors: T. Sikram, M. Ichinose, R. Sasaki
Abstract:
In the tropics, the indoor thermal environment is usually maintained by cooling to provide comfort all year. Indoor thermal performance sometimes differs from the standard or from the original design because of operation, maintenance, and utilization. Field studies of the thermal environment in green buildings are still limited in this region, even as the number of green buildings continues to increase. This study aims to clarify thermal performance and subjective perception in a green building by testing different temperature set-points. A Thai green office was investigated twice, in October 2018 and in May 2019. Indoor environment variables (temperature, relative humidity, and air velocity) were collected continuously. The temperature set-point was normally set to 23 °C and was changed to 24 °C and 25 °C. The study found that this range of set-points produced average room temperatures from 22.7 to 24.6 °C and average relative humidity from 55% to 62%. The thermal environment shifted slightly outside the ASHRAE comfort zone when the set-point was increased. Based on the thermal sensation votes, the feeling-colder vote decreased by 30% and 18% when the set-point was raised by +1 °C and +2 °C, respectively. The predicted mean vote (PMV) analysis showed that most of the calculated median values were negative; the values moved close to the optimal neutral value (0) when the set-point was 25 °C. The neutral temperature decreased slightly at the warmer set-points. Reports of building-related symptoms declined continuously as the temperature became warmer, and symptoms occurring under the cooler condition received more votes than those occurring under the warmer conditions. In sum, for this green office, there is scope to raise the temperature set-point by +1 °C (to 24 °C) in terms of reducing cold sensitivity, discomfort, and symptoms. All results could support a policy of adopting a warmer set-point in this office so that it becomes “a better green building”.Keywords: thermal environment, green office, temperature set-point, comfort
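As an illustrative aside, the PMV index referred to above can be computed with the standard Fanger/ISO 7730 procedure; the sketch below is not the study's code, and the metabolic rate, clothing insulation, and example conditions are assumed values.

```python
# Illustrative sketch only: a standard Fanger PMV calculation (ISO 7730 style),
# not the instrumentation or code used in the study. met, clo and the example
# conditions at the end are assumed values.
import math

def pmv(ta, tr, vel, rh, met=1.1, clo=0.5, wme=0.0):
    """Predicted Mean Vote for air temperature ta (C), mean radiant temperature tr (C),
    air speed vel (m/s), relative humidity rh (%), metabolic rate met (met),
    clothing insulation clo (clo), and external work wme (met)."""
    pa = rh * 10.0 * math.exp(16.6536 - 4030.183 / (ta + 235.0))  # vapour pressure, Pa
    icl = 0.155 * clo                      # clothing insulation, m2K/W
    m = met * 58.15                        # metabolic rate, W/m2
    w = wme * 58.15
    mw = m - w
    fcl = 1.05 + 0.645 * icl if icl > 0.078 else 1.0 + 1.29 * icl

    # Iterate for the clothing surface temperature tcl
    hcf = 12.1 * math.sqrt(vel)
    taa, tra = ta + 273.0, tr + 273.0
    tcla = taa + (35.5 - ta) / (3.5 * icl + 0.1)   # initial guess
    p1 = icl * fcl
    p2 = p1 * 3.96
    p3 = p1 * 100.0
    p4 = p1 * taa
    p5 = 308.7 - 0.028 * mw + p2 * (tra / 100.0) ** 4
    xn = tcla / 100.0
    xf = tcla / 50.0
    hc = hcf
    for _ in range(150):
        xf = (xf + xn) / 2.0
        hcn = 2.38 * abs(100.0 * xf - taa) ** 0.25
        hc = max(hcf, hcn)
        xn = (p5 + p4 * hc - p2 * xf ** 4) / (100.0 + p3 * hc)
        if abs(xn - xf) < 1e-5:
            break
    tcl = 100.0 * xn - 273.0

    # Heat loss components
    hl1 = 3.05e-3 * (5733.0 - 6.99 * mw - pa)            # skin diffusion
    hl2 = 0.42 * (mw - 58.15) if mw > 58.15 else 0.0     # sweating
    hl3 = 1.7e-5 * m * (5867.0 - pa)                     # latent respiration
    hl4 = 0.0014 * m * (34.0 - ta)                       # dry respiration
    hl5 = 3.96 * fcl * (xn ** 4 - (tra / 100.0) ** 4)    # radiation
    hl6 = fcl * hc * (tcl - ta)                          # convection

    ts = 0.303 * math.exp(-0.036 * m) + 0.028
    return ts * (mw - hl1 - hl2 - hl3 - hl4 - hl5 - hl6)

# Example: roughly the reported indoor conditions at the warmest set-point
print(f"PMV ~ {pmv(ta=24.6, tr=24.6, vel=0.1, rh=62):.2f}")
```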
Procedia PDF Downloads 119