Search results for: inherent feature
892 Kinetic Model to Interpret Whistler Waves in Multicomponent Non-Maxwellian Space Plasmas
Authors: Warda Nasir, M. N. S. Qureshi
Abstract:
Whistler waves are right-hand circularly polarized waves frequently observed in space plasmas. The low-frequency branch of whistler waves, with frequencies around 100 Hz and known as lion roars, is frequently observed in the magnetosheath. Another feature of the magnetosheath is the observation of flat-top electron distributions with single as well as two electron populations. In the past, lion roars were studied with a kinetic model based on the classical bi-Maxwellian distribution function; however, the observations could not be accounted for on either quantitative or qualitative grounds. We studied whistler waves within kinetic theory using a non-Maxwellian distribution function, the generalized (r,q) distribution, which is a generalized form of the kappa and Maxwellian distribution functions, for both single and two electron populations. We compared our results with Cluster observations and found good quantitative and qualitative agreement. At times when lion roars were observed (or not observed) in the data and the bi-Maxwellian could not provide sufficient growth (damping) rates, we showed that the growth (damping) rates obtained with the generalized (r,q) distribution function match the observations.
Keywords: kinetic model, whistler waves, non-Maxwellian distribution function, space plasmas
Procedia PDF Downloads 314
891 Multivariate Output-Associative RVM for Multi-Dimensional Affect Predictions
Authors: Achut Manandhar, Kenneth D. Morton, Peter A. Torrione, Leslie M. Collins
Abstract:
The current trends in affect recognition research are to consider continuous observations from spontaneous natural interactions using multiple feature modalities, to represent affect in terms of continuous dimensions, to incorporate spatio-temporal correlation among affect dimensions, and to provide fast affect predictions. These efforts have been propelled by a growing push to develop affect recognition systems that enable seamless real-time human-computer interaction in a wide variety of applications. Motivated by these desired attributes, this work proposes a multi-dimensional affect prediction approach that integrates the multivariate Relevance Vector Machine (MVRVM) with the recently developed Output-associative Relevance Vector Machine (OARVM). The resulting approach provides fast continuous affect predictions by jointly modeling the multiple affect dimensions and their correlations. Experiments on the RECOLA database show that the proposed approach performs competitively with the OARVM while providing faster predictions during testing.
Keywords: dimensional affect prediction, output-associative RVM, multivariate regression, fast testing
Procedia PDF Downloads 286
890 Predicting Open Chromatin Regions in Cell-Free DNA Whole Genome Sequencing Data by Correlation Clustering
Authors: Fahimeh Palizban, Farshad Noravesh, Amir Hossein Saeidian, Mahya Mehrmohamadi
Abstract:
In the recent decade, the emergence of liquid biopsy has significantly improved cancer monitoring and detection. Dying cells, including those originating from tumors, shed their DNA into the blood and contribute to a pool of circulating fragments called cell-free DNA (cfDNA). Accordingly, identifying the tissue of origin of these DNA fragments from plasma can enable faster, more accurate disease diagnosis and more precise treatment protocols. Open chromatin regions (OCRs) are important epigenetic features of DNA that reflect the cell type of origin. Profiling these features by DNase-seq, ATAC-seq, and histone ChIP-seq provides insight into tissue-specific and disease-specific regulatory mechanisms. Several studies in cancer liquid biopsy integrate distinct genomic and epigenomic features for early cancer detection and tissue-of-origin identification. However, multimodal analysis requires several types of experiments to cover the genomic and epigenomic aspects of a single sample, incurring substantial cost and time. To overcome these limitations, predicting OCRs directly from whole genome sequencing (WGS) data is of particular importance. We therefore propose a computational approach for predicting open chromatin regions, an important epigenetic feature, from cell-free DNA whole genome sequencing data. Local sequencing depth is fed to the proposed algorithm, which then predicts the most probable open chromatin regions. Our method applies signal processing to sequencing depth data and comprises count normalization, Discrete Fourier Transform conversion, graph construction, graph-cut optimization by linear programming, and clustering.
To validate the proposed method, we compared the output of the clustering (open chromatin region+, open chromatin region-) with previously validated open chromatin regions from human blood samples in the ATAC-DB database. The overlap between predicted open chromatin regions and the experimentally validated ATAC-seq regions in ATAC-DB is greater than 67%, indicating meaningful prediction. OCRs are mostly located at the transcription start sites (TSS) of genes, so we also measured the concordance between the predicted OCRs and human TSS regions obtained from refTSS, which was about 52.04% for all genes and ~78% for housekeeping genes. Accurately detecting open chromatin regions from plasma cfDNA-seq data is a very challenging computational problem due to several confounding factors, such as technical and biological variation. Although this approach is in its infancy, one prior attempt, a tool named OCRDetector, has restrictions such as the need for high-depth cfDNA WGS data, prior information about the OCR distribution, and reliance on multiple features. In contrast, we implemented graph signal clustering based on a single depth feature in an unsupervised manner, which yields faster performance and decent accuracy. Overall, we investigated the epigenomic pattern of a cell-free DNA sample from a new computational perspective that can be used alongside other tools to examine genetic and epigenetic aspects of a single whole genome sequencing dataset for efficient liquid biopsy analysis.
Keywords: open chromatin regions, cancer, cell-free DNA, epigenomics, graph signal processing, correlation clustering
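The depth-based pipeline described above can be sketched end to end in a simplified form. The window depths, the low-pass cutoff, and the plain threshold that stands in for the paper's graph-cut/linear-programming step are all illustrative assumptions, not the authors' implementation:

```python
import cmath

def low_pass_dft(signal, keep_frac=0.2):
    """Smooth a per-window depth signal by zeroing high-frequency DFT terms
    (O(n^2) DFT; fine for a sketch)."""
    n = len(signal)
    coeffs = [sum(signal[t] * cmath.exp(-2j * cmath.pi * k * t / n)
                  for t in range(n)) for k in range(n)]
    keep = max(1, int(n * keep_frac))
    # Keep only the lowest-frequency terms and their mirrored counterparts
    filtered = [c if (k < keep or k > n - keep) else 0
                for k, c in enumerate(coeffs)]
    return [sum(filtered[k] * cmath.exp(2j * cmath.pi * k * t / n)
                for k in range(n)).real / n for t in range(n)]

def predict_ocr(depths, keep_frac=0.2):
    """Label windows OCR+ / OCR- by thresholding the smoothed, normalized
    depth. cfDNA fragments come mostly from nucleosome-protected DNA, so
    coverage dips at nucleosome-free (open) regions: low depth -> OCR+."""
    mean = sum(depths) / len(depths)
    norm = [d / mean for d in depths]       # count normalization
    smooth = low_pass_dft(norm, keep_frac)  # DFT-based de-noising
    return ["OCR+" if s < 1.0 else "OCR-" for s in smooth]
```

A real implementation would replace the threshold with the graph construction and correlation-clustering step the abstract describes; the sketch only shows how a coverage dip maps to an OCR+ call.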
Procedia PDF Downloads 150
889 The Ludic Exception and the Permanent Emergency: Understanding the Emergency Regimes with the Concept of Play
Authors: Mete Ulaş Aksoy
Abstract:
In contemporary politics, the state of emergency has become a permanent and salient feature. This study aims to clarify the anthropological and ontological dimensions of the permanent state of emergency. It pays special attention to the structural relation between the exception and play. Focusing on play in the context of emergency and exception enables recognition of the difference, and sometimes the discrepancy, between exception and emergency, which has passed into oblivion because of the frequency and normalization of emergency situations. This study coins the term “ludic exception” to highlight the difference between exceptions in which exuberance and paroxysm rule over socio-political life and the permanent emergency that protects authority with a sort of extra-legality. The main thesis of the study is that ludic elements such as risk, conspicuous consumption, sacrificial gestures, and agonism circumscribe exceptional moments temporarily, preventing them from becoming routine and normal. The study also emphasizes the decline of ludic elements in modernity as the main factor in the transformation of exceptions into permanent emergency situations. The introduction considers the relationship between play and exception. The second part elucidates the concept of the ludic exception and dwells on anthropological examples of ludic exceptions. The last part addresses the decline of ludic elements in modernity as the main factor behind the permanent emergency.
Keywords: emergency, exception, ludic exception, play, sovereignty
Procedia PDF Downloads 89
888 Study of Error Analysis and Sources of Uncertainty in the Measurement of Residual Stresses by the X-Ray Diffraction
Authors: E. T. Carvalho Filho, J. T. N. Medeiros, L. G. Martinez
Abstract:
Residual stresses are self-equilibrating stresses in a rigid body that act on the microstructure of the material without the application of an external load. They are elastic stresses and can be induced by mechanical, thermal, and chemical processes, causing a deformation gradient in the crystal lattice and favoring premature failure of mechanical components. The search for measurements with good reliability has been of great importance to the manufacturing industries. Several methods can quantify these stresses according to physical principles and the mechanical response of the material. The X-ray diffraction technique is among the most sensitive to small variations of the crystal lattice, since the X-ray beam interacts with the interplanar distance. Being a very sensitive technique, it is also susceptible to variations in measurements, requiring a study of the factors that influence the final result. Instrumental and operational factors, form deviations of the samples, and the geometry of analysis are some of the variables that must be considered and analyzed in order to obtain the true measurement. The aim of this work is to analyze the sources of error inherent in the residual stress measurement process by the X-ray diffraction technique, performing an interlaboratory comparison to verify the reproducibility of the measurements. In this work, two specimens were machined, differing from each other by the surface finish: grinding and polishing. Additionally, iron powder with particle size below 45 µm was selected to serve as a reference (as recommended by the ASTM E915 standard) for the tests. To verify the deviations caused by the equipment, the specimens were positioned and, under the same analysis conditions, seven measurements were carried out at 11 Ψ tilts. To verify sample positioning errors, seven measurements were performed, repositioning the sample before each measurement.
To check geometry errors, the measurements were repeated for the Bragg-Brentano and parallel-beam geometries. To verify the reproducibility of the method, the measurements were performed in two different laboratories on different equipment. The results were analyzed statistically and the errors quantified.
Keywords: residual stress, x-ray diffraction, repeatability, reproducibility, error analysis
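The repeatability and reproducibility figures such an interlaboratory study reports can be estimated with the classic one-way pooling scheme (in the spirit of ASTM E691). The stress readings below are made-up values for illustration, not the study's data:

```python
import statistics

def repeatability_sd(runs_per_lab):
    """Pooled within-lab standard deviation s_r (equal run counts per lab)."""
    within = [statistics.variance(runs) for runs in runs_per_lab]
    return (sum(within) / len(within)) ** 0.5

def reproducibility_sd(runs_per_lab):
    """Between-lab estimate s_R with s_R^2 = s_L^2 + s_r^2, where s_L^2 is
    the variance of lab means corrected for the within-lab contribution."""
    s_r2 = repeatability_sd(runs_per_lab) ** 2
    means = [statistics.mean(runs) for runs in runs_per_lab]
    n = len(runs_per_lab[0])                       # runs per laboratory
    s_L2 = max(statistics.variance(means) - s_r2 / n, 0.0)
    return (s_L2 + s_r2) ** 0.5

# Hypothetical residual-stress readings (MPa) from two laboratories
lab_a = [-312.0, -310.0, -314.0]
lab_b = [-302.0, -300.0, -304.0]
```

By construction s_R is never smaller than s_r, which matches the intuition that switching laboratories can only add scatter on top of the within-lab repeatability.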
Procedia PDF Downloads 181
887 Towards Human-Interpretable, Automated Learning of Feedback Control for the Mixing Layer
Authors: Hao Li, Guy Y. Cornejo Maceda, Yiqing Li, Jianguo Tan, Marek Morzynski, Bernd R. Noack
Abstract:
We propose an automated analysis of the flow control behaviour from an ensemble of control laws and associated time-resolved flow snapshots. The input may be the rich database of machine learning control (MLC) optimizing a feedback law for a cost function in the plant. The proposed methodology provides (1) insights into the control landscape, which maps control laws to performance, including extrema and ridge-lines, (2) a catalogue of representative flow states and their contribution to the cost function for the investigated control laws, and (3) visualization of the dynamics. Key enablers are classification and feature extraction methods of machine learning. The analysis is successfully applied to the stabilization of a mixing layer with sensor-based feedback driving an upstream actuator. The fluctuation energy is reduced by 26%. The control replaces unforced Kelvin-Helmholtz vortices with subsequent vortex pairing by higher-frequency Kelvin-Helmholtz structures of lower energy. These efforts target a human-interpretable, fully automated analysis of MLC identifying qualitatively different actuation regimes, distilling corresponding coherent structures, and developing a digital twin of the plant.
Keywords: machine learning control, mixing layer, feedback control, model-free control
Procedia PDF Downloads 223
886 Concubines, Handmaids Or Sister Wives: Polygamy In The Media, A Comparison Between The TV Dramas "The Legend of Zhen Huan", "The Handmaid’s Tale" And "Big Love"
Authors: Muriel Canas-Walker
Abstract:
Polygamy is a sensitive issue yet a surprisingly popular topic on television. In China, among other palace-intrigue dramas, "The Legend of Zhen Huan" stands out in its harsh portrayal of sequestered concubines in the Forbidden City. In the United States, the critically acclaimed "Big Love", set in the Mormon community, generated much discussion and controversy, both academically and on social media. More recently, "The Handmaid’s Tale", adapted from the famous novel by Canadian writer Margaret Atwood, also contributed to the topic. All three dramas feature the plight of women caught in a polygamous system and are particularly popular with female audiences. Using Foucault’s theory of power, visual anthropology, and a feminist perspective, this paper analyzes the treatment of this sensitive topic in the media and its reception. From the seemingly happy sister wives in "Big Love" to the fiercely competitive concubines in "The Legend of Zhen Huan" and the tragically coerced handmaids in "The Handmaid’s Tale", the lives of women in a polygamous system are compelling to modern audiences. This paper’s objective is to understand how the treatment of polygamy is relevant to these audiences.
Keywords: polygamy, Michel Foucault, feminism, visual anthropology
Procedia PDF Downloads 91
885 Exploratory Analysis and Development of Sustainable Lean Six Sigma Methodologies Integration for Effective Operation and Risk Mitigation in Manufacturing Sectors
Authors: Chukwumeka Daniel Ezeliora
Abstract:
The Nigerian manufacturing sector plays a pivotal role in the country's economic growth and development. However, it faces numerous challenges, including operational inefficiencies and inherent risks that hinder its sustainable growth. This research addresses these challenges by exploring the integration of Lean and Six Sigma methodologies into manufacturing processes, ultimately enhancing operational effectiveness and risk mitigation. The core of this research is the development of a sustainable Lean Six Sigma framework tailored to the specific needs and challenges of Nigeria's manufacturing environment. The framework aims to streamline processes, reduce waste, improve product quality, and enhance overall operational efficiency. It incorporates principles of sustainability to ensure that the proposed methodologies align with environmental and social responsibility goals. To validate the effectiveness of the integrated Lean Six Sigma approach, case studies and real-world applications were conducted within selected manufacturing companies in Nigeria. Data were collected to measure the impact of the integration on key performance indicators such as production efficiency, defect reduction, and risk mitigation. The findings provide valuable insights and practical recommendations for selected manufacturing companies in South East Nigeria. By adopting sustainable Lean Six Sigma methodologies, these organizations can optimize their operations, reduce operational risks, improve product quality, and enhance their competitiveness in the global market. In conclusion, this research bridges the gap between theory and practice by developing a comprehensive framework for integrating Lean and Six Sigma methodologies in Nigeria's manufacturing sector.
This integration is envisioned to contribute significantly to the sector's sustainable growth, improved operational efficiency, and effective risk mitigation strategies, ultimately benefiting the Nigerian economy as a whole.
Keywords: lean six sigma, manufacturing, risk mitigation, sustainability, operational efficiency
Procedia PDF Downloads 207
884 Ecocentric Principles for the Change of the Anthropocentric Design Within the Other Species Related Fields
Authors: Armando Cuspinera
Abstract:
Humans are part of nature, sharing the same ecosystems with non-human species, but praxis shows that not all relations are alike. Fields of design such as biomimicry, biodesign, and biophilic design take different approaches to nature; nevertheless, anthropocentric principles such as domination, objectivization, or exploitation are often reproduced in them alongside ecocentric principles that recognize the inherent worth of life itself. Anthropocentrism has left humanity with pollution of the earth, water, and air, and with the destruction of whole ecosystems through monocultures and the rampant production of useless objects; life cannot withstand this heedless rhythm oriented only toward human benefit. Even though the biosphere is by nature resilient, studies cited in the Paris Agreement warn that humanity will perish if this unconscious praxis continues. It is therefore important to differentiate anthropocentric from ecocentric principles in the praxis of design: to foster respect, appreciation, and positive affectivity toward other life forms, it is necessary to analyze which principles each design practice reproduces. Only through the study of the immaterial dimensions of design, such as symbolism, epistemology, and ontology, can the relation to nature be redesigned; to do so, it must be studied through ontological design which principles, anthropocentric or ecocentric, the objects we make enhance in our perception of our surroundings. "The things we design also design us" is the principle of ontological design, and in order to develop an ecological design practice in which other species can be considered users, designers, or collaborators, the study must extend to other living forms from a transdisciplinary perspective of techniques, knowledge, practices, and disciplines in general.
Materials, technologies, and knowledge of any kind share the character of a tool: neither good nor bad in itself, its possibilities lie in how it is used. Collaboration among disciplines and fields of study offers the opportunity to connect principles from other currents, such as deep ecology and the environmental humanities, in developing design methodologies that study nature, integrate its strategies into our own species, and treat the lives of other species as being as important as human life. Only from the study of ontological design can the material and immaterial dimensions be analyzed and imbued with structures that already exist in other fields.
Keywords: design, anthropocentrism, ecocentrism, ontological design
Procedia PDF Downloads 156
883 A Genre-Based Approach to the Teaching of Pronunciation
Authors: Marden Silva, Danielle Guerra
Abstract:
Some studies have indicated that pronunciation teaching has not received enough attention from teachers in EFL contexts. In particular, addressing segmental and suprasegmental features through a genre-based approach may offer an opportunity to integrate pronunciation into more meaningful learning practice. The aim of this project was therefore to survey the aspects of English pronunciation that Brazilian students consider most difficult to learn, enabling a discussion of strategies that can facilitate the development of oral skills in English classes by integrating the teaching of phonetic-phonological aspects into the genre-based approach. Notions of intelligibility, fluency, and accuracy were proposed by some authors as an ideal didactic sequence. According to their proposals, basic learners should be exposed to activities focused on the notion of intelligibility, intermediate students to the notion of fluency, and finally more advanced ones to accuracy practices. To test this hypothesis, data collection was conducted during three high school English classes at the Federal Center for Technological Education of Minas Gerais (CEFET-MG), in Brazil, through questionnaires and didactic activities, which were recorded and transcribed for further analysis. The genre "debate" was chosen to help the participants express themselves more freely, answering questions and giving their opinions on a previously selected topic. The findings indicated that basic students had more difficulty with aspects of English pronunciation than the others: many of the intelligibility aspects analyzed had to be listened to more than once for a better understanding.
For intermediate students, the recorded speech was considerably easier to understand, but they found it harder to pronounce the words fluently, often interrupting their speech to think about what they were going to say and how they would say it. Lastly, more advanced learners seemed to express their ideas more fluently, but subtle errors related to accuracy were still perceptible in their speech, thereby confirming the proposed hypothesis. Using a genre-based approach to promote oral communication in English classes thus appears to be a relevant method, considering the socio-communicative function inherent in the approach.
Keywords: EFL, genre-based approach, oral skills, pronunciation
Procedia PDF Downloads 130
882 Monitoring Energy Reduction through Applying Green Roofs to Residential Buildings in Dubai
Authors: Hanan M. Taleb
Abstract:
Since buildings are a major consumer of energy, their potential impact on the environment is considerable. Therefore, expanding the application of low-energy architecture is of the utmost importance. Designing with nature is also one of the most attractive methods of design for many architects and designers because it creates a pathway to sustainability. One feature of designing with nature is green roofing, which covers the roof with vegetation either partially or completely. Green roofing has many advantages, including absorbing rainwater, providing thermal insulation, enhancing the ecology, creating a peaceful retreat for people and animals, improving air quality, and helping to offset the air temperature and heat island effect. The aim of this paper is to monitor the energy saved in residential buildings in Dubai after applying green roofing techniques. The paper also provides a thermal analysis after the application of green roofs. A villa in Dubai was chosen as a case study. With the aid of energy simulation software, namely DesignBuilder, as well as manual recording and calculations, the energy savings after applying the green roofing were determined. On that basis, the paper draws some recommendations regarding the types of green roofing that should be used in these particular climatic conditions, based on a real experiment that took place over a one-year period.
Keywords: residential buildings, Dubai, energy saving, green roofing, CFD, thermal comfort
Procedia PDF Downloads 299
881 Detecting and Thwarting Interest Flooding Attack in Information Centric Network
Authors: Vimala Rani P, Narasimha Malikarjunan, Mercy Shalinie S
Abstract:
Named Data Networking was brought forth as an instantiation of information-centric networking. Attackers can send a colossal number of spoofed Interests to take hold of the Pending Interest Table (PIT), an attack named the Interest Flooding Attack (IFA), since incoming Interests are recorded in the PITs of the intermediate routers until the corresponding Data packets arrive or the entries exceed their time limit. These attacks can be detrimental to network performance. Traditional IFA detection techniques rely on the PIT expiration rate or the Interest satisfaction rate, criteria that cannot reliably distinguish an IFA from legitimate traffic. Threshold-based traditional methods can also be unduly affected by the choice of threshold values. This article proposes an accurate IFA detection mechanism based on a Multiple Feature-based Extreme Learning Machine (MF-ELM). The accuracy of attack detection can be increased by presenting the entropy of Interest names, the Interest satisfaction rate, and PIT usage as features extracted for the MF-ELM classifier. Furthermore, we deploy a queue-based hostile Interest prefix mitigation mechanism. The inference from this real-time test bed is that the mechanism can help the network resist IFA with higher accuracy and efficiency.
Keywords: information-centric network, pending interest table, interest flooding attack, MF-ELM classifier, queue-based mitigation strategy
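As a rough illustration of the features named above, the entropy of requested Interest names, the satisfaction rate, and PIT usage can be computed per time window as follows. The per-window framing and the exact feature definitions are illustrative assumptions, not the paper's specification:

```python
import math
from collections import Counter

def name_entropy(interest_names):
    """Shannon entropy of the Interest-name distribution in one time window.
    Randomly spoofed names spread probability mass thin, driving entropy up."""
    counts = Counter(interest_names)
    total = sum(counts.values())
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

def extract_features(interest_names, satisfied, pit_used, pit_capacity):
    """Feature vector a classifier like MF-ELM could consume:
    (name entropy, Interest satisfaction rate, PIT usage ratio)."""
    sat_rate = satisfied / len(interest_names) if interest_names else 0.0
    return (name_entropy(interest_names), sat_rate, pit_used / pit_capacity)
```

Under attack, repeated legitimate names give near-zero entropy while unique spoofed names push entropy toward log2 of the window size, so the feature separates the two regimes cleanly.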
Procedia PDF Downloads 205
880 How Unicode Glyphs Revolutionized the Way We Communicate
Authors: Levi Corallo
Abstract:
Typed language produced by humans on computers and cell phones differs markedly from previous modes of written language exchange. While acronyms remain one of the most predominant markers of typed language, another and perhaps more recent revolution in the way humans communicate has been the use of symbols or glyphs, primarily Emojis, introduced globally on the iPhone keyboard by Apple in 2008. This paper analyzes the use of symbols in typed communication from both a linguistic and a machine learning perspective. The Unicode system is explored, and methods of encoding are juxtaposed with current machine and human perception. The paper examines how typed symbols function in conversation, as well as current research methods involving Emojis, such as sentiment analysis and predictive text models. This study proposes that sequential analysis is a significant feature for analyzing Unicode characters in a corpus with machine learning: current models that try to learn or translate the meaning of Emojis should learn from bi- and tri-grams of Emoji and observe the relationships between combinations of different Emoji used in tandem. The sociolinguistics of an entirely new vernacular, referred to here as ‘typed language’, is also delineated throughout the analysis of Unicode glyphs from both a semantic and a technical perspective.
Keywords: unicode, text symbols, emojis, glyphs, communication
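The bi- and tri-gram idea can be sketched with the Python standard library alone; testing for the Unicode `Symbol, other` (`So`) category is a crude stand-in for a full emoji detector, used here only as an assumption for illustration:

```python
import unicodedata
from collections import Counter

def is_emoji(ch):
    """Rough heuristic: treat 'So' (Symbol, other) code points as emoji."""
    return unicodedata.category(ch) == "So"

def emoji_ngrams(text, n=2):
    """Count n-grams over the emoji subsequence of a message, preserving
    order, so a model can learn from emoji-to-emoji context rather than
    from isolated glyphs."""
    glyphs = [ch for ch in text if is_emoji(ch)]
    return Counter(tuple(glyphs[i:i + n]) for i in range(len(glyphs) - n + 1))
```

Aggregating these counts over a corpus yields exactly the sequential statistics the abstract argues for, e.g. which glyph tends to follow which.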
Procedia PDF Downloads 194
879 Effects of Nitroxin Fertilizer on Physiological Characters Forage Millet under Drought Stress Conditions
Authors: Mohammad Darbani, Jafar Masoud Sinaki, Armaghan Abedzadeh Neyshaburi
Abstract:
An experiment was conducted as a split-plot factorial arrangement in a randomized complete block design in Damghan in 2012-2013 in order to investigate the effects of irrigation cut-off (based on the phenological stages of the plants) on physiological properties of forage millet cultivars. The treatments included three irrigation levels (control with full irrigation, irrigation cut-off when flowering started, and irrigation cut-off when flowering ended) in the main plots, and applying nitroxin biofertilizer (+), not applying nitroxin biofertilizer (control), and Iranian forage millet cultivars (Bastan, Pishahang, and Isfahan) in the subplots. The highest ash and water-soluble carbohydrate contents were observed in the cultivar Bastan (8.22 and 8.91%, respectively), the highest fiber and water contents (74.17 and 48.83%, respectively) in the treatment of irrigation cut-off when flowering started, and the largest proline concentration (μmol/g FW) was also seen in the treatment of irrigation cut-off when flowering started. The very rapid growth of millet, its short growing season, drought tolerance, unique harvest-time characteristics, and response to nitroxin biofertilizer can help expand its cultivation in the arid and semi-arid regions of Iran.
Keywords: irrigation cut-off, forage millet, Nitroxin fertilizer, physiological properties
Procedia PDF Downloads 609
878 MhAGCN: Multi-Head Attention Graph Convolutional Network for Web Services Classification
Authors: Bing Li, Zhi Li, Yilong Yang
Abstract:
Web service classification can promote the quality of service discovery and management in the service repository and is widely used to locate the services developers want. Although traditional classification methods based on supervised learning can perform the task, developers need to tag web services manually, and the quality of these tags may not be sufficient to build an accurate classifier. With the number of web services doubling, the manual tagging method has become unrealistic. In recent years, the attention mechanism has made remarkable progress in deep learning, and its huge potential has been demonstrated in various fields. This paper designs a multi-head attention graph convolutional network (MHAGCN) service classification method, which can assign different weights to neighborhood nodes without complicated matrix operations or relying on knowledge of the entire graph structure. The framework combines the advantages of the attention mechanism and the graph convolutional neural network and can classify web services through automatic feature extraction. Comprehensive experimental results on a real dataset not only show the superior performance of the proposed model over existing models but also demonstrate its potentially good interpretability for graph analysis.
Keywords: attention mechanism, graph convolutional network, interpretability, service classification, service discovery
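The neighborhood-weighting idea can be illustrated with a dependency-free dot-product attention head. A real MHAGCN layer would apply a learned projection per head and a nonlinearity; both are omitted here as simplifying assumptions, so the sketch only shows how softmax scores become per-neighbor weights:

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of scores."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def attend(query, neighbor_feats):
    """One head: dot-product scores between a node and its neighbors,
    softmax-normalized into weights, then a weighted sum of neighbor
    features -- the 'different weights to neighborhood nodes' step."""
    scores = [sum(q * f for q, f in zip(query, feat)) for feat in neighbor_feats]
    weights = softmax(scores)
    dim = len(query)
    return [sum(w * feat[d] for w, feat in zip(weights, neighbor_feats))
            for d in range(dim)]

def multi_head(query, neighbor_feats, heads):
    """Concatenate head outputs; here the heads are identical copies,
    purely to show the concatenated output shape."""
    return [v for _ in range(heads) for v in attend(query, neighbor_feats)]
```

Note that no global adjacency matrix is touched: each node only sees its own neighbor list, which is the locality property the abstract emphasizes.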
Procedia PDF Downloads 135
877 Spatio-Temporal Data Mining with Association Rules for Lake Van
Authors: Tolga Aydin, M. Fatih Alaeddinoğlu
Abstract:
Throughout history, people have made estimates and inferences about the future by using their past experiences. Developing information technologies and improvements in database management systems make it possible to extract useful information from the knowledge at hand for strategic decisions, and different methods have been developed for this purpose. Data mining by association rule learning is one such method. The Apriori algorithm, one of the well-known association rule learning algorithms, is not commonly used on spatio-temporal data sets. However, it is possible to embed time and space features into the data sets and make Apriori a suitable technique for learning spatio-temporal association rules. Lake Van, the largest lake in Turkey, is a closed basin, which causes the volume of the lake to increase or decrease with changes in the amount of water it holds. In this study, evaporation, humidity, lake altitude, rainfall, and temperature parameters recorded in the Lake Van region over the years are processed by the Apriori algorithm, and a spatio-temporal data mining application is developed to identify overflows and newly formed soil regions (underflows) occurring in the coastal parts of Lake Van. Identifying the possible causes of overflows and underflows may help alert experts to take precautions and make the necessary investments.
Keywords: apriori algorithm, association rules, data mining, spatio-temporal data
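Embedding space and time as ordinary items makes a textbook Apriori pass sufficient; the item names ("season=...", "event=...") and the support threshold below are illustrative assumptions, not taken from the study:

```python
def apriori(transactions, min_support):
    """Plain Apriori: grow frequent itemsets level by level, pruning any
    candidate whose support falls below min_support. Returns a dict
    mapping each frequent itemset to its support."""
    transactions = [frozenset(t) for t in transactions]

    def support(s):
        return sum(1 for t in transactions if s <= t) / len(transactions)

    items = {frozenset([i]) for t in transactions for i in t}
    frequent = {}
    level = {s for s in items if support(s) >= min_support}
    while level:
        frequent.update({s: support(s) for s in level})
        # Join step: unions one item larger than the current level's sets
        level = {a | b for a in level for b in level
                 if len(a | b) == len(next(iter(level))) + 1
                 and support(a | b) >= min_support}
    return frequent

# Space and time become ordinary items (e.g. "region=coastal",
# "season=spring"), so the same pass mines spatio-temporal rules such as
# {season=spring, rainfall=high} -> {event=overflow}.
```

Rules are then read off the frequent itemsets by comparing the support of an itemset to the support of its antecedent (confidence), which is standard association-rule post-processing.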
Procedia PDF Downloads 374
876 Characterization and Effect of Using Pumpkin Seeds Oil Methyl Ester (PSME) as Fuel in a LHR Diesel Engine
Authors: Hanbey Hazar, Hakan Gul, Ugur Ozturk
Abstract:
In order to decrease the hazardous emissions of internal combustion engines and to improve combustion and thermal efficiency, thermal barrier coatings are applied. In this experimental study, the cylinder, piston, exhaust valve, and inlet valve, which are combustion chamber components, were coated with a ceramic material, giving the engine its low heat rejection (LHR) characteristic. The cylinder, exhaust and inlet valves of the diesel engine used in the tests were coated with ekabor-2 commercial powder, a ceramic material, to a thickness of 50 µm using the boriding method. The piston of the diesel engine was coated to a thickness of 300 µm with a boron-based powder using the plasma coating method. Pumpkin seeds oil methyl ester (PSME) was produced by the transesterification method. In addition, dimethoxymethane was used as an additive to improve the properties of diesel fuel, pumpkin seeds oil methyl ester (PSME) and their mixture. Dimethoxymethane was blended with the test fuels, used as a pilot fuel, at volumetric ratios of 4% and 8%. Due to the thermal barrier coating, the diesel engine's CO, HC, and smoke density values decreased, but NOx and exhaust gas temperature (EGT) increased.Keywords: boriding, diesel engine, exhaust emission, thermal barrier coating
Procedia PDF Downloads 477875 Clinical Profile of Renal Diseases in Children in Tertiary Care Centre
Authors: Jyoti Agrawal
Abstract:
Introduction: Renal diseases in children and young adults can be difficult to diagnose early, as they may present with only a few symptoms, tend to follow a different course than in adults, and respond variably to different treatments. The pattern of renal disease in children differs between developing and developed countries. Methods: This was a hospital-based prospective observational study carried out from March 2014 to February 2015 at the BP Koirala Institute of Health Sciences. Patients with renal disease, both inpatient and outpatient, from birth to 14 years of age were enrolled in the study. The diagnosis of renal disease was made on clinical and laboratory criteria. Results: A total of 120 patients were enrolled in our study, contributing 3.74% of total admissions. The commonest presenting feature was edema (75%), followed by fever (65%), hypertension (60%), decreased urine output (45%) and hematuria (25%). The most common diagnosis was acute glomerulonephritis (40%), followed by nephrotic syndrome (25%) and urinary tract infection (25%). Renal biopsy was done in 10% of cases, most of them steroid-dependent nephrotic syndrome. 5% of our cases died of multiorgan dysfunction syndrome, sepsis and acute kidney injury. Conclusion: Renal disease contributes a large part of pediatric hospital admissions as well as mortality and morbidity in children.Keywords: glomerulonephritis, nephrotic syndrome, renal disease, urinary tract infection
Procedia PDF Downloads 426874 Robust Recognition of Locomotion Patterns via Data-Driven Machine Learning in the Cloud Environment
Authors: Shinoy Vengaramkode Bhaskaran, Kaushik Sathupadi, Sandesh Achar
Abstract:
Human locomotion recognition is important in a variety of sectors, such as robotics, security, healthcare, fitness tracking and cloud computing. With the increasing pervasiveness of peripheral devices, particularly Inertial Measurement Unit (IMU) sensors, researchers have attempted to exploit these advancements in order to precisely and efficiently identify and categorize human activities. This research paper introduces a state-of-the-art methodology for the recognition of human locomotion patterns in a cloud environment. The methodology is based on a publicly available benchmark dataset. The investigation implements a denoising and windowing strategy to deal with the unprocessed data. Next, feature extraction is adopted to abstract the main cues from the data. The SelectKBest strategy is used to select optimal features from the data. Furthermore, state-of-the-art ML classifiers, including logistic regression, random forest, gradient boosting and SVM, are investigated to accomplish precise locomotion classification and evaluate the performance of the system. Finally, a detailed comparative analysis of results is presented to reveal the performance of the recognition models.Keywords: artificial intelligence, cloud computing, IoT, human locomotion, gradient boosting, random forest, neural networks, body-worn sensors
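The preprocessing pipeline described, denoising, windowing, then statistical feature extraction from IMU-style signals, might look like this minimal NumPy sketch. The filter width, window length, and synthetic accelerometer trace are assumptions for illustration; the paper's actual dataset and feature set are not reproduced here:

```python
import numpy as np

def moving_average(signal, k=5):
    # simple denoising: centred moving-average filter
    kernel = np.ones(k) / k
    return np.convolve(signal, kernel, mode="same")

def window(signal, size, step):
    # fixed-size sliding windows with overlap
    starts = range(0, len(signal) - size + 1, step)
    return np.stack([signal[s:s + size] for s in starts])

def extract_features(windows):
    # per-window statistical cues commonly used for IMU data
    return np.column_stack([
        windows.mean(axis=1),
        windows.std(axis=1),
        np.abs(np.diff(windows, axis=1)).mean(axis=1),  # mean abs. difference
    ])

rng = np.random.default_rng(1)
t = np.linspace(0, 10, 1000)
# noisy periodic signal standing in for one accelerometer axis while walking
accel = np.sin(2 * np.pi * 1.5 * t) + 0.3 * rng.normal(size=t.size)
clean = moving_average(accel, k=7)
W = window(clean, size=100, step=50)   # 50 % overlap between windows
F = extract_features(W)
print(W.shape, F.shape)  # (19, 100) (19, 3)
```

The resulting feature matrix `F` is what a SelectKBest-style scorer and the downstream classifiers would consume, one row per window with its activity label.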
Procedia PDF Downloads 11873 Introducing α-Oxoester (COBz) as a Protecting Group for Carbohydrates
Authors: Atul Kumar, Veeranjaneyulu Gannedi, Qazi Naveed Ahmed
Abstract:
Oligosaccharides, which are essential to all cellular organisms, play vital roles in cell recognition and signaling and are involved in a broad range of biological processes. The chemical synthesis of carbohydrates is a powerful tool for providing homogeneous glycans. In carbohydrate synthesis, the major concern is the orthogonal protection of hydroxyl groups so that they can be unmasked independently. Classical protecting groups include benzyl ethers (Bn), which are normally cleaved through hydrogenolysis or by means of metal reduction, and acetate (Ac), benzoate (Bz) or pivaloate esters, which are removed by base-promoted hydrolysis. In the present work, a series of α-oxoester (COBz) protected saccharides, with divergent base sensitivity profiles against benzoyl (Bz) and acetyl (Ac), were designed, and KHSO₅/CH₃COCl in methanol was identified as an easy, mild, selective and efficient deprotecting reagent for their removal in the context of carbohydrate synthesis. Timely monitoring of the latter reagent was advantageous in establishing both sequential and simultaneous deprotection of COBz, Bz, and Ac. The salient feature of our work is the ease with which different acceptors can be generated from the designed monosaccharides. In summary, we demonstrated the α-oxoester (COBz) as a new protecting group for carbohydrates; application of this group to the synthesis of glycosylphosphatidylinositol (GPI) anchors is in progress.Keywords: α-Oxoester, oligosaccharides, new protecting group, acceptor synthesis, glycosylation
Procedia PDF Downloads 150872 Improving the Uniformity of Electrostatic Meter’s Spatial Sensitivity
Authors: Mohamed Abdalla, Ruixue Cheng, Jianyong Zhang
Abstract:
In pneumatic conveying, solids are mixed with air or gas. In industries such as coal-fired power stations, blast furnaces for iron making, and cement and flour processing, the mass flow rate of solids needs to be monitored or controlled. However, current gas-solids two-phase flow measurement techniques are not as accurate as the flow meters available for single-phase flow. One of the problems that multi-phase flow meters face is that flow profiles vary with measurement location and with the conditions of pipe routing, bends, elbows and other restriction devices in the conveying system, as well as with conveying velocity and concentration. To measure solids flow rate or concentration with a non-even distribution of solids in gas, a uniform spatial sensitivity is required for a multi-phase flow meter. However, few meters inherently have such a property. The circular electrostatic meter is a popular choice for gas-solids flow measurement owing to its high sensitivity to flow, robust construction, low installation cost and non-intrusive nature. However, such meters have an inherently non-uniform spatial sensitivity. This paper first analyses the spatial sensitivity of the circular electrostatic meter in general and then, by combining the effect of the sensitivity to a single particle with the sensing volume for a given electrode geometry, reveals for the first time how a circular electrostatic meter responds to a roping flow stream, which is much more complex than is believed at present. The paper presents recent research findings on spatial sensitivity from the University of Teesside based on finite element analysis using Ansys Fluent software, including time and frequency domain characteristics and the effect of electrode geometry. The simulation results are compared to experimental results obtained on a large-scale (14" diameter) rig. The purpose of this research is to pave a way to achieve a uniform spatial sensitivity for the circular electrostatic sensor by means of compensation, so as to improve the overall accuracy of gas-solids flow measurement.Keywords: spatial sensitivity, electrostatic sensor, pneumatic conveying, Ansys Fluent software
Procedia PDF Downloads 367871 Praxis-Oriented Pedagogies for Pre-Service Teachers: Teaching About and For Social Justice Through Equity Literature Circles
Authors: Joanne Robertson, Awneet Sivia
Abstract:
Preparing aspiring teachers to become advocates for social justice reflects a fundamental commitment for teacher education programs in Canada to create systemic educational change. The goal is ultimately to address inequities in K-12 education for students from multiple identity groups that have historically been marginalized and oppressed in schools. Social justice is described as an often undertheorized and vague concept in the literature, which increases the risk that teaching for social justice remains a lofty goal. Another concern is that the social justice agenda in teacher education in North America ignores pedagogies related to subject-matter knowledge and discipline-based teaching methods. The question surrounding how teacher education programs can address these issues forms the basis for the research undertaken in this study. The paper focuses on a qualitative research project that examines how an Equity Literature Circles (ELC) framework within a language arts methods course in a Bachelor of Education program may help pre-service teachers better understand the inherent relationship between literacy instructional practices and teaching about and for social justice. Grounded in the Freireian (2018) principle of praxis, this study specifically seeks to understand the impact of Equity Literature Circles on pre-service teachers’ understanding of current social justice issues (reflection), their development of professional competencies in literacy instruction (practice), and their identity as advocates of social justice (action) who address issues related to student diversity, equity, and human rights within the English Language Arts program. 
In this paper presentation, participants will be provided with an overview of the Equity Literature Circle framework, a summary of key findings and recommendations from the qualitative study, an annotated bibliography of suggested Young Adult novels, and opportunities for questions and dialogue.Keywords: literacy, language, equity, social justice, diversity, human rights
Procedia PDF Downloads 69870 Functional Characterization of Transcriptional Regulator WhiB Proteins of Mycobacterium Tuberculosis
Authors: Sonam Kumari
Abstract:
Mycobacterium tuberculosis (Mtb), the causative agent of tuberculosis, possesses the remarkable ability to enter into and emerge from a persistent state. The mechanism by which Mtb switches from the dormant state to the replicative form is still poorly characterized. Proteome studies have given us insight into the role of certain proteins in conferring Mtb's stupendous virulence, but numerous dots remain unconnected and unaccounted for. One such group is the WhiB family of proteins, which is associated with developmental processes in actinomycetes. Mtb has seven such proteins (WhiB1 to WhiB7). WhiB proteins are transcriptional regulators; their conserved C-terminal HTH motif is involved in DNA binding. They regulate various essential genes of Mtb by binding to their promoter DNA. The biophysical effect of DNA binding on WhiB proteins has not yet been properly characterized. Interaction with DNA induces conformational changes in the WhiB proteins, as confirmed by steady-state fluorescence and circular dichroism spectroscopy. ITC was used to deduce the thermodynamic parameters and binding affinity of the interaction. Since these transcription factors are highly unstable in vitro, their stability and solubility were enhanced by the co-expression of molecular chaperones. The findings of the present study help determine the conditions under which the WhiB proteins interact with their binding partners and the factors that influence their binding affinity. This is crucial for understanding their role in regulating gene expression in Mtb and for targeting WhiB proteins as drug targets to cure TB.Keywords: tuberculosis, WhiB proteins, mycobacterium tuberculosis, nucleic acid binding
Procedia PDF Downloads 104869 The Material-Process Perspective: Design and Engineering
Authors: Lars Andersen
Abstract:
The development of design and engineering in large construction projects is characterized by an increased flattening out of formal structures, extended use of parallel and integrated processes ('Integrated Concurrent Engineering') and an increased number of expert disciplines. The integration process is based on ongoing collaboration, dialogue, intercommunication and comments on each other's work (iterations). This process, based on reciprocal communication between actors and disciplines, triggers value creation. However, communication between equals is not in itself sufficient to create effective decision making. The complexity of the process and time pressure contribute to an increased risk of a deficit of decisions and loss of process control. The paper refers to a study that aims at developing a resilient decision-making system that does not conflict with communication processes based on equality between the disciplines in the process. The study covers the construction of a hospital through the phases of design, engineering and physical building. The research method is a combination of formative process research, process tracking and phenomenological analysis. The study traced challenges and problems in the building process back to the projection substrates (drawings and models) and further to the organization of the engineering and design phase. A comparative analysis of traditional and new ways of organizing the projection work made it possible to uncover an implicit material order, or structure, in the process. This uncovering implied the development of a material-process perspective. According to this perspective, the complexity of the process is rooted in material-functional differentiation. This differentiation presupposes a structuring material (the skeleton of the building) that coordinates the other types of material. Each expert discipline's competence is related to one or a set of materials. The architect, the consulting engineer for construction, etc., have their competencies related to the structuring material and, inherent in this, coordination competence. When dialogues between the disciplines concerning the coordination between them do not result in agreement, the disciplines with responsibility for the structuring material decide the interface issues. Based on these premises, this paper develops a self-organized, expert-driven interdisciplinary decision-making system.Keywords: collaboration, complexity, design, engineering, materiality
Procedia PDF Downloads 221868 An Adaptive Dimensionality Reduction Approach for Hyperspectral Imagery Semantic Interpretation
Authors: Akrem Sellami, Imed Riadh Farah, Basel Solaiman
Abstract:
With the development of HyperSpectral Imagery (HSI) technology, the spectral resolution of HSI has become denser, resulting in a large number of spectral bands, high correlation between neighboring bands, and high data redundancy. Semantic interpretation is thus a challenging task for HSI analysis due to the high dimensionality and the high correlation of the different spectral bands. This work presents a dimensionality reduction approach that overcomes these issues and improves the semantic interpretation of HSI. In order to preserve the spatial information, the Tensor Locality Preserving Projection (TLPP) is applied to transform the original HSI. In the second step, knowledge is extracted based on the adjacency graph to describe the different pixels. Based on the transformation matrix obtained with TLPP, a weighted matrix is constructed to rank the different spectral bands by their contribution scores. The relevant bands are then adaptively selected based on the weighted matrix. The performance of the presented approach has been validated in several experiments, and the obtained results demonstrate its efficiency compared to various existing dimensionality reduction techniques. According to the experimental results, we can conclude that this approach can adaptively select the relevant spectral bands, improving the semantic interpretation of HSI.Keywords: band selection, dimensionality reduction, feature extraction, hyperspectral imagery, semantic interpretation
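The ranking idea, learning a locality-preserving projection and then scoring each spectral band by its weight in the projection matrix, can be sketched in a simplified, flat (non-tensor) form. This is a stand-in for the paper's TLPP-based weighted matrix, not its actual algorithm; the graph construction, regularisation, and toy data are assumptions:

```python
import numpy as np

def lpp_band_scores(X, k_neighbors=4, n_components=3):
    """Flat LPP sketch: fit a locality-preserving projection over pixels,
    then score each band by its total weight magnitude across the
    projection directions."""
    n, d = X.shape
    # adjacency graph over pixels: k nearest neighbours, binary weights
    dist = np.linalg.norm(X[:, None] - X[None, :], axis=2)
    W = np.zeros((n, n))
    for i in range(n):
        for j in np.argsort(dist[i])[1:k_neighbors + 1]:
            W[i, j] = W[j, i] = 1.0
    D = np.diag(W.sum(axis=1))
    L = D - W                               # graph Laplacian
    A = X.T @ L @ X
    B = X.T @ D @ X + 1e-6 * np.eye(d)      # regularised for stability
    # generalized eigenproblem A v = lambda B v, smallest eigenvalues first
    vals, vecs = np.linalg.eig(np.linalg.solve(B, A))
    order = np.argsort(vals.real)
    P = vecs[:, order[:n_components]].real  # (d, n_components) projection
    return np.abs(P).sum(axis=1)            # per-band contribution score

rng = np.random.default_rng(2)
n_pixels, n_bands = 40, 6
X = rng.normal(size=(n_pixels, n_bands))    # toy stand-in for HSI pixels
X[:, 0] = X[:, 1] + 0.01 * rng.normal(size=n_pixels)  # band 0 ~ redundant
scores = lpp_band_scores(X)
print(scores.shape)  # (6,): one contribution score per band
```

Band selection then amounts to keeping the top-scoring bands; the tensor version additionally preserves the spatial arrangement of pixels, which this flat sketch discards.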
Procedia PDF Downloads 354867 Experimental Investigation of Partially Premixed Laminar Methane/Air Co-Flow Flames Using Mach-Zehnder Interferometry
Authors: Misagh Irandoost Shahrestani, Mehdi Ashjaee, Shahrokh Zandieh Vakili
Abstract:
In this paper, a partially premixed laminar methane/air co-flow flame is studied experimentally. The methane-air flame was established on an axisymmetric coannular burner. The fuel-air jet flows from the central tube, while the secondary air flows from the region between the inner and outer tubes. The aim is to investigate the flame features and to develop a nonintrusive method for temperature measurement of a methane/air partially premixed flame using Mach-Zehnder interferometry. Different equivalence ratios and Reynolds numbers are considered. The generic visible appearance of the flame was also investigated and its various structures studied. Three distinct flame regimes were seen based on appearance. A double flame structure is observed for equivalence ratios in the range 1<Φ<2.1. By adding air to the mixture up to Φ=4, the flame has the characteristics of both premixed and non-premixed flames. Finally, for 4<Φ<∞, the flame becomes mainly non-premixed-like, and the luminous sooting region at its tip is the obvious feature of this type of flame. The Mach-Zehnder method obtains the temperature field of a transparent fluid by means of the index of refraction. Temperatures obtained from the optical technique were compared with those obtained from thermocouples in order to validate the results. Good agreement was observed between the two methods.Keywords: flame structure, Mach-Zehnder interferometry, partially premixed flame, temperature field
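The chain from interferogram to temperature, fringe shift to refractive index to gas density to temperature, can be sketched for a simple line-of-sight case using the Gladstone-Dale relation and the ideal gas law. The constants, fringe count, and path length below are illustrative assumptions, and the Abel inversion needed for a real axisymmetric flame is omitted:

```python
import numpy as np

GLADSTONE_DALE_AIR = 2.26e-4  # m^3/kg, approximate for air in visible light

def index_from_fringe_shift(delta_N, path_length, wavelength, n_ref):
    # Mach-Zehnder: a shift of delta_N fringes over optical path L means
    # (n - n_ref) * L = delta_N * lambda
    return n_ref + delta_N * wavelength / path_length

def temperature_from_index(n, pressure=101325.0, R_specific=287.0):
    """Invert the Gladstone-Dale relation n - 1 = K * rho together with the
    ideal gas law rho = p / (R T) to recover temperature."""
    rho = (n - 1.0) / GLADSTONE_DALE_AIR
    return pressure / (R_specific * rho)

n_ref = 1.000271                 # ambient air near 293 K (approximate)
# hot gas lowers density, so fringes shift; -10 fringes over a 5 cm path
n_hot = index_from_fringe_shift(delta_N=-10.0, path_length=0.05,
                                wavelength=632.8e-9, n_ref=n_ref)
T_hot = temperature_from_index(n_hot)
print(round(T_hot))  # roughly 550 K for these assumed values
```

In the actual measurement, the fringe pattern gives a path-integrated phase, so an Abel inversion over the axisymmetric flame is applied first to obtain the local refractive index before this temperature conversion.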
Procedia PDF Downloads 481866 Different Sampling Schemes for Semi-Parametric Frailty Model
Authors: Nursel Koyuncu, Nihal Ata Tutkun
Abstract:
The frailty model is a survival model that takes into account unobserved heterogeneity when exploring the relationship between the survival of an individual and several covariates. In recent years, proposed survival models have become more complex, and this feature causes convergence problems, especially in large data sets. Therefore, the selection of a sample from these big data sets is very important for parameter estimation. In the sampling literature, some authors have defined new sampling schemes to predict the parameters correctly. To this end, we examine the effect of sampling design in the semi-parametric frailty model. We conducted a simulation study in R to estimate the parameters of the semi-parametric frailty model for different sample sizes and censoring rates under classical simple random sampling and ranked set sampling schemes. In the simulation study, we used as the population a data set recording 17,260 male civil servants aged 40-64 years with complete 10-year follow-up. Time to death from coronary heart disease is treated as the survival time, and age and systolic blood pressure are used as covariates. We selected 1000 samples from the population using the different sampling schemes and estimated the parameters. From the simulation study, we concluded that the ranked set sampling design performs better than simple random sampling in each scenario.Keywords: frailty model, ranked set sampling, efficiency, simple random sampling
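The ranked set sampling scheme being compared can be sketched directly: for a set size m, draw m random sets of m units, rank each set, and keep the i-th ranked unit from the i-th set, repeating over cycles. The study was done in R; this illustrative Python sketch uses a synthetic stand-in for the covariate (the population parameters, set size, and cycle count are assumptions):

```python
import numpy as np

def ranked_set_sample(population, m, cycles, rng):
    """One RSS draw: per cycle, form m random judgment sets of size m,
    rank each set, and keep the i-th ranked unit from the i-th set."""
    sample = []
    for _ in range(cycles):
        for i in range(m):
            judgment_set = rng.choice(population, size=m, replace=False)
            sample.append(np.sort(judgment_set)[i])
    return np.array(sample)

rng = np.random.default_rng(3)
# synthetic stand-in for a survival covariate such as systolic blood pressure
population = rng.normal(loc=136.0, scale=18.0, size=17260)
m, cycles = 4, 250                          # yields a 1000-unit sample
rss = ranked_set_sample(population, m, cycles, rng)
srs = rng.choice(population, size=m * cycles, replace=False)
print(rss.size, srs.size)  # 1000 1000
```

The appeal of RSS is that, for the same sample size, its mean estimator is typically less variable than the simple random sampling estimator, which is the efficiency gain the study's frailty-model comparison exploits.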
Procedia PDF Downloads 211865 Effects of Process Parameter Variation on the Surface Roughness of Rapid Prototyped Samples Using Design of Experiments
Authors: R. Noorani, K. Peerless, J. Mandrell, A. Lopez, R. Dalberto, M. Alzebaq
Abstract:
Rapid prototyping (RP) is an additive manufacturing technology used in industry that works by systematically depositing layers of working material to construct larger, computer-modeled parts. A key challenge associated with this technology is that RP parts often feature undesirable levels of surface roughness for certain applications. To combat this phenomenon, an experimental technique called Design of Experiments (DOE) can be employed during the growth procedure to statistically analyze which RP growth parameters are most influential on part surface roughness. Utilizing DOE to identify such factors is important because it is a technique that can be used to optimize a manufacturing process, which saves time and money and increases product quality. In this study, a four-factor/two-level DOE experiment was performed to investigate the effect of temperature, layer thickness, infill percentage, and infill speed on the surface roughness of RP prototypes. Samples were grown using the sixteen growth combinations of a four-factor/two-level full factorial study, and the surface roughness data were gathered for each set of factors. After applying DOE statistical analysis to these data, it was determined that layer thickness played the most significant role in prototype surface roughness.Keywords: rapid prototyping, surface roughness, design of experiments, statistical analysis, factors and levels
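The analysis step, estimating which factor of a 2^4 full factorial most affects roughness, can be sketched with coded-unit main effects. The responses below are synthetic, constructed only to illustrate the computation (with layer thickness dominating, as the study found); they are not the study's measurements:

```python
import numpy as np
from itertools import product

# four factors at two coded levels (-1, +1): temperature, layer thickness,
# infill percentage, infill speed -> the 16 runs of a full factorial
runs = np.array(list(product([-1, 1], repeat=4)), dtype=float)

# hypothetical surface-roughness responses (um) for the 16 runs
rng = np.random.default_rng(4)
roughness = (10.0 + 3.0 * runs[:, 1]        # strong layer-thickness effect
             + 0.5 * runs[:, 0]             # weaker temperature effect
             + 0.2 * rng.normal(size=16))   # measurement noise

def main_effects(design, y):
    # main effect of a factor = mean(y at +1 level) - mean(y at -1 level)
    return np.array([y[design[:, j] == 1].mean() - y[design[:, j] == -1].mean()
                     for j in range(design.shape[1])])

effects = main_effects(runs, roughness)
dominant = int(np.argmax(np.abs(effects)))
print(dominant)  # 1: layer thickness has the largest main effect
```

Interaction effects follow the same pattern using products of coded columns, and a normal probability plot of all effects is the usual way to judge which ones are statistically meaningful in an unreplicated factorial.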
Procedia PDF Downloads 262864 Elastic Behaviour of Graphene Nanoplatelets Reinforced Epoxy Resin Composites
Authors: V. K. Srivastava
Abstract:
Graphene has recently attracted increasing attention in nanocomposite applications because it has 200 times the strength of steel, making it the strongest material ever tested. Graphene, as the fundamental two-dimensional (2D) carbon structure with exceptionally high crystal and electronic quality, has emerged as a rapidly rising star in the field of materials science. Graphene, defined as a 2D crystal, is composed of monolayers of carbon atoms arranged in a honeycombed network of six-membered rings, which is of interest to both theoretical and experimental researchers worldwide. The name comes from graphite and alkene. Graphite itself consists of many graphene sheets stacked together by weak van der Waals forces. Its properties are attributed to the monolayer of carbon atoms densely packed into a honeycomb structure. Due to the superior inherent properties of graphene nanoplatelets (GnP) over other nanofillers, GnP particles were added to epoxy resin at varying weight percentages. The DMA results show that the storage modulus, loss modulus and tan δ, defined as the ratio of the imaginary (loss) modulus to the elastic (storage) modulus, versus temperature were affected by the addition of GnP to the epoxy resin. In epoxy resin, damping (tan δ) is usually caused by movement of the molecular chain. The tan δ of the graphene nanoplatelets/epoxy resin composite is much lower than that of the epoxy resin alone. This finding suggests that the addition of graphene nanoplatelets effectively impedes movement of the molecular chain. The decrease in storage modulus can be interpreted as an increasing susceptibility to agglomeration, leading to less energy dissipation in the system under viscoelastic deformation. The results indicate that tan δ increased with increasing temperature. The results also show that the nanohardness increases marginally with the elastic modulus. GnP-filled epoxy resin gives higher values than the neat epoxy resin, because GnP improves the mechanical properties of the epoxy resin. Debonding of GnP is clearly observed in micrographs showing agglomeration of fillers and inhomogeneous distribution. Therefore, the DMA and nanohardness studies indicate that the elastic modulus of the epoxy resin is increased by the addition of GnP fillers.Keywords: agglomeration, elastic modulus, epoxy resin, graphene nanoplatelet, loss modulus, nanohardness, storage modulus
Procedia PDF Downloads 264863 The Potential of Extending the Shelf Life of Meat by Encapsulation with Red Clay
Authors: Onuoha Ogbonnaya Gideon, Ishaq Hafsah Yusuf
Abstract:
Introduction: Meat is a perishable food of high nutritional value. Meat ranks among the most significant, nutritious, and favored food items available to most locals. It is a good source of protein (17-19%, depending on the source) and contains appreciable amounts of fat and moisture. However, it has a very short shelf life, due mainly to its high moisture, fat, and other nutrient contents. Meat spoilage can result from microbial proliferation as well as from enzymes inherent in the meat tissues. Bacterial contamination and permeability to both oxygen and water vapor are major concerns associated with the spoilage and storage of meat. Packaging is fundamental to the preservation and presentation of food. Red clay is a very common substance: a hydrous aluminum phyllosilicate, sometimes with varying amounts of iron, magnesium, alkali metals, alkaline earths, and cations, formed from sedimentary rocks. Furthermore, red clay is an extremely absorbent material that develops plasticity when wet, due to the molecular film of water surrounding the clay particles, but can become hard, impervious, brittle, and non-plastic when dry. In developing countries, refrigeration technologies and most other methods of preserving meat are exorbitantly expensive and can thus be substituted with the less expensive and readily available red clay for the preservation of meat. Methodology: 1000 g of lean meat was diced into cubes of 10 g each. The sample was then divided into four groups labelled raw meat (RMC), raw meat in 10% brine solution (RMB), boiled meat (BMC), and fried meat (FMC). Each was encapsulated in a 2 mm thick layer of red clay and then heated in a muffle furnace at a temperature of 600 °C for 30 min. The samples were kept on a bench top for 30 days, and a storage study was carried out. Results: Our findings showed a decrease during storage in the physicochemical properties of all samples: pH values [RMC (7.05-7.6), RMB (8.46-7.0), BMC (6.0-5.0), FMC (4.08-3.9)]; free fatty acid content [RMC (32.6%-31%), RMB (30.2%-28.6%), BMC (30.5%-27.4%), FMC (25.6%-23.8%)]; total soluble solids [RMC (16.20-15.07), RMB (17.22-16.04), BMC (17.05-15.54), FMC (15.3-14.9)]. Conclusion: These results show that encapsulation with red clay reduced all the values analyzed and thus has the potential to extend the shelf life of stored meat.Keywords: red clay, encapsulating, shelf life, physicochemical properties, lean meat
Procedia PDF Downloads 109