Search results for: technology identification
7233 Modelling and Control of Binary Distillation Column
Authors: Narava Manose
Abstract:
Distillation is a very old separation technology for separating liquid mixtures that can be traced back to the chemists of Alexandria in the first century A.D. By the eleventh century, distillation was being used in Italy to produce alcoholic beverages. At that time, distillation was probably a batch process based on the use of just a single stage, the boiler. The word distillation is derived from the Latin word destillare, which means dripping or trickling down. By at least the sixteenth century, it was known that the extent of separation could be improved by providing multiple vapor-liquid contacts (stages) in a so-called Rectificatorium. The term rectification is derived from the Latin words recte facere, meaning to improve. Modern distillation derives its ability to produce almost pure products from the use of multi-stage contacting, and throughout the twentieth century, multistage distillation was by far the most widely used industrial method for separating liquid mixtures of chemical components; today it remains the most important industrial separation technology. The basic principle behind the technique relies on the different boiling temperatures of the various components of the mixture, allowing separation of the vapor of the most volatile component from the liquid of the other component(s). We developed a simple non-linear model of a binary distillation column using the Skogestad equations in Simulink and computed the steady-state operating point around which to base our analysis and controller design. However, the model contains two integrators because the condenser and reboiler levels are not controlled. One particular way of stabilizing the column is the LV-configuration, where we use D to control M_D and B to control M_B; such a model is given in cola_lv.m, where we have used two P-controllers with gains equal to 10.
Keywords: modelling, distillation column, control, binary distillation
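The cola_lv.m model itself is MATLAB/Simulink; as a rough, self-contained illustration of the LV-configuration idea only (the two integrating levels stabilized by P-controllers with gain 10, as stated above), a minimal Python sketch might look like the following. All flows, holdups, and dynamics here are invented placeholders, not values from Skogestad's model.

```python
# Minimal sketch: stabilizing the two integrating levels of a distillation
# column in the LV-configuration, where distillate flow D controls the
# condenser holdup M_D and bottoms flow B controls the reboiler holdup M_B.
# Numbers are illustrative only, not taken from cola_lv.m.

KC = 10.0          # P-controller gain, as in the abstract
DT = 0.01          # Euler integration step (min)

def simulate(t_end=20.0):
    V, L = 3.2, 2.7               # assumed boilup and reflux (kmol/min)
    F = 1.0                       # assumed feed to the reboiler section
    MD, MB = 0.6, 0.4             # initial holdups (kmol), off setpoint
    MD_set, MB_set = 0.5, 0.5     # level setpoints
    D0, B0 = V - L, L + F - V     # nominal steady-state product flows
    for _ in range(int(t_end / DT)):
        D = D0 + KC * (MD - MD_set)    # P-control of condenser level
        B = B0 + KC * (MB - MB_set)    # P-control of reboiler level
        MD += DT * (V - L - D)         # condenser: integrator of net flow
        MB += DT * (L + F - V - B)     # reboiler: integrator of net flow
    return MD, MB

MD, MB = simulate()
print(round(MD, 3), round(MB, 3))   # both levels settle at their setpoints
```

Without the two P-controllers (KC = 0), both holdups drift as pure integrators; with proportional feedback each level error decays exponentially, which is the stabilization described in the abstract.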
Procedia PDF Downloads 277
7232 DNA Isolation and Identification of Virulence Factors of Escherichia coli and Salmonella Species Isolated from Fresh Vegetables in Phnom Penh
Authors: Heng Sreyly, Phoeurk Chanrith
Abstract:
Fresh-eaten vegetables have become more popular in the Cambodian diet. However, according to the WHO, such vegetables can be one of the main sources of infection if contaminated with pathogenic microorganisms. Outbreaks of foodborne disease related to fresh fruits and vegetables have been increasingly reported and have raised concerns regarding the safety of these products. It is therefore very important to determine the virulence factors of Escherichia coli and Salmonella spp. in fresh vegetables. This study aims to identify virulent strains of Escherichia coli and Salmonella species in fresh vegetables, including cucumber (Cucumis sativus), saw-herb (Eryngium foetidum), and lettuce (Lactuca sativa), from different markets and supermarkets in Phnom Penh. The PCR method was used to detect the virulent strains in each sample. The results indicate that ninety-five of the one hundred and three samples yielded extracted DNA. Moreover, virulent strains of E. coli and Salmonella were found in leafy vegetables (lettuce and saw-herb) much more often than in fruit vegetables (cucumber). This research mainly serves to raise public awareness of washing fresh vegetables more carefully with clean water to reduce adverse health impacts.
Keywords: DNA, virulence factor, Escherichia coli, Salmonella
Procedia PDF Downloads 30
7231 Approach for Updating a Digital Factory Model by Photogrammetry
Authors: R. Hellmuth, F. Wehner
Abstract:
Factory planning has the task of designing products, plants, processes, organization, areas, and the construction of a factory. The requirements for factory planning and the building of a factory have changed in recent years. Regular restructuring is becoming more important in order to maintain the competitiveness of a factory. Restrictions in new areas, shorter life cycles of products and production technology, as well as a VUCA world (Volatility, Uncertainty, Complexity, and Ambiguity), lead to more frequent restructuring measures within a factory. A digital factory model is the planning basis for rebuilding measures and becomes an indispensable tool: short-term rescheduling can no longer be handled by on-site inspections and manual measurements, and tight time schedules require up-to-date planning models. Due to the high adaptation rate of factories described above, a methodology for rescheduling factories on the basis of a modern digital factory twin is conceived and designed for practical application in factory restructuring projects, with a focus on rebuild processes. The aim is to keep the planning basis (the digital factory model) for conversions within a factory up to date during ongoing factory operation, which requires a methodology that reduces the deficits of existing approaches. A method based on photogrammetry is presented, the focus being on developing a simple and cost-effective solution to track the many changes that occur in a factory building during operation. The method is preceded by a hardware and software comparison to identify the most economical and fastest variant.
Keywords: digital factory model, photogrammetry, factory planning, restructuring
Procedia PDF Downloads 117
7230 Adaptive Control of Magnetorheological Damper Using Duffing-Like Model
Authors: Hung-Jiun Chi, Cheng-En Tsai, Jia-Ying Tu
Abstract:
Semi-active control of magnetorheological (MR) dampers for vibration reduction of structural systems has received considerable attention in civil and earthquake engineering, because the effective stiffness and damping properties of MR fluid can change in a very short time in reaction to external loading, requiring only a low level of power. However, the inherent nonlinear dynamics of hysteresis raise challenges in the modeling and control processes. In order to control the MR damper, an innovative Duffing-like equation is proposed to approximate the hysteresis dynamics in a more deterministic and systematic manner than has previously been possible. Then, the model-reference adaptive control technique based on the Duffing-like model and the Lyapunov method is discussed. Parameter identification work with experimental data is presented to show the effectiveness of the Duffing-like model. In addition, simulation results show that the resulting adaptive gains enable the MR damper force to track the desired response of the reference model satisfactorily, verifying the effectiveness of the proposed modeling and control techniques.
Keywords: magnetorheological damper, Duffing equation, model-reference adaptive control, Lyapunov function, hysteresis
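The paper's specific Duffing-like MR-damper formulation is not reproduced here; purely to illustrate the class of model being fitted, a generic damped, driven Duffing oscillator (linear stiffness plus a cubic term) can be integrated in a few lines. Parameters below are arbitrary illustrative values.

```python
# Sketch of a generic Duffing-type oscillator,
#   x'' + 2*zeta*w0*x' + w0^2*x + beta*x^3 = gamma*cos(w*t),
# integrated with a fixed-step semi-implicit Euler scheme. This is only an
# illustration of the model class; it is not the paper's identified model.
import math

def duffing_response(zeta=0.1, w0=1.0, beta=1.0, gamma=0.5, w=1.2,
                     dt=0.001, t_end=50.0):
    x, v = 0.0, 0.0
    for i in range(int(t_end / dt)):
        t = i * dt
        a = gamma * math.cos(w * t) - 2*zeta*w0*v - w0**2 * x - beta * x**3
        v += dt * a          # update velocity first (semi-implicit Euler)
        x += dt * v          # then position, which keeps the orbit bounded
    return x, v

x, v = duffing_response()
print(abs(x) < 2.0 and abs(v) < 2.0)   # damped driven response stays bounded
```

In the adaptive-control setting described above, the cubic stiffness term is what lets a smooth deterministic equation approximate the hardening behavior that hysteresis models otherwise capture with non-smooth internal states.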
Procedia PDF Downloads 370
7229 Turkey Disaster Risk Management System Project (TAFRISK)
Authors: Ahmet Parlak, Celalettin Bilgen
Abstract:
In order to create an effective early warning system, the risks must be identified, and risk modeling of risk scenarios must be prepared and carried out, taking into account the shortcomings of the old disaster scenarios, so as to improve the system. In this light, the importance of risk modeling in creating an effective early warning system is understood. Within the scope of the TAFRISK project, a trend analysis report on risk modeling was developed, and a demonstration was conducted for risk modeling of floods and mass movements. For risk modeling R&D, studies have been conducted to determine the information to be gathered and its sources, to develop algorithms, and to adapt current algorithms to Turkey’s conditions for determining the risk score in areas of high disaster risk. For each type of disaster, the Disaster Deficit Index (DDI), Local Disaster Index (LDI), Prevalent Vulnerability Index (PVI), and Risk Management Index (RMI) have been developed as disaster indices, taking danger, sensitivity, fragility, vulnerability, and physical and economic damage into account at the appropriate scale for the respective type.
Keywords: disaster, hazard, risk modeling, sensor
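A composite risk score of the kind described above is typically a weighted sum of normalized indicators. As a hedged sketch only, in the spirit of the DDI/LDI/PVI/RMI family the project adapts, the indicator names, weights, and values below are invented for illustration and are not from the TAFRISK methodology.

```python
# Hypothetical composite disaster-risk score from normalized indicators.
# Names, weights, and values are illustrative placeholders.

def composite_index(indicators, weights):
    # indicators: name -> value already normalized to [0, 1]
    # weights:    name -> relative importance, summing to 1
    return sum(weights[k] * indicators[k] for k in weights)

area = {"hazard": 0.8, "exposure": 0.6, "fragility": 0.7, "resilience_gap": 0.5}
w = {"hazard": 0.4, "exposure": 0.2, "fragility": 0.2, "resilience_gap": 0.2}
score = composite_index(area, w)
print(round(score, 2))   # → 0.68; a higher score flags a higher-risk area
```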
Procedia PDF Downloads 428
7228 Production and Characterisation of Lipase from a Novel Streptomyces sp. - Its Molecular Identification
Authors: C. Asha Poorna, N. S. Pradeep
Abstract:
The biological function of lipase is to catalyze the hydrolysis of triacylglycerols to give free fatty acids, diacylglycerols, monoacylglycerols, and glycerol. Lipases constitute the most important group of biocatalysts for biotechnological applications. The aim of the present study was to identify the lipolytic activity of a Streptomyces sp. from a soil sample collected from the sacred groves of southern Kerala. The culture conditions of the isolate were optimised, and the enzyme was purified and characterised. The isolate was observed to have high lipolytic activity and was identified as a Streptomyces strain. Purification was attempted with acetone precipitation. The purified enzyme was observed to have an apparent molecular mass of ~60 kDa by sodium dodecyl sulphate polyacrylamide gel electrophoresis (SDS-PAGE). The enzyme showed maximum activity at 60 °C and pH 8. The lipase showed tolerance towards different organic solvents, such as ethanol and methanol, that are commonly used in transesterification reactions to displace alcohol from triglycerides contained in renewable resources to yield fatty acid alkyl esters, known as biodiesel.
Keywords: lipase, Streptomyces, biodiesel, fatty acid, transesterification
Procedia PDF Downloads 327
7227 Face Recognition Using Eigen Faces Algorithm
Authors: Shweta Pinjarkar, Shrutika Yawale, Mayuri Patil, Reshma Adagale
Abstract:
Face recognition is a technique that can be applied to a wide variety of problems, such as image and film processing, human-computer interaction, and criminal identification. This has motivated researchers to develop computational models to identify faces that are easy and simple to implement. This work demonstrates a face recognition system on an Android device using eigenfaces. The system can be used as the base for the development of recognition of human identity. Test images and training images are taken directly with the camera of the Android device, and the test results showed that the system produces high accuracy. The goal is to implement a model for a particular face and distinguish it from a large number of stored faces. The face recognition system detects the faces in pictures taken by a web camera or digital camera, and these images are then checked against the training image dataset based on descriptive features. Further, the algorithm can be extended to recognize the facial expressions of a person. Recognition can be carried out under widely varying conditions, such as frontal view and scaled frontal view, including subjects with spectacles. The algorithm models real-time varying lighting conditions. The implemented system is able to perform real-time face detection and face recognition, and can give feedback by displaying a window with the subject's information from the database and sending an e-mail notification to interested institutions using the Android application.
Keywords: face detection, face recognition, eigenfaces, algorithm
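The core of the eigenfaces method is PCA over flattened training images followed by nearest-neighbor matching in the reduced subspace. A minimal NumPy sketch of that pipeline follows; the synthetic 8x8 "faces" and the choice of 5 components are placeholders standing in for the camera images and tuning of the actual Android system.

```python
# Minimal eigenfaces sketch: PCA on flattened training images via SVD,
# then nearest-neighbor matching of a test image in the eigenface subspace.
# Synthetic random 8x8 "faces" stand in for real camera images.
import numpy as np

rng = np.random.default_rng(0)
train = rng.normal(size=(10, 64))            # 10 training images, 8x8 flattened
labels = list(range(10))                     # one identity per training image

mean = train.mean(axis=0)
A = train - mean                             # center the data
# Rows of Vt are the principal directions, i.e. the eigenfaces.
U, S, Vt = np.linalg.svd(A, full_matrices=False)
eigenfaces = Vt[:5]                          # keep the top 5 components

def project(img):
    return eigenfaces @ (img - mean)         # weights in eigenface space

train_w = np.array([project(img) for img in train])

def recognize(img):
    d = np.linalg.norm(train_w - project(img), axis=1)
    return labels[int(np.argmin(d))]         # nearest stored face wins

test = train[3] + rng.normal(scale=0.05, size=64)   # noisy copy of face 3
print(recognize(test))   # → 3
```

Storing only the low-dimensional weight vectors (here 5 numbers per face instead of 64 pixels) is what makes the comparison against a large database of stored faces cheap enough for real-time use on a phone.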
Procedia PDF Downloads 361
7226 Investigating the Determinants and Growth of Financial Technology Depth of Penetration among the Heterogeneous African Economies
Authors: Tochukwu Timothy Okoli, Devi Datt Tewari
Abstract:
The high rate of Fintech adoption has not translated into greater financial inclusion and development in Africa. This problem is attributed to poor Fintech diversification and usefulness on the continent, a concept referred to in this study as the Fintech depth of penetration. The study, therefore, assessed its determinants and growth process in a panel of three emerging, twenty-four frontier, and five fragile African economies, disaggregated with dummies over the period 2004-2018 to allow for heterogeneity between groups. The system Generalized Method of Moments (GMM) technique reveals that the average depth of mobile banking and automated teller machine (ATM) use is a dynamic heterogeneous process. Moreover, users' previous experience/compatibility, trialability/income, and financial development were the major factors that raise its usefulness, whereas perceived risk, financial openness, and the inflation rate significantly limit it. The growth rates of mobile banking, ATM, and internet banking in 2018 were, on average, 41.82, 0.4, and 20.8 per cent greater, respectively, than their average rates in 2004. These greater averages after the 2009 financial crisis suggest that countries resort to Fintech as a risk-mitigating tool. This study, therefore, recommends greater Fintech diversification through improved literacy, institutional development, financial liberalization, and continuous innovation.
Keywords: depth of fintech, emerging Africa, financial technology, internet banking, mobile banking
Procedia PDF Downloads 130
7225 Development of Fake News Model Using Machine Learning through Natural Language Processing
Authors: Sajjad Ahmed, Knut Hinkelmann, Flavio Corradini
Abstract:
Fake news detection research is still at an early stage, as this is a relatively new phenomenon in the interest raised by society. Machine learning helps to solve complex problems and to build AI systems nowadays, especially in cases where we have tacit knowledge or knowledge that is not known. For the identification of fake news, we applied three machine learning classifiers: Passive Aggressive, Naïve Bayes, and Support Vector Machine. Simple classification is not completely correct in fake news detection because classification methods are not specialized for fake news. With the integration of machine learning and text-based processing, we can detect fake news and build classifiers that can classify the news data. Text classification mainly focuses on extracting various features of text and then incorporating those features into classification. The big challenge in this area is the lack of an efficient way to differentiate between fake and non-fake news due to the unavailability of corpora. We applied the three machine learning classifiers to two publicly available datasets. Experimental analysis based on the existing datasets indicates very encouraging and improved performance.
Keywords: fake news detection, natural language processing, machine learning, classification techniques
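The three classifiers named above are all available in scikit-learn and are commonly run behind a TF-IDF text-feature pipeline. As a minimal sketch, the toy sentences below are placeholders, not the publicly available datasets used in the study, and the feature setup is an assumption rather than the paper's exact configuration.

```python
# The three classifiers named in the abstract, each behind a TF-IDF pipeline.
# The four toy sentences are illustrative placeholders for real news corpora.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import PassiveAggressiveClassifier
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline
from sklearn.svm import LinearSVC

texts = ["shocking miracle cure doctors hate",
         "aliens secretly run the government",
         "council approves new school budget",
         "central bank holds interest rates steady"]
y = ["fake", "fake", "real", "real"]

for clf in (PassiveAggressiveClassifier(max_iter=1000),
            MultinomialNB(),
            LinearSVC()):
    model = make_pipeline(TfidfVectorizer(), clf)   # text -> TF-IDF -> classifier
    model.fit(texts, y)
    print(type(clf).__name__, model.predict(["miracle cure for everything"])[0])
```

On real corpora the same loop allows a like-for-like comparison of the three classifiers on identical features, which is the kind of experimental analysis the abstract reports.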
Procedia PDF Downloads 167
7224 The Making of a Community: Perception versus Reality of Neighborhood Resources
Authors: Kirstie Smith
Abstract:
This paper elucidates the value of neighborhood perception as it contributes to the advancement of well-being for individuals and families within a neighborhood. Through in-depth interviews with city residents, this paper examines the degree to which key stakeholders (residents) evaluate their neighborhood and its perceived resources, and identify, access, and utilize local assets existing in the community. Additionally, the research objectives included conducting a community inventory that qualified the community assets and resources of lower-income neighborhoods of a medium-sized industrial city. Analysis of the community's assets was compared with the interview results to allow for a better understanding of the community's condition. Community mapping revealed that the key informants' reflections on assets were somewhat validated: in each neighborhood, there were more assets mapped than reported in the interviews. Another chief finding drawn from this study was the identification of key development partners and social networks that offer the potential to facilitate locally driven community development. Overall, the participants provided invaluable local knowledge of the perception of neighborhood assets, the well-being of residents, the condition of the community, and suggestions for responding to the challenges of the entire community in order to mobilize the present assets and networks.
Keywords: community mapping, family, resource allocation, social networks
Procedia PDF Downloads 353
7223 Interactive Garments: Flexible Technologies for Textile Integration
Authors: Anupam Bhatia
Abstract:
Upon reviewing the literature and the pragmatic work done in the field of e-textiles, it is observed that applications of wearable technologies have found steady growth in the military, medical, industrial, and sports fields, whereas fashion is at a loss as to how to treat this technology and bring it to market. The purpose of this paper is to understand the practical issues of integrating electronics in garments: cutting patterns for mass production, maintaining the basic properties of textiles, and the daily maintenance of garments, all of which hinder the wide adoption of interactive fabric technology within fashion and leisure wear. To understand these practical hindrances, an experimental and laboratory approach is taken. 'Techno Meets Fashion' has been an interactive fashion project in which sensor technologies have been embedded in textiles, resulting in a set of ensembles such as light-emitting garments, sound-sensing garments, proximity garments, and shape-memory garments. Smart textiles, especially in the form of textile interfaces, are drastically underused in fashion and other lifestyle product design. Clothing and some other textile products must be washable, which subjects the interactive elements to water and chemical immersion, physical stress, and extreme temperatures; the current state of the art tends to be too fragile for this treatment. The process of mass-producing traditional textiles also becomes difficult for interactive textiles, since cutting patterns from larger rolls of cloth and sewing them together to make garments breaks and reforms electronic connections in an uncontrolled manner. Because of this, interactive fabric elements are integrated by hand into textiles produced by standard methods. The Arduino has surely made embedding electronics into textiles much easier than before; even then, electronics are not integral to daily-wear garments.
Soft and flexible interfaces of MEMS (micro-sensors and micro-actuators) can be an option to make this possible by blending electronics within e-textiles in a way that is seamless and still retains the functions of the circuits as well as the garment. Smart clothes, which simultaneously offer a challenging design and utility value, can only be mass-produced if the demands of the body are taken care of, i.e., protection, anthropometry, the ergonomics of human movement, and thermo-physiological regulation.
Keywords: ambient intelligence, proximity sensors, shape memory materials, sound sensing garments, wearable technology
Procedia PDF Downloads 393
7222 A Methodological Concept towards a Framework Development for Social Software Adoption in Higher Education System
Authors: Kenneth N. Ohei, Roelien Brink
Abstract:
For decades, teaching and learning processes have centered on the traditional approach (Web 1.0), which promoted teacher-directed pedagogical practices. Currently, there is a realization that the traditional approach is not adequate to effectively address and improve all student learning outcomes. The incorporation of social software and Information and Communication Technology (ICT) tools in universities may serve as a complement in support of educational goals, offering students affordable access to educational choices and learning platforms. Nevertheless, educators' inability to incorporate these instructional ICT tools into their teaching and learning practices remains a challenge, signifying that educators still lack the ICT skills required to administer lectures and bridge learning gaps. This study probes a methodological concept with the aim of developing a framework for the adoption of social software in the Higher Educational System (HES) that can help facilitate business processes and build social presence among students. A mixed-methods approach will be appropriate for developing the comprehensive framework needed in the HES. Once the research has been conducted, the adoption of social software will be based on the developed comprehensive framework, which is expected to have a positive impact on education and the approach to delivery, improve learning experience and engagement, and, finally, increase educational opportunities and ease of access to educational content.
Keywords: blended and integrated learning, learning experience and engagement, higher educational system, HES, information and communication technology, ICT, social presence, Web 1.0, Web 2.0, Web 3.0
Procedia PDF Downloads 157
7221 CRISPR/Cas9 Based Gene Stacking in Plants for Virus Resistance Using Site-Specific Recombinases
Authors: Sabin Aslam, Sultan Habibullah Khan, James G. Thomson, Abhaya M. Dandekar
Abstract:
Losses due to viral diseases pose a serious threat to crop production. The quick breakdown of resistance to viruses like Cotton Leaf Curl Virus (CLCuV) demands the application of a proficient technology to engineer durable resistance. Gene stacking has recently emerged as a potential approach for integrating multiple genes into crop plants. In the present study, recombinase technology has been used for site-specific gene stacking. A target vector (pG-Rec) was designed for engineering a predetermined specific site in the plant genome at which genes can be stacked repeatedly. Using Agrobacterium-mediated transformation, pG-Rec was transformed into Coker-312 along with Nicotiana tabacum L. cv. Xanthi and Nicotiana benthamiana. Transgene analysis of the target lines was conducted through junction PCR. The transgene-positive target lines were used for further transformations to site-specifically stack two genes of interest using the Bxb1 and PhiC31 recombinases. In the first instance, Cas9 driven by multiplex gRNAs (targeting the Rep gene of CLCuV) was site-specifically integrated into the target lines, as determined by junction PCR and real-time PCR. The resulting plants were subsequently used to stack the second gene of interest (the AVP3 gene from Arabidopsis, for enhancing cotton plant growth). The addition of genes is achieved simultaneously with the removal of marker genes, which are recycled in the next round of gene stacking. Consequently, transgenic marker-free plants were produced with two genes stacked at the specific site. These transgenic plants can be potential germplasm for introducing resistance against various strains of cotton leaf curl virus (CLCuV) and abiotic stresses. The results of the research demonstrate gene stacking in crop plants, a technology that can be used to introduce multiple genes sequentially at predefined genomic sites.
The current climate change scenario highlights the use of such technologies so that gigantic environmental issues can be tackled through several traits in a single step. After virus resistance in the resulting plants has been evaluated, these lines can serve as a starting point for stacking further genes in cotton for other traits, as well as for molecular breeding with elite cotton lines.
Keywords: cotton, CRISPR/Cas9, gene stacking, genome editing, recombinases
Procedia PDF Downloads 155
7220 Transdisciplinary Pedagogy: An Arts-Integrated Approach to Promote Authentic Science, Technology, Engineering, Arts, and Mathematics Education in Initial Teacher Education
Authors: Anne Marie Morrin
Abstract:
This paper will focus on the design, delivery, and assessment of a transdisciplinary STEAM (Science, Technology, Engineering, Arts, and Mathematics) education initiative in a college of education in Ireland. The project explores a transdisciplinary approach to supporting STEAM education where the concepts, methodologies, and assessments employed derive from visual art sessions within initial teacher education. The research will demonstrate that the STEAM education approach is effective when visual art concepts and methods are placed at the core of the teaching and learning experience. Within this study, emphasis is placed on authentic collaboration and transdisciplinary pedagogical approaches to the STEAM subjects. The partners included a combination of teaching expertise in STEM and visual arts education, artists, in-service and pre-service teachers, and children. The inclusion of all the stakeholders mentioned moves towards a more authentic approach in which transdisciplinary practice is at the core of teaching and learning. Qualitative data was collected using a combination of questionnaires (focused and open-ended questions) and focus groups. In addition, data was collected through video diaries in which students reflected on their visual journals and transdisciplinary practice, giving rich insight into participants' experiences and opinions of their learning. It was found that an effective program of STEAM education integration was informed by co-teaching (continuous professional development), which involved a commitment to adaptable and flexible approaches to teaching, learning, and assessment, as well as continuous reflection-in-action by all participants. The delivery of a transdisciplinary model of STEAM education was devised to reconceptualise how individual subject areas can develop essential skills and tackle critical issues (such as self-care and climate change) through data visualisation and technology.
The success of the project can be attributed to the collaboration, which was inclusive and flexible, and to the willingness of the various stakeholders to be involved in the design and implementation of the project from conception to completion. The case study approach taken is particularistic (focusing on the STEAM-ED project), descriptive (providing in-depth descriptions from varied and multiple perspectives), and heuristic (interpreting the participants' experiences and the meaning they attributed to them).
Keywords: collaboration, transdisciplinary, STEAM, visual arts education
Procedia PDF Downloads 49
7219 HPLC-UV Screening of Legal (Caffeine and Yohimbine) and Illegal (Ephedrine and Sibutramine) Substances from Weight Loss Dietary Supplements for Athletes
Authors: Amelia Tero-Vescan, Camil-Eugen Vari, Laura Ciulea, Cristina Filip, Silvia Imre
Abstract:
An HPLC-UV method for the identification of ephedrine (EPH), sibutramine (SB), yohimbine (Y), and caffeine (CF) was developed. Separation was performed on a Kromasil 100-RP8, 150 mm x 4.6 mm, 5 µm column equipped with a Kromasil RP8 precolumn. The mobile phase was a gradient of 80-35% sodium dihydrogen phosphate (pH = 5, adjusted with NH4OH) and acetonitrile over a 15-minute analysis time. Based on the responses of 113 athletes about the dietary supplements (DS) consumed for "fat burning" and weight loss, which have a legal status in Romania, 28 supplements were selected and investigated for their content of CF and Y (legal substances) and SB and EPH (substances prohibited in DS). The method allows quantitative determination of the four substances in a short analysis time and at minimum cost. The presence of SB and EPH in the analyzed DS was not detected, while the content of CF and Y, considering the dosage recommended by the manufacturer, does not affect the health of consumers. DS labeling (plant extracts with CF and Y content) allows manufacturers to avoid declaring correct and exact amounts per pharmaceutical form (pure CF or equivalent, and Y, respectively).
Keywords: dietary supplements, sibutramine, ephedrine, yohimbine, caffeine, HPLC
Procedia PDF Downloads 442
7218 Building Data Infrastructure for Public Use and Informed Decision Making in Developing Countries-Nigeria
Authors: Busayo Fashoto, Abdulhakeem Shaibu, Justice Agbadu, Samuel Aiyeoribe
Abstract:
Data has gone from just rows and columns to being an infrastructure itself. Traditionally, data infrastructure has been managed by individuals in different industries and saved on personal work tools such as the laptop. This hinders data sharing and Sustainable Development Goal (SDG) 9 for infrastructure sustainability across all countries and regions. However, there has been constant demand for data across different agencies and ministries by investors and decision-makers. The rapid development and adoption of open-source technologies that promote the collection and processing of data in new ways and in ever-increasing volumes are creating new data infrastructure in sectors such as land and health, among others. This paper examines the process of developing data infrastructure and, by extension, a data portal to provide baseline data for sustainable development and decision making in Nigeria. The paper employs the FAIR principles (Findable, Accessible, Interoperable, and Reusable) of data management, using open-source technology tools to develop data portals for public use. eHealth Africa, an organization that uses technology to drive public health interventions in Nigeria, developed a data portal that is a typical data infrastructure serving as a repository for various datasets on administrative boundaries, points of interest, settlements, social infrastructure, amenities, and others. This portal makes it possible for users to access datasets of interest at any point in time at no cost. The skeletal infrastructure of this data portal encompasses open-source technologies such as a Postgres database, GeoServer, GeoNetwork, and CKAN. These tools made the infrastructure sustainable, thus promoting the achievement of SDG 9 (Industry, Innovation, and Infrastructure). As of 6th August 2021, 8192 user accounts had been created, 2262 datasets had been downloaded, and 817 maps had been created on the platform.
This paper shows the use of rapid development and adoption of technologies that facilitate data collection, processing, and publishing in new ways and in ever-increasing volumes. In addition, the paper is explicit about new data infrastructure in sectors such as health, social amenities, and agriculture. Furthermore, it reveals the importance of cross-sectional data infrastructure for planning and decision making, which in turn can form a central data repository for sustainable development across developing countries.
Keywords: data portal, data infrastructure, open source, sustainability
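On a CKAN-based portal like the one described above, datasets are published as metadata records through CKAN's Action API. As a hedged sketch of what registering one dataset might look like, the portal URL, API key, dataset name, and field values below are placeholders, and the network call is left commented out so the sketch stays self-contained.

```python
# Sketch: publishing a dataset record to a CKAN portal via the Action API
# (package_create). URL, key, and metadata values are placeholders.
import json

CKAN_URL = "https://data.example.org"     # placeholder portal address
API_KEY = "xxxx"                          # placeholder credential

def build_package(name, title, notes, tags):
    # Minimal metadata following CKAN's package schema; FAIR findability
    # depends on consistent, rich metadata of exactly this kind.
    return {
        "name": name,                      # URL-safe identifier
        "title": title,
        "notes": notes,                    # free-text description
        "tags": [{"name": t} for t in tags],
    }

pkg = build_package(
    name="settlements-example-2021",
    title="Settlement Points, Example State (2021)",
    notes="Settlement locations digitized from satellite imagery.",
    tags=["settlements", "nigeria", "health"],
)
payload = json.dumps(pkg)
# import requests
# requests.post(f"{CKAN_URL}/api/3/action/package_create", data=payload,
#               headers={"Authorization": API_KEY,
#                        "Content-Type": "application/json"})
print(sorted(pkg))   # → ['name', 'notes', 'tags', 'title']
```

Because CKAN exposes the same API for reading, downstream users can script `package_search` and resource downloads instead of clicking through the portal, which is what makes the repository machine-accessible in the FAIR sense.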
Procedia PDF Downloads 97
7217 Identification of Location Parameters for Different User Types of the Inner-City Building Stock: An Austrian Example
Authors: Bernhard Bauer, Thomas Meixner, Amir Dini, Detlef Heck
Abstract:
The inner-city building stock is characterized by building types from different decades and centuries and by different types of historical construction. Depending on the natural growth of a city, these types are often located in downtown areas and the surrounding suburbs. Since the population is aging and social requirements are diversifying with the so-called 'Silver Society', city quarters have to be viewed differently. An area that is very attractive for young students because of its busy nightlife might not be suitable for the older society. To identify 'Location Types A, B, C' for different user groups, qualitative interviews with 24 citizens of the city of Graz (Austria) have been carried out in order to identify the most important values that make a location or city quarter 'A', 'B', or 'C'. These findings have then been put into a software tool for predicting the locations most suitable for certain user groups. Conversely, investors or owners of buildings can use the tool to determine the most suitable user group for the location of their building or construction project in order to adapt the project or building stock to the requirements of the users.Keywords: building stock, location parameters, inner city population, built environment
Procedia PDF Downloads 313
7216 The Need for a Consistent Regulatory Framework for CRISPR Gene-Editing in the European Union
Authors: Andrew Thayer, Courtney Rondeau, Paraskevi Papadopoulou
Abstract:
The Clustered Regularly Interspaced Short Palindromic Repeats (CRISPR) gene-editing technologies have generated considerable discussion about the applications and ethics of their use. However, no consistent guidelines for using CRISPR technologies have been developed, nor has common legislation related to gene editing been passed, especially as it is connected to genetically modified organisms (GMOs) in the European Union. The recent announcement that the first babies with CRISPR-edited genes were born, along with new studies exploring CRISPR’s applications in treating thalassemia, sickle-cell anemia, cancer, and certain forms of blindness, has demonstrated that the technology is developing faster than the policies needed to control it. Therefore, a reasonable and coherent regulatory framework for the use of CRISPR in human somatic and germline cells is necessary to ensure the ethical use of the technology in future years. The European Union is a unique region of interconnected countries without a standard set of regulations or legislation for CRISPR gene-editing. We posit that the EU would serve as a suitable model for comparing the legislation of its affiliated countries in order to understand the practicality and effectiveness of adopting majority-approved practices. Additionally, we present a proposed set of guidelines which could serve as a basis for developing a consistent regulatory framework for the EU countries to implement and also act as a good example for other countries to follow. Finally, an additional, multidimensional framework of smart solutions is proposed through which all stakeholders are engaged to become better-informed citizens.Keywords: CRISPR, ethics, regulatory framework, European legislation
Procedia PDF Downloads 135
7215 The Analysis of Internet and Social Media Behaviors of the Students in Vocational High School
Authors: Mehmet Balci, Sakir Tasdemir, Mustafa Altin, Ozlem Bozok
Abstract:
Our globalizing world has become almost a small village, and everyone can access any information at any time. Everyone lets each other know who does what in which place. We can learn which social events occur in which place in the world. From the perspective of education, the course notes that a lecturer uses in lessons at a university in any state of America can be examined by a student studying in a city in Africa or the Far East. This dizzying communication has happened thanks to fast developments in computer technologies and, in parallel with this, internet technology. While these developments were taking place in the world, Turkey, which has a very large young population and a rapidly evolving electronic communications infrastructure, has been affected by this situation. Research has shown that almost all young people in Turkey have an account in a social network. In particular, the spread of mobile devices has caused data traffic in social networks to increase. In this study, students in different age groups at the Selcuk University Vocational School of Technical Sciences, Department of Computer Technology, were surveyed. The students' opinions about the use of the internet and social media were collected. Their internet and social media skills, purposes, frequency of use, access facilities and tools, social life, and the effects on vocational education were explored. Both positive and negative effects of internet and social media use on the students of this department were identified by evaluating the findings from various aspects. Relations and differences were identified statistically.Keywords: computer technologies, internet use, social network, higher vocational school
Procedia PDF Downloads 542
7214 Identification and Quantification of Phenolic Compounds In Cassia tora Collected from Three Different Locations Using Ultra High Performance Liquid Chromatography – Electro Spray Ionization – Mass Spectrometry (UHPLC-ESI-MS-MS)
Authors: Shipra Shukla, Gaurav Chaudhary, S. K. Tewari, Mahesh Pal, D. K. Upreti
Abstract:
Cassia tora L. is widely distributed in tropical Asian countries and is commonly known as sickle pod. Various parts of the plant are reported for their medicinal value due to the presence of anthraquinones, phenolic compounds, emodin, β-sitosterol, and chrysophanol. Therefore, a sensitive analytical procedure using UHPLC-ESI-MS/MS was developed and validated for the simultaneous quantification of five phenolic compounds in leaf, stem, and root extracts of Cassia tora. Rapid chromatographic separation of the compounds was achieved on an Acquity UHPLC BEH C18 (50 mm × 2.1 mm i.d., 1.7 µm) column in 2.5 min. Quantification was carried out using negative electrospray ionization in multiple-reaction monitoring mode. The method was validated as per ICH guidelines and showed good linearity (r2 ≥ 0.9985) over the concentration range of 0.5-200 ng/mL. The intra- and inter-day precisions and accuracy were within RSDs ≤ 1.93% and ≤ 1.90%, respectively. The developed method was applied to investigate the variation of the five phenolic compounds in the three geographical collections. Results indicated significant variation among the analyzed samples collected from different locations in India.Keywords: Cassia tora, phenolic compounds, quantification, UHPLC-ESI-MS/MS
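The quantification workflow above (a calibration curve over 0.5-200 ng/mL, back-calculation of sample concentrations from peak areas) can be sketched as an ordinary least-squares calibration fit. Only the concentration range comes from the abstract; all peak-area values and the unknown-sample area below are invented for illustration:

```python
import numpy as np

# Hypothetical calibration points for one phenolic compound.
# The 0.5-200 ng/mL range is from the abstract; the areas are made up.
conc = np.array([0.5, 1.0, 5.0, 10.0, 50.0, 100.0, 200.0])          # ng/mL
area = np.array([110.0, 220.0, 1100.0, 2200.0, 11000.0, 22000.0, 44000.0])

# Least-squares calibration line: area = slope * conc + intercept
slope, intercept = np.polyfit(conc, area, 1)

# Coefficient of determination, the linearity metric quoted (r2 >= 0.9985)
pred = slope * conc + intercept
r2 = 1.0 - np.sum((area - pred) ** 2) / np.sum((area - area.mean()) ** 2)

# Back-calculate the concentration of an unknown sample from its peak area
unknown_area = 5500.0
unknown_conc = (unknown_area - intercept) / slope

print(round(float(slope), 2), round(float(r2), 4), round(float(unknown_conc), 1))
```

Because the mock areas lie exactly on a line, the fit returns a slope of 220 and r2 of 1.0; real MRM data would scatter slightly but should stay above the 0.9985 linearity threshold the authors report.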
Procedia PDF Downloads 269
7213 Identification of Failures Occurring on a System on Chip Exposed to a Neutron Beam for Safety Applications
Authors: S. Thomet, S. De-Paoli, F. Ghaffari, J. M. Daveau, P. Roche, O. Romain
Abstract:
In this paper, we present a hardware module dedicated to understanding the fail reason of a System on Chip (SoC) exposed to a particle beam. The impact of Single-Event Effects (SEE) on processor-based SoCs is a concern that has grown in the past decade, particularly for terrestrial applications with increasing automotive safety requirements, as well as in the consumer and industrial domains. The SEE created by the impact of a particle on an SoC may have consequences that can lead to instability or crashes. Specific hardening techniques for hardware and software have been developed to make such systems more reliable. The SoC is then qualified using cosmic-ray Accelerated Soft-Error Rate (ASER) testing to ensure the Soft-Error Rate (SER) remains within mission profiles. Understanding where errors occur is another challenge because of the complexity of the operations performed in an SoC. Common techniques to monitor an SoC running under a beam are based on non-intrusive debug, consisting of recording the program counter and doing some consistency checking on the fly. To detect and understand SEE, we have developed a module embedded within the SoC that provides support for recording probes, hardware watchpoints, and a memory-mapped register bank dedicated to software usage. To identify CPU failure modes and the most important resources to probe, we have carried out a fault-injection campaign on the RTL model of the SoC. Probes are placed on generic CPU registers and bus accesses. They highlight the propagation of errors and allow identification of the failure modes. Typical resulting errors are bit-flips in resources creating bad addresses, illegal instructions, longer-than-expected loops, or incorrect bus accesses. Although our module is processor agnostic, it has been interfaced to a RISC-V by probing some of the processor registers. Probes are then recorded in a ring buffer.
Associated hardware watchpoints allow some control, such as starting or stopping event recording or halting the processor. Finally, the module also provides a bank of registers where the firmware running on the SoC can log information. A typical usage is recording operating-system context switches. The module is connected to a dedicated debug bus and is interfaced to a remote controller via a debugger link. Thus, a remote controller can interact with the monitoring module without any intrusiveness on the SoC. Moreover, in case of CPU unresponsiveness or a system-bus stall, the recorded information can still be recovered, providing the fail reason. A preliminary version of the module has been integrated into a test chip currently being manufactured at ST in 28-nm FDSOI technology. The module has been triplicated to provide reliable information on the SoC behavior. As the primary application domain is automotive and safety, the efficiency of the module will be evaluated by exposing the test chip to a fast-neutron beam by the end of the year. In the meantime, it will be tested with alpha particles and electromagnetic fault injection (EMFI). We will report in the paper on fault-injection results as well as irradiation results.Keywords: fault injection, SoC fail reason, SoC soft error rate, terrestrial application
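The recording scheme described above (probe samples streamed into a ring buffer, watchpoints that can stop recording, post-mortem recovery of the last samples) can be modelled in a few lines. This is a hypothetical software sketch of the behavior, not the authors' RTL; the probe name, depth, and values are made up:

```python
from collections import deque

class ProbeRingBuffer:
    """Minimal software model of the monitoring module's ring buffer:
    probe samples are recorded continuously and the oldest entries are
    overwritten, so the most recent history survives a CPU hang."""

    def __init__(self, depth):
        self.buf = deque(maxlen=depth)   # fixed depth, oldest entries dropped
        self.recording = True            # watchpoints can start/stop recording

    def record(self, cycle, probe, value):
        if self.recording:
            self.buf.append((cycle, probe, value))

    def halt(self):                      # e.g. a hardware watchpoint fired
        self.recording = False

    def dump(self):                      # what the remote debugger recovers
        return list(self.buf)

# Simulate probing a program counter; depth 4 keeps only the last 4 samples.
rb = ProbeRingBuffer(depth=4)
for cycle in range(8):
    rb.record(cycle, "pc", 0x1000 + 4 * cycle)
rb.halt()
rb.record(8, "pc", 0xDEAD)               # ignored: recording is stopped
print(rb.dump()[-1])                      # most recent surviving sample
```

The point of the design, mirrored here, is that after a crash only the tail of the probe history matters, and it can be read out over the debug link without the CPU's help.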
Procedia PDF Downloads 229
7212 Techno-Apocalypse in Christian End-Time Literature
Authors: Sean O'Callaghan
Abstract:
Around 2011/2012, a whole new genre of Christian religious writing began to emerge, focused on the role of advanced technologies, particularly the GRIN technologies (Genetics, Robotics, Information Technology and Nanotechnology), in bringing about a techno-apocalypse, leading to catastrophic events which would usher in the end of the world. This genre, at first niche, has now begun to grow in significance in many quarters of the more fundamentalist and biblically literalist branches of evangelicalism. It approaches science and technology with extreme skepticism. It accuses transhumanists of being in league with satanic powers and a satanic agenda and contextualizes transhumanist scientific progress in terms of its service to what it believes to be a soon-to-come Antichrist figure. The genre has moved beyond literature, and videos about its message can be found on YouTube and other forums, where many of the presentations get well over a quarter of a million views. This paper will examine the genre and its genesis, referring to the key figures involved in spreading the anti-intellectualist and anti-scientific message. It will demonstrate how this genre of writing is similar in many respects to other forms of apocalyptic writing which have emerged in the twentieth and twenty-first centuries, all in response to scientific and political events interpreted in the light of biblical prophecy. It will also set the genre in the context of a contemporary preoccupation with conspiracy theory. The conclusions of the research conducted in this field by the author are that the genre does a grave disservice to both the scientific and Christian audiences it targets, by misrepresenting scientific advances and by creating a hermeneutic of suspicion which makes it impossible for Christians to place their trust in scientific claims.Keywords: antichrist, catastrophic, Christian, techno-apocalypse
Procedia PDF Downloads 205
7211 Surface-Enhanced Raman Detection in Chip-Based Chromatography via a Droplet Interface
Authors: Renata Gerhardt, Detlev Belder
Abstract:
Raman spectroscopy has attracted much attention as a structurally descriptive and label-free detection method. It is particularly suited for chemical analysis given that it is non-destructive and molecules can be identified via the fingerprint region of the spectra. In this work, possibilities are investigated for integrating Raman spectroscopy as a detection method for chip-based chromatography, making use of a droplet interface. A demanding task in lab-on-a-chip applications is the specific and sensitive detection of low-concentration analytes in small volumes. Fluorescence detection is frequently utilized but is restricted to fluorescent molecules. Furthermore, no structural information is provided. Another often-applied technique is mass spectrometry, which enables the identification of molecules based on their mass-to-charge ratio. Additionally, the obtained fragmentation pattern gives insight into the chemical structure. However, it is only applicable as an end-of-line detection because analytes are destroyed during measurement. In contrast to mass spectrometry, Raman spectroscopy can be applied on-chip, and substances can be processed further downstream after detection. A major drawback of Raman spectroscopy is the inherent weakness of the Raman signal, which is due to the small cross-sections associated with the scattering process. Enhancement techniques, such as surface-enhanced Raman spectroscopy (SERS), are employed to overcome the poor sensitivity, even allowing detection at the single-molecule level. In SERS measurements, the Raman signal intensity is improved by several orders of magnitude if the analyte is in close proximity to nanostructured metal surfaces or nanoparticles. The main gain of lab-on-a-chip technology is the building-block-like ability to seamlessly integrate different functionalities, such as synthesis, separation, derivatization, and detection on a single device.
We intend to utilize this powerful toolbox to realize Raman detection in chip-based chromatography. By interfacing on-chip separations with a droplet generator, the separated analytes are encapsulated into numerous discrete containers. These droplets can then be injected with a silver nanoparticle solution and investigated via Raman spectroscopy. Droplet microfluidics is a sub-discipline of microfluidics which operates with segmented flow instead of a continuous flow. Segmented flow is created by merging two immiscible phases (usually an aqueous phase and oil), thus forming small discrete volumes of one phase in the carrier phase. The study surveys different chip designs to realize the coupling of chip-based chromatography with droplet microfluidics. With regard to maintaining a sufficient flow rate for chromatographic separation and ensuring stable eluent flow over the column, different flow rates of the eluent and oil phase are tested. Furthermore, the detection of analytes in droplets with surface-enhanced Raman spectroscopy is examined. The compartmentalization of the separated compounds preserves the analytical resolution, since the continuous phase restricts dispersion between the droplets. The droplets are ideal vessels for the insertion of silver colloids, thus making use of the surface-enhancement effect and improving the sensitivity of the detection. The long-term goal of this work is the first realization of coupling chip-based chromatography with droplet microfluidics to employ surface-enhanced Raman spectroscopy as the means of detection.Keywords: chip-based separation, chip LC, droplets, Raman spectroscopy, SERS
Procedia PDF Downloads 245
7210 Thermography Evaluation on Facial Temperature Recovery after Elastic Gum
Authors: A. Dionísio, L. Roseiro, J. Fonseca, P. Nicolau
Abstract:
Thermography is a non-radiating and contact-free technology which can be used to monitor skin temperature. The efficiency and safety of thermography make it a useful tool for detecting and locating thermal changes in the skin surface, characterized by increases or decreases in temperature. This work intends to be a contribution to the use of thermography as a methodology for the evaluation of skin temperature in the context of orofacial biomechanics. The study aims to identify the oscillations of skin temperature in the left and right hemiface regions of the masseter muscle, during and after a thermal stimulus, and to estimate the time required to restore the initial temperature after the application of the stimulus. Using a FLIR T430sc camera, a data acquisition protocol was followed with a group of eight volunteers, aged between 22 and 27 years. The tests were performed in a controlled environment with the volunteers in a comfortable, static position. The thermal stimulus involved the use of an ice volume with controlled size and contact surface. The skin surface temperature was recorded in two distinct situations, namely without further stimulus and with the addition of a stimulus obtained by chewing gum. The data obtained were treated using FLIR ResearchIR Max software. The time required to recover the initial temperature ranged from 20 to 52 minutes when no stimulus was added and varied between 8 and 26 minutes with the chewing-gum stimulus. These results show that recovery is faster with the addition of the stimulus and may guide clinicians regarding pre- and post-operative times with ice therapy, in the presence or absence of a mechanical stimulus that increases muscle function (e.g. phonetics or mastication).Keywords: thermography, orofacial biomechanics, skin temperature, ice therapy
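The recovery-time measurement can be illustrated with a toy model: assume the skin temperature returns exponentially to baseline (Newton's law of cooling) and read off the first instant it is back within a small tolerance of the pre-stimulus value, as one would from a thermogram sequence. The baseline, post-stimulus temperature, and time constant below are assumptions for illustration, not measured values from the study:

```python
import math

# Hypothetical recovery after the cold stimulus, modelled as exponential
# return to baseline; all numbers are illustrative, not study data.
baseline = 34.0       # deg C, pre-stimulus skin temperature (assumed)
t0 = 25.0             # deg C, temperature right after the ice stimulus (assumed)
tau = 8.0             # minutes, recovery time constant (assumed)

def temperature(t_min):
    """Skin temperature t_min minutes after the stimulus is removed."""
    return baseline - (baseline - t0) * math.exp(-t_min / tau)

def recovery_time(tolerance=0.2, step=0.1):
    """First time (minutes) the temperature is back within `tolerance`
    degrees of the baseline, scanning in `step`-minute increments."""
    t = 0.0
    while baseline - temperature(t) > tolerance:
        t += step
    return round(t, 1)

print(recovery_time())
```

With these assumed parameters the model recovers in about 30 minutes, which happens to fall inside the 20-52 minute no-stimulus range the authors report; a faster recovery (shorter tau) would mimic the chewing-gum case.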
Procedia PDF Downloads 255
7209 Study on Measuring Method and Experiment of Arc Fault Detection Device
Authors: Yang Jian-Hong, Zhang Ren-Cheng, Huang Li
Abstract:
Arc fault is one of the main causes of electrical fires. An Arc Fault Detection Device (AFDD) can detect arc faults effectively. Arc fault detection and unhooking standards are the keys to the practical application of AFDDs. First, an arc fault continuous production system was developed, which could count the arc half-wave number. Then, in combination with the UL1699 standard, the ignition probability curve of cotton and the unhooking times at various current intensities were obtained by experiments. The combustion degree of an arc fault can be expressed effectively by the arc area. Experiments proved that electrical fires would be misjudged or missed if only the arc half-wave number were used as the AFDD unhooking basis. Finally, practical tests were carried out on the self-developed AFDD system. The results showed that the actual AFDD unhooking time was the sum of the arc half-wave cycling time, the arc-wave identification time, and the unhooking mechanical operation time, with the first two taking the shorter share. The unhooking time standard depends on the shortest mechanical operation time.Keywords: arc fault detection device, arc area, arc half wave, unhooking time, arc fault
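The arc half-wave count that the production system accumulates can be sketched as zero-crossing counting on the current waveform: each sign change of the current ends one half wave. The clean 50 Hz sinusoid and sampling rate below are illustrative assumptions, not the authors' measurement setup:

```python
import math

# Assumed acquisition parameters for illustration only.
FS = 10_000          # samples per second
FREQ = 50            # mains frequency, Hz
DURATION = 0.2       # seconds of recorded current (10 full cycles)

samples = [math.sin(2 * math.pi * FREQ * n / FS)
           for n in range(int(FS * DURATION))]

def count_half_waves(signal):
    """Count half waves: each strict sign change of the current marks the
    end of one half wave; the final, unterminated half wave is added."""
    crossings = 0
    for prev, cur in zip(signal, signal[1:]):
        if prev * cur < 0:               # strict sign change
            crossings += 1
    return crossings + 1

print(count_half_waves(samples))
```

Ten full 50 Hz cycles contain 20 half waves, which is what the count returns here; on a real arcing current the waveform is distorted, so practical detectors combine this count with other features (hence the paper's arc-area criterion).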
Procedia PDF Downloads 509
7208 Role of Collaborative Cultural Model to Step on Cleaner Energy: A Case of Kathmandu City Core
Authors: Bindu Shrestha, Sudarshan R. Tiwari, Sushil B. Bajracharya
Abstract:
Urban household cooking-fuel choice is highly influenced by human behavior and by energy-culture parameters such as cognitive norms, material culture, and practices. Although these parameters play a leading role in making Kathmandu households cleaner, they are not incorporated in the city's energy policy. This paper aims to identify trade-offs for transforming residents' cooking behavior towards cleaner technology, based on a questionnaire survey, observation, mapping, interviews, and quantitative analysis. The analysis recommends implementing a Collaborative Cultural Model (CCM) at the policy level to change the impact on the neighborhood. The results showed that each household produces 439.56 kg of carbon emissions each year and that 20 percent used unclean technology due to low income levels. Residents who used liquefied petroleum gas (LPG) as their cooking fuel suffered from an energy crisis every year that created fuel hoarding, which ultimately creates more energy demand and carbon exposure. In conclusion, carbon emissions can be reduced by improving the residents' energy consumption culture. The city is recommended to use the holistic action of changing habits as the soft power of collaboration, in a two-way participation approach among residents, the private sector, and government, to change energy culture and behavior at the policy level.Keywords: energy consumption pattern, collaborative cultural model, energy culture, fuel stacking
Procedia PDF Downloads 134
7207 Effect of Printing Process on Mechanical Properties of Interface between 3D Printed Concrete Strips
Authors: Wei Chen, Jinlong Pan
Abstract:
3D concrete printing technology is a novel and highly efficient construction method that holds significant promise for advancing low-carbon initiatives within the construction industry. In contrast to traditional construction practices, 3D printing offers a largely manpower- and formwork-free approach, resulting in a transformative shift in labor requirements and fabrication techniques. This transition yields substantial reductions in carbon emissions during the construction phase, as well as decreased on-site waste generation. Furthermore, when compared to conventionally cast concrete, 3D printed concrete exhibits mechanical anisotropy due to its layer-by-layer construction methodology. Therefore, it becomes imperative to investigate the influence of the printing process on the mechanical properties of 3D printed strips and to optimize the mechanical characteristics of the hardened strips. In this study, we conducted three-dimensional reconstructions of blocks printed using both circular and directional print heads, incorporating various overlap distances between strips, and employed CT scanning for comprehensive analysis. Our research focused on assessing mechanical properties and micro-pore characteristics under different loading orientations. Our findings reveal that increasing the degree of overlap between strips leads to enhanced mechanical properties of the strips. However, once full overlap is achieved, further increases in the degree of overlap do not lead to a decrease in porosity between strips. Additionally, owing to its larger printed cross-sectional area, the square printing head exhibited the most favorable impact on mechanical properties.Keywords: 3D printing concrete, mechanical anisotropy, micro-pore structure, printing technology
Procedia PDF Downloads 93
7206 Investigation on Performance of Change Point Algorithm in Time Series Dynamical Regimes and Effect of Data Characteristics
Authors: Farhad Asadi, Mohammad Javad Mollakazemi
Abstract:
In this paper, Bayesian online inference in models of data series is carried out with a change-point algorithm, which separates the observed time series into independent segments and studies the changes and variation of the data regimes through their statistical characteristics. Variation in the statistical characteristics of time series data often represents distinct phenomena in a dynamical system, such as a change in brain state reflected in EEG measurements or a change in an important regime of the data in many other dynamical systems. In this paper, a prediction algorithm for locating change points in time series data is simulated. It is verified that the pattern of the proposed data distribution is an important factor in obtaining simpler and smoother fluctuation of the hazard-rate parameter and in better identification of change-point locations. Finally, the conditions under which the time series distribution affects the factors in this approach are explained and validated with different time series databases from dynamical systems.Keywords: time series, fluctuation in statistical characteristics, optimal learning, change-point algorithm
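A minimal sketch of the kind of Bayesian online change-point inference described above, with a constant hazard rate and a run-length posterior updated after every observation (in the style of Adams and MacKay), might look as follows. The Gaussian observation model, the priors, and the hazard value are assumptions for illustration, not the paper's exact algorithm:

```python
import numpy as np

def bocpd_map_run_lengths(data, hazard=0.04, obs_var=1.0,
                          prior_mean=0.0, prior_var=4.0):
    """Return the MAP run length after each observation; a sudden drop
    back towards zero signals a detected change point."""
    r_probs = np.array([1.0])          # P(run length); a single run of 0
    means = np.array([prior_mean])     # per-run posterior mean of the regime
    varis = np.array([prior_var])      # per-run posterior variance
    map_run = []

    for x in data:
        # predictive density of x under every current run length
        pred_var = varis + obs_var
        pred = np.exp(-0.5 * (x - means) ** 2 / pred_var) / np.sqrt(
            2.0 * np.pi * pred_var)

        growth = r_probs * pred * (1.0 - hazard)     # run continues
        cp = np.sum(r_probs * pred * hazard)         # run resets to 0
        r_probs = np.concatenate(([cp], growth))
        r_probs /= r_probs.sum()

        # conjugate Gaussian update for continued runs; prior for run 0
        new_var = 1.0 / (1.0 / varis + 1.0 / obs_var)
        new_mean = new_var * (means / varis + x / obs_var)
        means = np.concatenate(([prior_mean], new_mean))
        varis = np.concatenate(([prior_var], new_var))

        map_run.append(int(np.argmax(r_probs)))
    return map_run

# Two synthetic regimes: mean 0, then mean 6, with a change at t = 30.
rng = np.random.default_rng(0)
data = np.concatenate([rng.normal(0.0, 1.0, 30), rng.normal(6.0, 1.0, 30)])
runs = bocpd_map_run_lengths(data)
print(runs[29], runs[-1])
```

On this synthetic series the MAP run length climbs steadily through the first regime and collapses to a small value just after the simulated change, which is exactly the behavior the hazard-rate parameter controls in the abstract's discussion.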
Procedia PDF Downloads 426
7205 Artificial Intelligence in Art and Other Sectors: Selected Aspects of Mutual Impact
Authors: Justyna Minkiewicz
Abstract:
Artificial Intelligence (AI) applied in the arts may influence the development of AI knowledge in other sectors and, in turn, mutual collaboration with the artistic environment. Hence, this collaboration may also influence the development of art projects. The paper reflects qualitative research outcomes based on in-depth interviews (IDI) within the marketing sector in Poland and on desk research. Art is a reflection of the spirit of our times. Moreover, we are now experiencing a significant acceleration in the development of technologies and in their use in various sectors. The leading technologies that contribute to the development of the economy, including the creative sector, are artificial intelligence, blockchain, extended reality, voice processing, and virtual beings. Artificial intelligence is one of these leading technologies; it has been developed over several decades and is currently reaching a high level of interest and use in various sectors. However, the conducted research has shown that there is still low awareness of artificial intelligence and its wide application in various sectors. The study will show how artists use artificial intelligence in their art projects and how this can be translated into practice within business. At the same time, the paper will raise awareness of the need for businesses to be inspired by the artistic environment. The research proved that there is still a need to popularize knowledge about this technology, which is crucial for many sectors. Art projects are tools to develop the knowledge and awareness of society and of various sectors. At the same time, artists may benefit from such collaboration. The paper will include selected aspects of mutual relations, areas of possible inspiration, and possible transfers of technological solutions.
These include AI applications in creative industries such as advertising and film, image recognition in art, and projects from other sectors.Keywords: artificial intelligence, business, art, creative industry, technology
Procedia PDF Downloads 105
7204 Brain-Computer Interface Based Real-Time Control of Fixed Wing and Multi-Rotor Unmanned Aerial Vehicles
Authors: Ravi Vishwanath, Saumya Kumaar, S. N. Omkar
Abstract:
Brain-computer interfacing (BCI) is a technology that is almost four decades old, and it was developed solely for the purpose of developing and enhancing the impact of neuroprosthetics. However, in recent times, with the commercialization of non-invasive electroencephalogram (EEG) headsets, the technology has seen a wide variety of applications like home automation, wheelchair control, vehicle steering, etc. One of the latest applications is the mind-controlled quadrotor unmanned aerial vehicle. These applications, however, do not require a very high-speed response and give satisfactory results when standard classification methods like Support Vector Machines (SVM) and Multi-Layer Perceptrons (MLP) are used. Issues arise when high-speed control is required, as in the case of fixed-wing unmanned aerial vehicles, where such methods are rendered unreliable due to their low classification speed. Such an application requires the system to classify data at high speed in order to retain controllability of the vehicle. This paper proposes a novel classification method which uses a combination of the Common Spatial Pattern paradigm and Linear Discriminant Analysis to provide improved classification accuracy in real time. A non-linear SVM-based classification technique has also been discussed. Further, this paper discusses the implementation of the proposed method on fixed-wing and VTOL unmanned aerial vehicles.Keywords: brain-computer interface, classification, machine learning, unmanned aerial vehicles
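A hedged sketch of the CSP-plus-LDA combination the paper proposes: Common Spatial Pattern filters are computed from the two class covariance matrices, log-variance features are extracted from the spatially filtered trials, and a Fisher linear discriminant separates the classes. The two-channel synthetic "EEG", mixing matrix, and all parameters below are invented for illustration; real motor-imagery data would have many more channels and band-pass filtering:

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic 2-channel "EEG": per class, one latent source has higher
# variance, mixed into the channels (a stand-in for real epochs).
def make_trials(n_trials, n_samples, var0, var1, mixing):
    src = rng.normal(size=(n_trials, 2, n_samples)) \
          * np.array([[var0], [var1]]) ** 0.5
    return mixing @ src

mixing = np.array([[1.0, 0.4], [0.3, 1.0]])
X0 = make_trials(40, 200, 4.0, 1.0, mixing)   # class 0: source 0 strong
X1 = make_trials(40, 200, 1.0, 4.0, mixing)   # class 1: source 1 strong

def class_cov(trials):
    return np.mean([t @ t.T / t.shape[1] for t in trials], axis=0)

C0, C1 = class_cov(X0), class_cov(X1)

# CSP: whiten the composite covariance, then diagonalize one class.
d, U = np.linalg.eigh(C0 + C1)
W = (U / np.sqrt(d)).T                     # whitening matrix
_, B = np.linalg.eigh(W @ C0 @ W.T)
csp = B.T @ W                              # spatial filters (rows)

def features(trials):
    filtered = np.einsum('fc,ncs->nfs', csp, trials)
    var = filtered.var(axis=2)
    return np.log(var / var.sum(axis=1, keepdims=True))

F0, F1 = features(X0), features(X1)

# Fisher LDA on the log-variance features: a single projection + threshold,
# which is why this combination classifies fast enough for real-time control.
m0, m1 = F0.mean(axis=0), F1.mean(axis=0)
Sw = np.cov(F0.T) + np.cov(F1.T)
w = np.linalg.solve(Sw, m1 - m0)
threshold = w @ (m0 + m1) / 2

preds = np.concatenate([F0, F1]) @ w > threshold
labels = np.array([0] * 40 + [1] * 40)
accuracy = float(np.mean(preds == labels))
print(accuracy)
```

At prediction time the whole pipeline is two small matrix products and a comparison, which illustrates the speed argument the abstract makes against heavier classifiers.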
Procedia PDF Downloads 283