Search results for: speech-recognition technology
894 Systematic Mapping Study of Digitization and Analysis of Manufacturing Data
Authors: R. Clancy, M. Ahern, D. O’Sullivan, K. Bruton
Abstract:
The manufacturing industry is currently undergoing a digital transformation as part of the mega-trend Industry 4.0. As part of this phase of the industrial revolution, traditional manufacturing processes are being combined with digital technologies to achieve smarter and more efficient production. To successfully digitally transform a manufacturing facility, the processes must first be digitized. This is the conversion of information from an analogue format to a digital format. The objective of this study was to explore the research area of digitizing manufacturing data as part of the worldwide paradigm, Industry 4.0. The formal methodology of a systematic mapping study was utilized to capture a representative sample of the research area and assess its current state. Specific research questions were defined to assess the key benefits and limitations associated with the digitization of manufacturing data. Research papers were classified according to the type of research and type of contribution to the research area. Upon analyzing 54 papers identified in this area, it was noted that 23 of the papers originated in Germany. This is an unsurprising finding, as Industry 4.0 is originally a German strategy, with strong supporting policy instruments being utilized in Germany to support its implementation. It was also found that the Fraunhofer Institute for Mechatronic Systems Design, in collaboration with the University of Paderborn in Germany, was the most frequent contributing institution, with three papers published. The literature suggested future research directions and highlighted one specific gap in the area. There exists an unresolved gap between the data science experts and the manufacturing process experts in the industry. The data analytics expertise is not useful unless the manufacturing process information is utilized. A legitimate understanding of the data is crucial to perform accurate analytics and gain true, valuable insights into the manufacturing process. There lies a gap between the manufacturing operations and the information technology/data analytics departments within enterprises, which was borne out by the results of many of the case studies reviewed as part of this work. To test the concept of this gap existing, the researcher initiated an industrial case study in which they embedded themselves between the subject matter expert of the manufacturing process and the data scientist. Of the papers resulting from the systematic mapping study, 12 contributed a framework, another 12 were based on a case study, and 11 focused on theory. However, there were only three papers that contributed a methodology. This provides further evidence of the need for an industry-focused methodology for digitizing and analyzing manufacturing data, which will be developed in future research.
Keywords: analytics, digitization, industry 4.0, manufacturing
893 The Risk of Prioritizing Management over Education at Japanese Universities
Authors: Masanori Kimura
Abstract:
Due to the decline of the 18-year-old population, Japanese universities have a tendency to convert their form of employment from tenured positions to fixed-term positions for newly hired teachers. The advantage of this is that universities can be more flexible in their employment plans in case they fail to fill their enrollment quotas of prospective students or need to supplement teachers who can engage in other academic fields or research areas where new demand is expected. The most serious disadvantage, however, is that if secure positions cannot be provided to faculty members, there is the possibility that coherence of education and continuity of research supported by the university cannot be achieved. Therefore, the question of this presentation is as follows: are universities aiming to give first priority to management, or are they trying to prioritize education and research over management? To answer this question, the author examined the number of job offerings for college foreign language teachers posted on the JREC-IN (Japan Research Career Information Network, which is run by the Japan Science and Technology Agency) website from April 2012 to October 2015. The results show that there were 1,002 and 1,056 job offerings for tenured positions and fixed-term contracts respectively, suggesting that, overall, today’s Japanese universities show a tendency to give first priority to management. More detailed examination of the data, however, shows that the tendency varies slightly depending on the type of university. National universities, which are supported by the central government, and state universities, which are supported by local governments, posted more job offerings for tenured positions than for fixed-term contracts: national universities posted 285 and 257 job offerings for tenured positions and fixed-term contracts respectively, and state universities posted 106 and 86 job offerings for tenured positions and fixed-term contracts respectively. Yet the difference in number between the two types of employment status at national and state universities is marginal. As for private universities, they posted 713 job offerings for fixed-term contracts and 616 offerings for tenured positions. Moreover, 73% of the fixed-term contracts were offered for lower-rank positions, including associate professors, lecturers, and so forth. Generally speaking, those positions are offered to younger teachers. Therefore, this result indicates that private universities attempt to cut their budgets yet expect the same educational effect by hiring younger teachers. Although the results have shown that there are some differences in personnel strategies among the three types of universities, the author argues that all three types of universities may lose important human resources that will take a pivotal role at their universities in the future unless they urgently review their employment strategies.
Keywords: higher education, management, employment status, foreign language education
892 Regular Laboratory Based Neonatal Simulation Program Increases Senior Clinicians’ Knowledge, Skills and Confidence Caring for Sick Neonates
Authors: Madeline Tagg, Choihoong Mui, Elizabeth Lek, Jide Menakaya
Abstract:
Introduction: Simulation technology is used by neonatal teams to learn and refresh skills and gain the knowledge and confidence to care for sick neonates. In-situ simulation is considered superior to laboratory-based programmes as it closely mirrors real-life situations. This study reports our experience of running regular laboratory-based simulation sessions for senior clinicians and nurses and its impact on their knowledge, skills and confidence. Methods: A before-and-after questionnaire survey was carried out on senior clinicians and nurses who attended a scheduled laboratory-based simulation session. Participants were asked to document their expectations before a 3-hour monthly laboratory programme started and were invited to feed back their reflections at the end of the session. The session included discussion of relevant clinical guidelines, immersion in a scenario and a video-led debrief. The results of the survey were analysed in three skills-based categories: improved, no change or a worsened experience. Results: 45 questionnaires were completed and analysed. Of these, 25 (55%) were completed by consultants, seven and six by nurses and trainee doctors respectively, and seven respondents were unknown. 40 (88%) rated the session overall and the guideline review as good/excellent, 39 respondents (86%) rated the scenario session good/excellent and 40/45 fed back a good/excellent debrief session. 33 (73%) respondents completed the before-and-after questionnaire. 21/33 (63%) reflected an improved knowledge, skill or confidence in caring for sick new-born babies, eight respondents reported no change and four fed back a worse experience after the session. Discussion: Most respondents found the laboratory-based structured simulation session beneficial for their professional development. They valued equally the whole content of the programme, such as the guideline review and equipment training, as well as the simulation and debrief sessions. Two out of three participants stated their knowledge of caring for sick new-born babies had been transformed positively by the session. Sessions where simulation equipment failed or relevant staff were absent contributed to a poor educational experience. Summary: A regular structured laboratory-based simulation programme with a rich content is a credible educational resource for improving the knowledge, skills and confidence of senior clinicians caring for sick new-born babies.
Keywords: knowledge, laboratory based, neonates, simulation
891 Delving into the Concept of Social Capital in the Smart City Research
Authors: Atefe Malekkhani, Lee Beattie, Mohsen Mohammadzadeh
Abstract:
The unprecedented growth of megacities and urban areas all around the world has resulted in numerous risks, concerns, and problems across various aspects of urban life, including the environmental, social, and economic domains, such as climate change and spatial and social inequalities. In this situation, the ever-increasing progress of technology has created hope among urban authorities that the negative effects of various socio-economic and environmental crises can potentially be mitigated with the use of information and communication technologies. The concept of the 'smart city' represents an emerging solution, using ICTs, to urban challenges arising from increased urbanization. However, smart cities are often perceived primarily as technological initiatives and are implemented without considering the social and cultural contexts of cities and the needs of their residents. The implementation of smart city projects and initiatives has the potential to (un)intentionally exacerbate pre-existing social, spatial, and cultural segregation. The impact of the smart city on the social capital of the people who use smart city systems, as well as on governance and policymakers, is therefore worth exploring. The importance of inhabitants to the existence and development of smart cities cannot be overlooked. This concept has been approached from different perspectives in smart city studies. Reviewing the literature on social capital and the smart city shows that social capital plays three different roles in smart city development. Some research indicates that social capital is a component of a smart city and is embedded in its dimensions, definitions, or strategies, while other studies see it as a social outcome of smart city development and point out that the move to smart cities improves social capital; however, in most cases, this remains an unproven hypothesis. Other studies show that social capital can enhance the functions of smart cities and that the consideration of social capital in planning smart cities should be promoted. Despite the existing theoretical and practical knowledge, there is a significant research gap in reviewing the knowledge domain of smart city studies through the lens of social capital. To shed light on this issue, this study aims to explore the domain of existing research in the field of the smart city through the lens of social capital. This research will use the 'Preferred Reporting Items for Systematic Reviews and Meta-Analyses' (PRISMA) method to review relevant literature, focusing on the key concepts of 'Smart City' and 'Social Capital'. The studies will be selected from the Web of Science Core Collection, using a selection process that involves identifying literature sources, screening and filtering studies based on titles, abstracts, and full-text reading.
Keywords: smart city, urban digitalisation, ICT, social capital
890 Repurposing Dairy Manure Solids as a Non-Polluting Fertilizer and the Effects on Nutrient Recovery in Tomatoes (Solanum lycopersicum)
Authors: Devon Simpson
Abstract:
Recycled Manure Solids (RMS), obtained via centrifugation from Canadian dairy farms, were synthesized into a non-polluting fertilizer by bonding micronutrients (Fe, Zn, and Mn) to cellulose fibers and then assessed for the effectiveness of nutrient recovery in tomatoes. Manure management technology is critical for improving the sustainability of agroecosystems and has the capacity to offer a truly circular economy. The ability to add value to manure byproducts offers an opportunity for economic benefits while generating tenable solutions to livestock waste. The dairy industry is under increasing pressure from new environmental protections, such as government restrictions on manure applications and limitations on herd size, as well as from increased product demand from a growing population. Current systems use RMS as bedding, so there is a lack of data pertaining to RMS use as a fertilizer. This is because of nutrient distribution, where most nutrients are retained in the liquid effluent of the solid-liquid separation. A literature review on the physical and chemical properties of dairy manure further revealed more data for raw manure than for centrifuged solids. This research offers an innovative perspective and a new avenue of exploration in the use of RMS. Manure solids in this study were obtained directly from dairy farms in Salmon Arm and Abbotsford, British Columbia, and underwent physical, chemical, and biological characterizations pre- and post-synthesis processing. Samples were sent to A&L Labs Canada for analysis. Once the solids were characterized and bonded to micronutrients, the effect of synthesized RMS on nutrient recovery in tomatoes was studied in a greenhouse environment. The agricultural research package 'agricolae' for R was used for experimental design and data analysis. The growth trials consisted of a randomized complete block design (RCBD) that allowed for analysis of variance (ANOVA). The primary outcome was to measure nutrient uptake, and this was done using an inductively coupled plasma mass spectrometer (ICP-MS) to analyze the micronutrient content of both the tissue and the fruit of the tomatoes. It was found that treatments containing bonded dairy manure solids had an increased micronutrient concentration. Treatments with bonded dairy manure solids also saw an increase in yield, and a Brix analysis showed higher sugar content than the untreated control and a grower standard.
Keywords: agroecosystems, dairy manure, micronutrient fertilizer, manure management, nutrient recovery, nutrient recycling, recycled manure solids, regenerative agriculture, sustainable farming
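A minimal sketch of the randomized complete block analysis described above, written in Python with statsmodels rather than the authors' R package 'agricolae'; the treatment labels, block layout and uptake values are hypothetical placeholders, not data from the study.

```python
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

# Hypothetical zinc uptake (mg/kg dry tissue) in a randomized complete block design:
# three treatments appear once in each of four blocks.
data = pd.DataFrame({
    "treatment": ["control", "grower_std", "bonded_RMS"] * 4,
    "block":     [b for b in ["B1", "B2", "B3", "B4"] for _ in range(3)],
    "zn_uptake": [18.2, 21.5, 26.9, 17.8, 22.1, 27.4,
                  19.0, 20.8, 25.7, 18.5, 21.9, 26.2],
})

# RCBD model: response explained by a treatment effect plus a block effect
model = ols("zn_uptake ~ C(treatment) + C(block)", data=data).fit()
print(sm.stats.anova_lm(model, typ=2))   # Type II ANOVA table
```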
889 Two-Stage Estimation of Tropical Cyclone Intensity Based on Fusion of Coarse and Fine-Grained Features from Satellite Microwave Data
Authors: Huinan Zhang, Wenjie Jiang
Abstract:
Accurate estimation of tropical cyclone intensity is of great importance for disaster prevention and mitigation. Existing techniques are largely based on satellite imagery data, and the research and utilization of the inner thermal core structure characteristics of tropical cyclones still pose challenges. This paper presents a two-stage tropical cyclone intensity estimation network based on the fusion of coarse and fine-grained features from microwave brightness temperature data. The data used in this network are obtained from the thermal core structure of tropical cyclones through Advanced Technology Microwave Sounder (ATMS) inversion. Firstly, the thermal core information in the pressure direction is comprehensively expressed through the maximal intensity projection (MIP) method, constructing coarse-grained thermal core images that represent the tropical cyclone. These images provide a coarse-grained wind speed range estimate in the first stage. Then, based on this result, fine-grained features are extracted by combining thermal core information from multiple view profiles with a distributed network and are fused with the coarse-grained features from the first stage to obtain the final two-stage network wind speed estimate. Furthermore, to better capture the long-tail distribution characteristics of tropical cyclones, focal loss is used in the coarse-grained loss function of the first stage, and an ordinal regression loss is adopted in the second stage to replace traditional single-value regression. The selected tropical cyclones span from 2012 to 2021 and are distributed in the North Atlantic (NA) region. The training set covers 2012 to 2017, the validation set covers 2018 to 2019, and the test set covers 2020 to 2021. Based on the Saffir-Simpson Hurricane Wind Scale (SSHS), this paper categorizes tropical cyclone levels into three major categories: pre-hurricane, minor hurricane, and major hurricane, achieving a classification accuracy of 86.18% and an intensity estimation error of 4.01 m/s for the NA region. The results indicate that thermal core data can effectively represent the level and intensity of tropical cyclones, warranting further exploration of tropical cyclone attributes using this data.
Keywords: artificial intelligence, deep learning, data mining, remote sensing
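Two of the building blocks named above, the maximal intensity projection along the pressure direction and a focal loss for the imbalanced intensity categories, can be sketched as follows; the array shape, class count and the gamma/alpha values are illustrative assumptions rather than the authors' settings.

```python
import numpy as np

# Hypothetical ATMS-derived brightness-temperature cube for one cyclone sample:
# (n_pressure_levels, height, width), values in kelvin.
bt_cube = np.random.default_rng(0).normal(250.0, 10.0, size=(22, 64, 64))

# Coarse-grained thermal-core image: maximal intensity projection (MIP)
# along the pressure direction.
mip_image = bt_cube.max(axis=0)                  # shape (64, 64)

def focal_loss(probs, targets, gamma=2.0, alpha=0.25, eps=1e-7):
    """Multi-class focal loss on softmax probabilities.

    probs:   (n_samples, n_classes) predicted class probabilities
    targets: (n_samples,) integer class labels
    """
    p_t = np.clip(probs[np.arange(len(targets)), targets], eps, 1.0)
    return float(np.mean(-alpha * (1.0 - p_t) ** gamma * np.log(p_t)))

# Toy check with three samples and three intensity categories
probs = np.array([[0.7, 0.2, 0.1], [0.1, 0.8, 0.1], [0.2, 0.3, 0.5]])
print(mip_image.shape, focal_loss(probs, np.array([0, 1, 2])))
```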
888 Improving Student Retention: Enhancing the First Year Experience through Group Work, Research and Presentation Workshops
Authors: Eric Bates
Abstract:
Higher education is recognised as being of critical importance in Ireland and has been linked as a vital factor to national well-being. Statistics show that Ireland has one of the highest rates of higher education participation in Europe. However, student retention and progression, especially in Institutes of Technology, are becoming an issue as rates of non-completion rise. Both within Ireland and across Europe, student retention is seen as a key performance indicator for higher education, and with these increasing rates the Irish higher education system needs to be flexible and adapt to the situation it now faces. The author is a Programme Chair on a Level 6 full-time undergraduate programme, and experience to date has shown that first year undergraduate students take some time to identify themselves as a group within the setting of a higher education institute. Despite being part of a distinct class on a specific programme, some individuals can feel isolated as they take the first step into higher education. Such feelings can contribute to students eventually dropping out. This paper reports on an ongoing initiative that aims to accelerate the bonding experience of a distinct group of first year undergraduates on a programme which has a high rate of non-completion. This research sought to engage the students in dynamic interactions with their peers to quickly evolve a group sense of coherence. Two separate modules, a Research module and a Communications module, delivered by the researcher were linked across two semesters. Students were allocated into random groups and each group was given a topic to be researched. There were six topics, essentially the six sub-headings on the DIT Graduate Attribute Statement. The research took place in a computer lab and students also used the library. The output from this was a document that formed part of the submission for the Research module. In the second semester the groups then had to make a presentation of their findings where each student spoke for a minimum amount of time. Presentation workshops formed part of that module and students were given the opportunity to practice their presentation skills. These presentations were video recorded to enable feedback to be given. Although this was a small-scale study, preliminary results found a strong sense of coherence among this particular cohort and feedback from the students was very positive. Other findings indicate that spreading the initiative across two semesters may have been an inhibitor. Future challenges include spreading such initiatives college-wide and indeed sector-wide.
Keywords: first year experience, student retention, group work, presentation workshops
887 Photocatalytic Disintegration of Naphthalene and Naphthalene-Similar Compounds in Indoor Air
Authors: Tobias Schnabel
Abstract:
Naphthalene and naphthalene-similar compounds are a common problem in the indoor air of buildings from the 1960s and 1970s in Germany. Often, tar-containing roof felt was used under the concrete floor to prevent humidity from coming through the floor. This tar-containing roof felt has high concentrations of PAHs (polycyclic aromatic hydrocarbons) and naphthalene. Naphthalene easily evaporates and contaminates the indoor air. Especially after renovations and energy-related modernization of the buildings, the naphthalene concentration rises because no forced air exchange can happen. Because of this problem, it is often necessary to change the floors after renovation of the buildings. The MFPA Weimar (materials research and testing facility) developed a project in cooperation with LEJ GmbH and Reichmann Gebäudetechnik GmbH. It is a technical solution for the disintegration of naphthalene and naphthalene-similar compounds in indoor air with photocatalytic reforming. Photocatalytic systems produce active oxygen species (hydroxyl radicals) by irradiating semiconductors at the wavelength of their band gap. The light energy separates the charges in the semiconductor and produces free electrons in the conduction band and defect electrons (holes). The defect electrons can react with hydroxide ions to form hydroxyl radicals. The produced hydroxyl radicals are a strong oxidizing agent and can oxidize organic matter to carbon dioxide and water. During the research, new titanium dioxide catalyst surface coatings were developed. This coating technology allows the production of very porous titanium dioxide layers on temperature-stable carrier materials. The porosity allows the naphthalene to be easily absorbed by the surface coating, which accelerates the reaction of the heterogeneous photocatalysis. The photocatalytic reaction is induced by high-power, high-efficiency UV-A (ultraviolet A) LEDs with a wavelength of 365 nm. Various tests in emission chambers and on the reformer itself show that a reduction of naphthalene at relevant concentrations between 2 and 250 µg/m³ is possible. The disintegration rate was at least 80%. To reduce the concentration of naphthalene from 30 µg/m³ to a level below 5 µg/m³ in a usual 50 m² classroom, an energy of 6 kWh is needed. The benefit of the photocatalytic indoor air treatment is that every organic compound in the air can be disintegrated and reduced. The use of new photocatalytic materials in combination with highly efficient UV LEDs makes a safe and energy-efficient reduction of organic compounds in indoor air possible. At the moment, the air cleaning systems are taking the step from the prototype stage into use in real buildings.
Keywords: naphthalene, titanium dioxide, indoor air, photocatalysis
886 Technical and Economic Potential of Partial Electrification of Railway Lines
Authors: Rafael Martins Manzano Silva, Jean-Francois Tremong
Abstract:
Electrification of railway lines makes it possible to increase the speed, power, capacity and energy efficiency of rolling stock. However, this process of electrification is complex and costly. An electrification project is not just about the design of the catenary. It also includes the installation of the structures around the electrification, such as substations, electrical isolation, signalling, telecommunications and civil engineering structures. France has more than 30,000 km of railways, of which only 53% are electrified. The other 47% of railways use diesel locomotives and represent only 10% of the traffic (tonne-km). For this reason, a new type of electrification, less expensive than the usual one, is required to enable the modernization of these railways. One solution could be the use of hybrid trains. This technology opens up new opportunities for less expensive infrastructure development, such as the partial electrification of railway lines. In a partially electrified railway, the power supply of these hybrid trains could come either from the catenary or from the on-board energy storage system (ESS). Thus, the on-board ESS would feed the energy needs of the train along the non-electrified zones, while in electrified zones the catenary would feed the train and recharge the on-board ESS. This paper deals with identifying the technical and economic potential of the partial electrification of railway lines. This study provides different electrification scenarios in which the most expensive places to electrify are instead covered by the on-board ESS. The target is to reduce the cost of new electrification projects, i.e., to reduce the cost of electrification infrastructure while not increasing the cost of rolling stock. In this study, scenarios are constructed as a function of the electrification cost of each structure. The electrification cost varies considerably because the installation of catenary supports in tunnels, bridges and viaducts is much more expensive than in other zones of the railway. These scenarios are used to describe the power supply system and to choose between the catenary and the on-board energy storage depending on the position of the train on the railway. To identify the influence of each partial electrification scenario on the sizing of the on-board ESS, a model of the railway line and of the rolling stock is developed for a real case. This real case concerns a railway line located in the south of France. The energy consumption and the power demanded at each point of the line for each power supply (catenary or on-board ESS) are provided at the end of the simulation. Finally, the cost of a partial electrification is obtained by adding the civil engineering costs of the zones to be electrified and the cost of the on-board ESS. The study of the technical and economic potential ends with the identification of the most economically interesting electrification scenario.
Keywords: electrification, hybrid, railway, storage
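The catenary-versus-ESS supply decision along a partially electrified line can be illustrated with a toy state-of-charge simulation; the segment lengths, traction energies, recharge limits and battery capacity below are hypothetical placeholders, not values from the French case study or the authors' simulation tool.

```python
# Each segment: (length_km, electrified, traction_energy_kwh, recharge_limit_kwh)
segments = [
    (12.0, True,  180.0, 220.0),
    (18.0, False, 260.0, 0.0),
    (9.0,  True,  130.0, 160.0),
    (25.0, False, 340.0, 0.0),
]

ess_capacity_kwh = 800.0
soc = ess_capacity_kwh           # assume the train leaves the depot fully charged
min_soc = soc

for length, electrified, demand, recharge in segments:
    if electrified:
        # Catenary feeds the traction demand and recharges the on-board ESS.
        soc = min(ess_capacity_kwh, soc + recharge)
    else:
        # The on-board ESS alone feeds the traction demand.
        soc -= demand
    min_soc = min(min_soc, soc)

print(f"Minimum state of charge along the line: {min_soc:.0f} kWh")
if min_soc < 0:
    print("The ESS is undersized for this electrification scenario.")
```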
885 Predicting Photovoltaic Energy Profile of Birzeit University Campus Based on Weather Forecast
Authors: Muhammad Abu-Khaizaran, Ahmad Faza’, Tariq Othman, Yahia Yousef
Abstract:
This paper presents a study to provide sufficient and reliable information for constructing a photovoltaic energy profile of the Birzeit University campus (BZU) based on the weather forecast. The developed photovoltaic energy profile helps to predict the energy yield of the photovoltaic systems based on the weather forecast and hence helps in planning energy production and consumption. Two models are developed in this paper: a Clear Sky Irradiance model and a Cloud-Cover Radiation model, to predict the irradiance for a clear-sky day and a cloudy day, respectively. The adopted procedure for developing such models takes into consideration two levels of abstraction. First, irradiance and weather data were acquired by a sensory (measurement) system installed on the rooftop of the Information Technology College building at the Birzeit University campus. Second, power readings of a fully operational 51 kW commercial photovoltaic system installed in the University on the rooftop of the adjacent College of Pharmacy-Nursing and Health Professions building are used to validate the output of a simulation model and to help refine its structure. Based on a comparison between a mathematical model, which calculates clear-sky irradiance for the University location, and two sets of accumulated measured data, it is found that the simulation system closely resembles the installed PV power station on clear-sky days. However, these comparisons show a divergence between the expected energy yield and the actual energy yield in extreme weather conditions, including clouding and soiling effects. Therefore, a more accurate prediction model for irradiance that takes into consideration weather factors which affect irradiance, such as relative humidity and cloudiness, was developed: the Cloud-Cover Radiation Model (CRM). The equivalent mathematical formulas implement corrections to provide more accurate inputs to the simulation system. The results of the CRM show a very good match with the actual measured irradiance during a cloudy day. The developed photovoltaic profile helps in predicting the output energy yield of the photovoltaic system installed at the University campus based on the predicted weather conditions. The simulation and practical results for both models are in very good agreement.
Keywords: clear-sky irradiance model, cloud-cover radiation model, photovoltaic, weather forecast
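The abstract does not give the exact formulations of the two models, so the sketch below stands in with generic textbook approximations purely to illustrate the clear-sky/cloud-cover structure: a simple air-mass based clear-sky estimate and an okta-based cloud-cover correction of the Kasten-Czeplak form. The coefficients are the commonly quoted generic ones, not values calibrated to the BZU measurements.

```python
import numpy as np

def clear_sky_ghi(zenith_deg):
    """Rough clear-sky global horizontal irradiance (W/m^2) from solar zenith angle."""
    cosz = np.clip(np.cos(np.radians(zenith_deg)), 1e-3, 1.0)
    air_mass = 1.0 / cosz                          # simple plane-parallel air mass
    dni = 1353.0 * 0.7 ** (air_mass ** 0.678)      # textbook direct-normal estimate
    return 1.1 * dni * cosz                        # approximate global horizontal value

def cloudy_ghi(ghi_clear, cloud_okta):
    """Cloud-cover correction with cloud amount given in oktas (0..8)."""
    return ghi_clear * (1.0 - 0.75 * (cloud_okta / 8.0) ** 3.4)

zenith = np.array([20.0, 40.0, 60.0, 80.0])        # example solar zenith angles (deg)
clear = clear_sky_ghi(zenith)
print(np.round(clear, 0))                          # clear-sky day estimate
print(np.round(cloudy_ghi(clear, cloud_okta=6), 0))  # same hours under 6/8 cloud cover
```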
884 Durham Region: How to Achieve Zero Waste in a Municipal Setting
Authors: Mirka Januszkiewicz
Abstract:
The Regional Municipality of Durham is the upper level of a two-tier municipal and regional structure comprising eight lower-tier municipalities. With a population of 655,000 in both urban and rural settings, the Region covers approximately 2,537 square kilometers, neighboring the City of Toronto, Ontario, Canada, to the east. The Region has been focused on diverting waste from disposal since the development of its Long Term Waste Management Strategy Plan for 2000-2020. With a 54 percent solid waste diversion rate, the focus now is on achieving 70 percent diversion on the path to zero waste, using local waste management options whenever feasible. The Region has an Integrated Waste Management System that consists of a weekly curbside collection of recyclable printed paper and packaging and source-separated organics; a seasonal collection of leaf and yard waste; a bi-weekly collection of residual garbage; and a twice-annual collection of intact, sealed household batteries. The Region also maintains three Waste Management Facilities for residential drop-off of household hazardous waste, polystyrene, construction and demolition debris and electronics. Special collection events are scheduled in the spring, summer and fall months for reusable items, household hazardous waste, and electronics. The Region is in the final commissioning stages of an energy-from-waste facility for residual waste disposal that will recover energy from non-recyclable wastes. This facility is state of the art and is equipped for the installation of carbon capture technology in the future. Despite all of these diversion programs and efforts, there is still room for improvement. Recent residential waste studies revealed that over 50% of the residual waste placed at the curb that is destined for incineration could be recycled. To move towards a zero-waste community, the Region is looking to more advanced technologies for extracting the maximum recycling value from residential waste. Plans are underway to develop a pre-sort facility to remove organics and recyclables from the residual waste stream, including that of the growing multi-residential sector. Organics would then be treated anaerobically to generate biogas and fertilizer products for beneficial use within the Region. This project could increase the Region’s diversion rate beyond 70 percent and advance the Region’s climate change mitigation goals. Zero waste is an ambitious goal in a changing regulatory and economic environment. Decision makers must be willing to consider new and emerging technologies and embrace change to succeed.
Keywords: municipal waste, residential, waste diversion, zero waste
883 Multiple Intelligences as Basis for Differentiated Classroom Instruction in Technology Livelihood Education: An Impact Analysis
Authors: Sheila S. Silang
Abstract:
This research seeks to make an impact analysis of multiple intelligences as the basis for differentiated classroom instruction in TLE. It will also address the felt need for the TLE subject to be taught effectively, exhausting all possible means. This study seeks the effect of giving different instruction according to the ability of the students, with the following objectives: 1. enhancement of the students' technological skills; 2. improvement of their learning potential; and 3. better linkage between school and community in soliciting different learning devices and materials for the learners' academic progress. General Luna, Quezon is composed of twenty-seven barangays. There are only two public high schools. We are aware that the K-12 curriculum is focused on providing sufficient time for mastery of concepts and skills, developing lifelong learners, and preparing graduates for tertiary education, middle-level skills development, employment, and entrepreneurship. The challenge is this: with TLE offering a vast area of specializations, how would multiple intelligences play their vital role as a basis for classroom instruction in meeting the requirements of the said curriculum? 1. To what extent do the respondent students manifest the following types of intelligences: visual-spatial, bodily-kinesthetic, musical, interpersonal, intrapersonal, verbal-linguistic, logical-mathematical and naturalistic? 2. What media should be used appropriate to the students' learning styles: visual, printed words, sound, motion, color or realia? 3. What is the impact of multiple intelligences as the basis for differentiated instruction in TLE based on the students' learning characteristics, reading ability and performance? 4. To what extent do the intelligences of the students relate to their academic performance? The following were the findings derived from the study: in consideration of the vast areas of study of TLE, and the importance it plays in the school curriculum, coinciding with the expectation of turning students into technologically competent, contributing members of society, whether in the field of technical/vocational expertise or entrepreneurship-based competencies, as well as the government's concern for it, we envisage TLE classroom teachers making use of multiple intelligences as the basis for differentiated classroom instruction in teaching the subject. The multiple intelligences, such as the linguistic, logical-mathematical, bodily-kinesthetic, interpersonal, intrapersonal, and spatial abilities that an individual student may or may not have, can be a basis for a TLE teacher's instructional method or design.
Keywords: education, multiple intelligences, differentiated classroom instruction, impact analysis
882 Automated Transformation of 3D Point Cloud to BIM Model: Leveraging Algorithmic Modeling for Efficient Reconstruction
Authors: Radul Shishkov, Orlin Davchev
Abstract:
The digital era has revolutionized architectural practices, with building information modeling (BIM) emerging as a pivotal tool for architects, engineers, and construction professionals. However, the transition from traditional methods to BIM-centric approaches poses significant challenges, particularly in the context of existing structures. This research introduces a technical approach to bridge this gap through the development of algorithms that facilitate the automated transformation of 3D point cloud data into detailed BIM models. The core of this research lies in the application of algorithmic modeling and computational design methods to interpret and reconstruct point cloud data (a collection of data points in space, typically produced by 3D scanners) into comprehensive BIM models. This process involves complex stages of data cleaning, feature extraction, and geometric reconstruction, which are traditionally time-consuming and prone to human error. By automating these stages, our approach significantly enhances the efficiency and accuracy of creating BIM models for existing buildings. The proposed algorithms are designed to identify key architectural elements within point clouds, such as walls, windows, doors, and other structural components, and to translate these elements into their corresponding BIM representations. This includes the integration of parametric modeling techniques to ensure that the generated BIM models are not only geometrically accurate but also embedded with essential architectural and structural information. Our methodology has been tested on several real-world case studies, demonstrating its capability to handle diverse architectural styles and complexities. The results showcase a substantial reduction in the time and resources required for BIM model generation while maintaining high levels of accuracy and detail. This research contributes significantly to the field of architectural technology by providing a scalable and efficient solution for the integration of existing structures into the BIM framework. It paves the way for more seamless and integrated workflows in renovation and heritage conservation projects, where the accuracy of existing conditions plays a critical role. The implications of this study extend beyond architectural practices, offering potential benefits in urban planning, facility management, and historic preservation.
Keywords: BIM, 3D point cloud, algorithmic modeling, computational design, architectural reconstruction
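One early step in such a pipeline, isolating planar wall and floor candidates from a scan before fitting parametric BIM elements, can be sketched with Open3D's iterative RANSAC plane segmentation; the file name, thresholds and loop limits below are illustrative assumptions, not the algorithm developed in this research.

```python
import open3d as o3d

cloud = o3d.io.read_point_cloud("scan_room.ply")   # hypothetical scan file
cloud = cloud.voxel_down_sample(voxel_size=0.02)   # thin the cloud to ~2 cm spacing

planes = []
remaining = cloud
for _ in range(6):                                  # peel off up to six dominant planes
    model, inliers = remaining.segment_plane(
        distance_threshold=0.02, ransac_n=3, num_iterations=1000)
    if len(inliers) < 500:                          # stop when residual planes get small
        break
    planes.append((model, remaining.select_by_index(inliers)))
    remaining = remaining.select_by_index(inliers, invert=True)

for (a, b, c, d), patch in planes:
    print(f"plane {a:+.2f}x {b:+.2f}y {c:+.2f}z {d:+.2f} = 0 "
          f"with {len(patch.points)} points")
```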
881 Revealing the Sustainable Development Mechanism of Guilin Tourism Based on Driving Force/Pressure/State/Impact/Response Framework
Authors: Xiujing Chen, Thammananya Sakcharoen, Wilailuk Niyommaneerat
Abstract:
China’s tourism industry is in a state of shock and recovery, as COVID-19 has brought great impacts and challenges to the tourism industry. The theory of sustainable development originates from the contradiction between the increasing awareness of environmental protection and the pursuit of economic interests. The sustainable development of tourism should consider social, economic, and environmental factors and develop tourism in a planned and targeted way from an overall perspective. Guilin is a popular tourist city in China. However, several problems exist in Guilin tourism, such as the low quality of scenic spot construction and the low efficiency of tourism resource development. Because it is not well managed, Guilin’s tourism industry faces problems such as crowding pressure from tourist supply and demand. According to the data from 2009 to 2019, there is a change in the degree of sustainable development of Guilin tourism. This research aimed to evaluate the sustainable development state of Guilin tourism using the DPSIR (driving force/pressure/state/impact/response) framework and to provide suggestions and recommendations for sustainable development in Guilin. An improved TOPSIS (technique for order preference by similarity to an ideal solution) model based on entropy weights is applied to the quantitative analysis and to analyzing the mechanisms of sustainable tourism development in Guilin. The DPSIR framework organizes indicators into five sub-categories, under which twenty-eight indicators related to sustainable aspects of Guilin tourism are classified. The study analyzed and summarized the economic, social, and ecological effects generated by tourism development in Guilin from 2009 to 2019. The results show that the conversion of tourism development in Guilin into regional economic benefits is more efficient than its conversion into social benefits. Thus, tourism development is an important driving force of Guilin’s economic growth. In addition, the study analyzed the static weights of the 28 relevant indicators of the sustainable development of tourism in Guilin and ranked them from largest to smallest. It was found that the economic and social factors related to tourism revenue carry the highest weights, which means that the economic and social development of Guilin can influence the sustainable development of Guilin tourism to a greater extent. Therefore, there is a two-way causal relationship between tourism development and economic growth in Guilin. At the same time, ecological development-related indicators also have relatively large weights, so ecological and environmental resources also have a great influence on the sustainable development of Guilin tourism.
Keywords: DPSIR framework, entropy weights analysis, sustainable development of tourism, TOPSIS analysis
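A compact sketch of the entropy-weight TOPSIS calculation described above; the indicator matrix is a hypothetical four-year, three-indicator example with all indicators treated as benefit criteria, not the study's 28-indicator data set.

```python
import numpy as np

def entropy_weights(X):
    """Entropy weights for a decision matrix X (rows: years, columns: indicators)."""
    P = X / X.sum(axis=0)                      # share of each year in each indicator
    P = np.where(P == 0, 1e-12, P)             # guard against log(0)
    k = 1.0 / np.log(X.shape[0])
    e = -k * (P * np.log(P)).sum(axis=0)       # information entropy per indicator
    d = 1.0 - e                                # degree of divergence
    return d / d.sum()

def topsis_closeness(X, w):
    """Closeness of each row to the ideal solution, assuming benefit criteria."""
    V = X / np.sqrt((X ** 2).sum(axis=0)) * w  # weighted, vector-normalized matrix
    ideal, anti = V.max(axis=0), V.min(axis=0)
    d_plus = np.sqrt(((V - ideal) ** 2).sum(axis=1))
    d_minus = np.sqrt(((V - anti) ** 2).sum(axis=1))
    return d_minus / (d_plus + d_minus)

# Hypothetical indicator matrix: 4 years x 3 benefit-type indicators
X = np.array([[3.2, 410.0, 0.61],
              [3.8, 455.0, 0.64],
              [4.1, 470.0, 0.66],
              [4.6, 520.0, 0.70]])
w = entropy_weights(X)
print("weights:", np.round(w, 3))
print("closeness:", np.round(topsis_closeness(X, w), 3))
```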
880 The Effect of Using Universal Design for Learning to Improve the Quality of Vocational Programmes for Students with Intellectual Disabilities and the Challenges Facing This Method from the Teachers' Point of View
Authors: Ohud Adnan Saffar
Abstract:
This study aims to determine the effect of using universal design for learning (UDL) to improve the quality of vocational programmes for students with intellectual disabilities (SID) and the challenges facing this method from the teachers' point of view. The significance of the study: there are comparatively few published studies on UDL in emerging nations. Therefore, this study will encourage researchers to consider new teaching approaches. The development of this study will contribute significant information on the cognitively disabled community on a universal scope. In order to collect and evaluate the data and verify the results, this study used a mixed research method based on a two-group comparison. To answer the study questions, we used a questionnaire, observation lists, open questions, and pre- and post-tests. Thus, the study explored the advantages and drawbacks, and the impact, of using the UDL method on integrating SID with students without special education needs in the same classroom. Those aims were realized by developing a workshop to explain the three principles of UDL and train 16 teachers in how to apply this method to teach 12 students without special education needs and 12 SID in the same classroom, and then gathering their opinions using the questionnaire and open questions. Finally, this research explores the effects of UDL on the teaching of professional photography skills for SID in Saudi Arabia. To achieve this goal, the research method compared the performance of the SID taught using the UDL method with that of female students with the same challenges taught by teachers applying other strategies in control and experimental groups; for this, we used the observation lists and the pre- and post-tests. Initial results: it is clear from the participants' responses that most of the answers confirmed that the use of UDL achieves the principle of inclusion between the SID and students without special education needs, at 93.8%. In addition, the results show that the majority of the sampled people see that the most important advantage of using UDL in teaching is creating an interactive environment using new and varied teaching methods, with a percentage of 56.2%. Following this result, UDL is seen as useful for integrating students into general education, with a percentage of 31.2%. Moreover, the findings indicate improved understanding through using new technology and replacing traditional ways of teaching with new ones, with a percentage of 25%. The results also show the percentages of the sampled people's opinions about the financial obstacles, and it was concluded that the majority see that the cost is high and that no computer maintenance is available, with 50%. There are no smart devices in schools to help in implementing and applying the program, with a percentage of 43.8%.
Keywords: universal design for learning, intellectual disabilities, vocational programme, the challenges facing this method
879 Invasive Asian Carp Fish Species: A Natural and Sustainable Source of Methionine for Organic Poultry Production
Authors: Komala Arsi, Ann M. Donoghue, Dan J. Donoghue
Abstract:
Methionine is an essential dietary amino acid necessary to promote the growth and health of poultry. Synthetic methionine is commonly used as a supplement in conventional poultry diets and is temporarily allowed in organic poultry feed for lack of natural and organically approved sources of methionine. It has been a challenge to find a natural, sustainable and cost-effective source of methionine, which reiterates the pressing need to explore potential methionine alternatives for organic poultry production. Fish have high concentrations of methionine, but wild-caught fish are expensive and adversely impact wild fish populations. Asian carp (AC) is an invasive species, and it has the potential to be used as a natural methionine source. However, to the best of our knowledge, there is no proven technology to utilize this fish as a methionine source. In this study, we co-extruded Asian carp and soybean meal to form a dry-extruded, methionine-rich AC meal. In order to formulate rations with the novel extruded carp meal, the product was tested on cecectomized roosters for its amino acid digestibility and total metabolizable energy (TMEn). Excreta were collected, and the gross energy and protein content of the feces were determined to calculate the total metabolizable energy (TME). The methionine content, digestibility and TME values were greater for the extruded AC meal than for the control diets. Carp meal was subsequently tested as a methionine source in feeds formulated for broilers, and production performance (body weight gain and feed conversion ratio) was assessed in comparison with broilers fed standard commercial diets supplemented with synthetic methionine. In this study, broiler chickens were fed either a control diet with synthetic methionine or a treatment diet with extruded AC meal (8 replicates/treatment; n=30 birds/replicate) from day 1 to 42 days of age. At the end of the trial, data for body weights, feed intake and feed conversion ratio (FCR) were analyzed using one-way ANOVA with Fisher's LSD test for multiple comparisons. Results revealed that birds on the AC diet had body weight gains and feed intake comparable to those of birds on diets containing synthetic methionine (P > 0.05). Results from the study suggest that invasive AC-derived fish meal could potentially be an effective and inexpensive source of sustainable natural methionine for organic poultry farmers.
Keywords: Asian carp, methionine, organic, poultry
878 Detection, Isolation, and Raman Spectroscopic Characterization of Acute and Chronic Staphylococcus aureus Infection in an Endothelial Cell Culture Model
Authors: Astrid Tannert, Anuradha Ramoji, Christina Ebert, Frederike Gladigau, Lorena Tuchscherr, Jürgen Popp, Ute Neugebauer
Abstract:
Staphylococcus aureus is a facultative intracellular pathogen which, by entering host cells, may evade the immunologic host response as well as antimicrobial treatment. In that way, S. aureus can cause persistent intracellular infections which are difficult to treat. Depending on the strain, S. aureus may persist at different intracellular locations, such as the phagolysosome. The first barrier that pathogens invading from the blood stream have to cross is the layer of endothelial cells lining the inner surface of blood and lymphatic vessels. Upon proceeding from an acute to a chronic infection, intracellular pathogens undergo certain biochemical and structural changes, including a deceleration of metabolic processes to adapt for long-term intracellular survival and the development of a special phenotype designated the small colony variant. In this study, the endothelial cell line Ea.hy 926 was used as a model for acute and chronic S. aureus infection. To this end, Ea.hy 926 cells were cultured on QIAscout™ Microraft Arrays, a special graded cell culture substrate that contains around 12,000 microrafts of 200 µm edge length. After attachment to the substrate, the endothelial cells were infected with GFP-expressing S. aureus for 3 weeks. The acute infection and the development of persistent bacteria were followed by confocal laser scanning microscopy, scanning the whole Microraft Array every second day for the presence of fluorescent intracellular bacteria and for detailed determination of their intracellular location. After three weeks of infection, representative microrafts containing infected cells, cells with protruded infections and cells that never showed any infection were isolated and fixed for Raman micro-spectroscopic investigation. For comparison, microrafts with acute infection were also isolated. The acquired Raman spectra are correlated with the fluorescence microscopic images to give hints about a) the molecular alterations in endothelial cells during acute and chronic infection compared to non-infected cells, and b) the metabolic and structural changes within the pathogen when entering a mode of persistence within host cells. We thank Dr. Ruth Kläver from QIAGEN GmbH for her support regarding QIAscout technology. Financial support by the BMBF via the CSCC (FKZ 01EO1502) and from the DFG via the Jena Biophotonic and Imaging Laboratory (JBIL, FKZ PO 633/29-1, BA 1601/10-1) is highly acknowledged.
Keywords: correlative image analysis, intracellular infection, pathogen-host adaptation, Raman micro-spectroscopy
877 Revolutionizing Healthcare Facility Maintenance: A Groundbreaking AI, BIM, and IoT Integration Framework
Authors: Mina Sadat Orooje, Mohammad Mehdi Latifi, Behnam Fereydooni Eftekhari
Abstract:
The integration of cutting-edge Internet of Things (IoT) technologies with advanced Artificial Intelligence (AI) systems is revolutionizing healthcare facility management. However, the current landscape of hospital building maintenance suffers from slow, repetitive, and disjointed processes, leading to significant financial, resource, and time losses. Additionally, the potential of Building Information Modeling (BIM) in facility maintenance is hindered by a lack of data within digital models of built environments, necessitating a more streamlined data collection process. This paper presents a robust framework that harmonizes AI with BIM-IoT technology to elevate healthcare Facility Maintenance Management (FMM) and address these pressing challenges. The methodology begins with a thorough literature review and requirements analysis, providing insights into existing technological landscapes and associated obstacles. Extensive data collection and analysis efforts follow to deepen understanding of hospital infrastructure and maintenance records. Critical AI algorithms are identified to address predictive maintenance, anomaly detection, and optimization needs, alongside integration strategies for BIM and IoT technologies that enable real-time data collection and analysis. The framework outlines protocols for data processing, analysis, and decision-making. A prototype implementation is executed to showcase the framework's functionality, followed by a rigorous validation process to evaluate its efficacy and gather user feedback. Refinement and optimization steps are then undertaken based on evaluation outcomes. Emphasis is placed on the scalability of the framework in real-world scenarios and its potential applications across diverse healthcare facility contexts. Finally, the findings are meticulously documented and shared within the healthcare and facility management communities. This framework aims to significantly boost maintenance efficiency, cut costs, provide decision support, enable real-time monitoring, offer data-driven insights, and ultimately enhance patient safety and satisfaction. By tackling current challenges in healthcare facility maintenance management, it paves the way for the adoption of smarter and more efficient maintenance practices in healthcare facilities.
Keywords: artificial intelligence, building information modeling, healthcare facility maintenance, internet of things integration, maintenance efficiency
876 The Effect of Applying the Electronic Supply System on the Performance of the Supply Chain in Health Organizations
Authors: Sameh S. Namnqani, Yaqoob Y. Abobakar, Ahmed M. Alsewehri, Khaled M. AlQethami
Abstract:
The main objective of this research is to determine the impact of the application of the electronic supply system on the performance of the supply department of health organizations. To reach this goal, the study adopted independent variables to measure the dependent variable (the performance of the supply department), namely: integration with suppliers, integration with intermediaries and distributors, and knowledge of supply volume, inventory, and demand. The study used the descriptive method and was aided by a questionnaire distributed to a sample of workers in the Supply Chain Management Department of King Abdullah Medical City. After the statistical analysis, the results showed that the 70 sample members strongly agree with the 'electronic integration with suppliers' axis, with a p-value of 0.001, especially with regard to the following: opening formal and informal communication channels between management and suppliers (mean 4.59) and exchanging information with suppliers with transparency and clarity (mean 4.50). The results also clarified that the sample members agree on the 'electronic integration with brokers and distributors' axis, with a p-value of 0.001, represented in the following elements: exchange of information between management, brokers and distributors with transparency and clarity (mean 4.18), and a close cooperative relationship between management, brokers and distributors (mean 4.13). The results further indicated that the respondents agreed to some extent on the 'knowledge of supply volume, stock, and demand' axis, with a p-value of 0.001. They also indicated that the respondents strongly agree with the existence of a relationship between electronic procurement and the performance of the procurement department in health organizations, with a p-value of 0.001, represented in the following: transparency and clarity in dealing with suppliers and intermediaries to prevent fraud and manipulation (mean 4.50) and reduced costs of supplying the needs of the health organization (mean 4.50). Based on the results, the study made several recommendations, the most important of which are: that health organizations work to increase the level of information sharing between themselves and suppliers in order to implement electronic procurement in the supply management of health organizations; that attention be given to using electronic data interchange methods and modern programs that enable supply management to exchange information with brokers and distributors to determine the volume of supply, inventory, and demand; and that, in order to know the volume of supply, inventory, and demand, scientific methods of supply and storage be applied. Taking advantage of information technology, for example electronic data exchange techniques and documents, can help in contact with suppliers, brokers, and distributors and in knowing the volume of supply, inventory, and demand, which contributes to improving the performance of the supply department in health organizations.
Keywords: healthcare supply chain, performance, electronic system, ERP
875 A Methodology to Virtualize Technical Engineering Laboratories: MastrLAB-VR
Authors: Ivana Scidà, Francesco Alotto, Anna Osello
Abstract:
Due to the importance given today to innovation, the education sector is evolving thanks to digital technologies. Virtual Reality (VR) can be a potential teaching tool, offering many advantages in the field of training and education, as it allows theoretical knowledge and practical skills to be acquired through an immersive experience in less time than the traditional educational process. These assumptions lay the foundations for a new educational environment that is involving and stimulating for students. Starting from the objective of strengthening the innovative teaching offer and the learning processes, the case study of the research concerns the digitalization of MastrLAB, a High Quality Laboratory (HQL) belonging to the Department of Structural, Building and Geotechnical Engineering (DISEG) of the Polytechnic of Turin, a center specialized in experimental mechanical tests on traditional and innovative building materials and on the structures made with them. MastrLAB-VR has been developed, an innovative training tool designed with the aim of educating the class, in total safety, on the techniques of using the machinery, thus reducing the dangers arising from the performance of potentially dangerous activities. The virtual laboratory, dedicated to the students of the Building and Civil Engineering courses of the Polytechnic of Turin, has been designed to simulate in a realistic way the experimental approach to the structural tests foreseen in their courses of study: from tensile tests to relaxation tests, from steel qualification tests to resilience tests on elements at ambient conditions or at characterizing temperatures. The research work proposes a methodology for the virtualization of technical laboratories through the application of Building Information Modelling (BIM), starting from the creation of a digital model. The process includes the creation of an independent application, which with Oculus Rift technology allows the user to explore the environment and interact with objects through the use of joypads. The application has been tested as a prototype on volunteers, obtaining results related to the acquisition of the educational notions presented in the experience through a multiple-choice virtual quiz and producing an overall evaluation report. The results have shown that MastrLAB-VR is suitable for both beginners and experts and will be adopted experimentally for other laboratories of the University departments.
Keywords: building information modelling, digital learning, education, virtual laboratory, virtual reality
Procedia PDF Downloads 131874 Meeting the Energy Balancing Needs in a Fully Renewable European Energy System: A Stochastic Portfolio Framework
Authors: Iulia E. Falcan
Abstract:
The transition of the European power sector towards a clean, renewable energy (RE) system faces the challenge of meeting power demand in times of low wind speed and low solar radiation at a reasonable cost. This is likely to be achieved through a combination of 1) energy storage technologies, 2) development of the cross-border power grid, 3) installed overcapacity of RE and 4) dispatchable power sources, such as biomass. This paper uses NASA-derived hourly data on weather patterns of sixteen European countries for the past twenty-five years, and load data from the European Network of Transmission System Operators-Electricity (ENTSO-E), to develop a stochastic optimization model. This model aims to understand the synergies between the four classes of technologies mentioned above and to determine the optimal configuration of the energy technologies portfolio. While this issue has been addressed before, it was done using deterministic models that extrapolated historic data on weather patterns and power demand, and that ignored the risk of an unbalanced grid, a risk stemming from both the supply and the demand side. This paper aims to explicitly account for the inherent uncertainty in the energy system transition. It articulates two levels of uncertainty: a) the inherent uncertainty in future weather patterns and b) the uncertainty of fully meeting power demand. The first level of uncertainty is addressed by developing probability distributions for future weather data and thus expected power output from RE technologies, rather than assuming known future power output. The second level of uncertainty is operationalized by introducing a Conditional Value at Risk (CVaR) constraint in the portfolio optimization problem. By setting the risk threshold at different levels (1%, 5% and 10%), important insights are revealed regarding the synergies of the different energy technologies, i.e., the circumstances under which they behave as either complements or substitutes to each other. The paper concludes that allowing for uncertainty in expected power output, rather than extrapolating historic data, paints a more realistic picture and reveals important departures from the results of deterministic models. In addition, explicitly acknowledging the risk of an unbalanced grid, and assigning it different thresholds, reveals non-linearity in the cost functions of different technology portfolio configurations. This finding has significant implications for the design of the European energy mix.Keywords: cross-border grid extension, energy storage technologies, energy system transition, stochastic portfolio optimization
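The CVaR constraint described above can be expressed, following the standard Rockafellar-Uryasev linearization, as an auxiliary-variable formulation over weather scenarios. The sketch below is a minimal illustration of that idea in Python with cvxpy; the two-technology portfolio, scenario data, costs, and risk threshold are all assumptions for illustration and do not reproduce the paper's model.

```python
# Minimal sketch of a CVaR-constrained capacity portfolio, assuming
# two RE technologies and synthetic per-scenario capacity factors.
import numpy as np
import cvxpy as cp

rng = np.random.default_rng(1)
n_scen, n_tech = 500, 2
cap_factor = rng.uniform(0.05, 0.6, size=(n_scen, n_tech))  # hypothetical weather scenarios
demand = 100.0                        # MW, illustrative
cost = np.array([1.0, 1.4])           # relative cost per MW of capacity
alpha = 0.95                          # CVaR confidence level (5% risk threshold)
shortfall_limit = 5.0                 # max acceptable CVaR of unmet demand (MW)

capacity = cp.Variable(n_tech, nonneg=True)
t = cp.Variable()                     # VaR auxiliary variable

# Unmet demand (loss) in each scenario
loss = demand - cap_factor @ capacity
# Rockafellar-Uryasev CVaR estimate over scenarios
cvar = t + cp.sum(cp.pos(loss - t)) / ((1 - alpha) * n_scen)

problem = cp.Problem(cp.Minimize(cost @ capacity), [cvar <= shortfall_limit])
problem.solve()

print("installed capacity (MW):", np.round(capacity.value, 1))
print("CVaR of unmet demand (MW):", round(cvar.value, 2))
```

Tightening the shortfall limit (or raising alpha) forces more installed capacity, which is exactly the non-linear cost behaviour the abstract attributes to different risk thresholds.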
Procedia PDF Downloads 171873 Mechanical Characterization and CNC Rotary Ultrasonic Grinding of Crystal Glass
Authors: Ricardo Torcato, Helder Morais
Abstract:
The manufacture of crystal glass parts is based on obtaining the rough geometry by blowing and/or injection, generally followed by a set of manual finishing operations using cutting and grinding tools. The forming techniques used do not allow parts with complex shapes to be obtained with repeatability, and the finishing operations rely on intensive specialized labor, resulting in high cycle times and production costs. This work aims to explore the digital manufacture of crystal glass parts by investigating new subtractive techniques for the automated, flexible finishing of these parts. Finishing operations are essential to respond to customer demands in terms of crystal feel and shine. It is intended to investigate the applicability of different computerized finishing technologies, namely milling and grinding in a CNC machining center with or without ultrasonic assistance, to crystal processing. Research in the field of grinding hard and brittle materials, despite not being extensive, has increased in recent years, and scientific knowledge about the machinability of crystal glass is still very limited. However, it can be said that the unique properties of glass, such as high hardness and very low toughness, make any glass machining technology a very challenging process. This work measures the performance improvement brought about by the use of ultrasound compared to conventional crystal grinding. This presentation focuses on the mechanical characterization and analysis of the cutting forces in CNC machining of superior crystal glass (Pb ≥ 30%). For the mechanical characterization, the Vickers hardness test provides an estimate of the material hardness (Hv) and of the fracture toughness based on the cracks that appear in the indentation. The mechanical impulse excitation test estimates the Young's modulus, shear modulus, and Poisson ratio of the material. For the cutting forces, a dynamometer was used to measure the forces in the face grinding process. The tests were designed based on the Taguchi method to correlate the input parameters (feed rate, tool rotation speed, and depth of cut) with the output parameters (surface roughness and cutting forces) and to optimize the process (achieving better roughness at cutting forces that do not compromise the material structure or the tool life) using ANOVA. This study was conducted for conventional grinding and for the ultrasonic grinding process with the same cutting tools. It was possible to determine the optimum cutting parameters for minimum cutting forces and for minimum surface roughness in both grinding processes. Ultrasonic-assisted grinding provides a better surface roughness than conventional grinding.Keywords: CNC machining, crystal glass, cutting forces, hardness
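As a hedged sketch of the Taguchi/ANOVA correlation step described above, the Python code below fits a main-effects model of surface roughness against the three input factors and runs an ANOVA; the L9-style design and the roughness values are invented placeholders, not measurements from the study.

```python
# Minimal sketch: ANOVA on a Taguchi-style design relating grinding
# parameters to surface roughness. All numbers are illustrative only.
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

# Hypothetical L9 orthogonal array (3 factors, 3 levels) with fake responses
data = pd.DataFrame({
    "feed":  [1, 1, 1, 2, 2, 2, 3, 3, 3],      # feed rate level
    "speed": [1, 2, 3, 1, 2, 3, 1, 2, 3],      # tool rotation speed level
    "depth": [1, 2, 3, 2, 3, 1, 3, 1, 2],      # depth of cut level
    "Ra":    [0.8, 1.1, 1.5, 0.9, 1.3, 1.0, 1.2, 0.7, 1.1],  # roughness (um)
})

# Main-effects model: treat each factor as categorical
model = smf.ols("Ra ~ C(feed) + C(speed) + C(depth)", data=data).fit()
print(anova_lm(model))   # F-statistics and p-values per factor
```

The same layout would be fitted separately for the cutting-force response, and the factor levels with the lowest predicted roughness and acceptable forces would be taken as the optimum settings.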
Procedia PDF Downloads 155872 Control for Fluid Flow Behaviours of Viscous Fluids and Heat Transfer in Mini-Channel: A Case Study Using Numerical Simulation Method
Authors: Emmanuel Ophel Gilbert, Williams Speret
Abstract:
The control of the fluid flow behaviour of viscous fluids and of heat transfer within a heated mini-channel is considered. The heat transfer and flow characteristics of different viscous liquids, such as engine oil, automatic transmission fluid, one-half ethylene glycol, and deionized water, were numerically analyzed. Mathematical tools such as Fourier series and Laplace Z-transforms were employed to ascertain the wave-like behaviour of each of these viscous fluids. The steady, laminar flow and heat transfer equations are solved with the aid of a numerical simulation technique. Further, this numerical simulation technique is validated by comparing the available practical values with the predicted local thermal resistances. The roughness of the mini-channel, one of its physical limitations, was also predicted in this study; it affects the friction factor. When an additive such as tetracycline was introduced into the fluid, the heat input was lowered, and this had a proportional effect on the minor and major frictional losses, mostly at very low Reynolds numbers of circa 60-80. At these low Reynolds numbers, the viscosity decreases and the frictional losses become small as the temperature of the viscous liquids increases. It is inferred that the three equations and models identified, which supported the numerical simulation via interpolation and integration of the variables extended to the walls of the mini-channel, yield a highly reliable basis for engineering and technology calculations for turbulence-impacting jets in the near future. In seeking a governing equation that could support this control of the fluid flow, the Navier-Stokes equations were found to be tangential to this finding. However, other physical factors with respect to these Navier-Stokes equations need to be kept in check to avoid uncertain turbulence of the fluid flow. This paradox is resolved within the framework of continuum mechanics using the classical slip condition and an iteration scheme, via a numerical simulation method, that takes into account certain terms in the full Navier-Stokes equations. However, this resulted in certain assumptions being dropped from the approximation. Concrete questions raised in the main body of the work are examined further in the appendices.Keywords: frictional losses, heat transfer, laminar flow, mini-channel, numerical simulation, Reynolds number, turbulence, viscous fluids
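To make the laminar mini-channel quantities mentioned above concrete, the following Python sketch computes the Reynolds number, the laminar Darcy friction factor, and the resulting pressure drop for a circular mini-channel; the fluid properties, channel dimensions, and flow velocity are assumed values chosen only to land in the very low Reynolds-number regime (circa 60-80) discussed in the abstract.

```python
# Minimal sketch: Reynolds number, laminar friction factor (f = 64/Re)
# and Darcy-Weisbach pressure drop in a circular mini-channel.
# Property and geometry values are illustrative assumptions.

rho = 850.0          # density of an engine-oil-like fluid, kg/m^3
mu = 0.05            # dynamic viscosity, Pa*s (drops as temperature rises)
d = 1.0e-3           # mini-channel hydraulic diameter, m
length = 0.1         # channel length, m
velocity = 4.0       # mean flow velocity, m/s

re = rho * velocity * d / mu                        # Reynolds number (~68 here)
f = 64.0 / re                                       # Darcy friction factor, laminar flow
dp = f * (length / d) * 0.5 * rho * velocity**2     # pressure drop, Pa

print(f"Re = {re:.0f}, f = {f:.3f}, dp = {dp/1000:.1f} kPa")
```

Raising the fluid temperature lowers mu, which raises Re and lowers f, reproducing the drop in frictional losses that the abstract attributes to heating the viscous liquids.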
Procedia PDF Downloads 177871 The Importance of Efficient and Sustainable Water Resources Management and the Role of Artificial Intelligence in Preventing Forced Migration
Authors: Fateme Aysin Anka, Farzad Kiani
Abstract:
Forced migration is a situation in which people are forced to leave their homes against their will due to political conflicts, wars, natural disasters, climate change, economic crises, or other emergencies. This type of migration takes place under conditions where people cannot lead a sustainable life for reasons such as security, shelter, and meeting their basic needs. It may occur in connection with different factors that affect people's living conditions. In addition to these general and widespread causes, water security and water resources are a cause that is emerging now and will be encountered more and more in the future. Forced migration may occur due to insufficient or depleted water resources in the areas where people live. In this case, people's living conditions become unsustainable, and they may have to go elsewhere because they cannot obtain basic needs such as drinking water and water used for agriculture and industry. To cope with these situations, it is important to minimize the causes, and international organizations and societies must provide assistance (for example, humanitarian aid, shelter, medical support, and education) and protection to address, or at least mitigate, this problem. From the international perspective, plans such as the Green New Deal (GND) and the European Green Deal (EGD) draw attention to the need for people to live equally in a cleaner and greener world. Recently, with the advancement of technology, scientific methods have become more efficient. In this regard, this article presents a multidisciplinary case model that reinforces the water problem with an engineering approach within the framework of the social dimension. It is worth emphasizing that this problem is largely linked to climate change and the lack of a sustainable water management perspective. As a matter of fact, the United Nations Development Agency (UNDA) draws attention to this problem in its universally accepted sustainable development goals. Therefore, an artificial intelligence-based approach has been applied to solve this problem by focusing on the water management problem. The most general but also most important aspect of the management of water resources is its correct consumption. In this context, the artificial intelligence-based system undertakes tasks such as water demand forecasting and distribution management, emergency and crisis management, water pollution detection and prevention, and maintenance and repair control and forecasting.Keywords: water resource management, forced migration, multidisciplinary studies, artificial intelligence
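One of the AI tasks listed above, water demand forecasting, can be illustrated with a small supervised-learning sketch; the lag-feature setup, the synthetic consumption series, and the random-forest model below are assumptions chosen for illustration and are not the system described in the paper.

```python
# Minimal sketch: forecasting next-day water demand from lagged demand
# values with a random forest. The demand series is synthetic.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(42)
days = np.arange(730)
# Hypothetical daily demand: weekly seasonality plus noise (m^3/day)
demand = 1000 + 150 * np.sin(2 * np.pi * days / 7) + rng.normal(0, 30, days.size)

# Build lag features: demand on the previous 7 days predicts today's demand
lags = 7
X = np.column_stack([demand[i:len(demand) - lags + i] for i in range(lags)])
y = demand[lags:]

model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(X[:-30], y[:-30])          # train on all but the last 30 days
pred = model.predict(X[-30:])        # forecast the held-out month
mae = np.mean(np.abs(pred - y[-30:]))
print(f"mean absolute error over hold-out: {mae:.1f} m^3/day")
```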
Procedia PDF Downloads 87870 Virtual Reality in COVID-19 Stroke Rehabilitation: Preliminary Outcomes
Authors: Kasra Afsahi, Maryam Soheilifar, S. Hossein Hosseini
Abstract:
Background: There is growing evidence that Cerebral Vascular Accident (CVA) can be a consequence of Covid-19 infection. Understanding novel treatment approaches is important in optimizing patient outcomes. Case: This case explores the use of Virtual Reality (VR) in the treatment of a 23-year-old COVID-positive female presenting with left hemiparesis in August 2020. Imaging showed a right globus pallidus, thalamus, and internal capsule ischemic stroke. Conventional rehabilitation was started two weeks later, with virtual reality (VR) included. This game-based VR technology developed for stroke patients was based on upper extremity exercises and functions. Physical examination showed left hemiparesis with muscle strength 3/5 in the upper extremity and 4/5 in the lower extremity. The range of motion of the shoulder was 90-100 degrees. The speech exam showed a mild decrease in fluency. Mild lower lip dynamic asymmetry was seen. Babinski was positive on the left. Gait speed was decreased (75 steps per minute). Intervention: Our game-based VR system was developed based on upper extremity physiotherapy exercises for post-stroke patients to increase the active, voluntary movement of the upper extremity joints and improve their function. The conventional program was initiated with active exercises, shoulder sanding for joint ROMs, shoulder walking, the shoulder wheel, and combined movements of the shoulder, elbow, and wrist joints, with alternating flexion-extension and pronation-supination movements, as well as pegboard and Purdue pegboard exercises. Fine-motor activities also included smart gloves, biofeedback, the finger ladder, and writing. The difficulty of the game increased at each stage of the practice as the patient's performance progressed. Outcome: After 6 weeks of treatment, gait and speech were normal, and upper extremity strength had improved to near-normal status. No adverse effects were noted. Conclusion: This case suggests that VR is a useful tool in the treatment of a patient with COVID-19-related CVA. The safety of newly developed instruments for such cases provides new approaches to improve therapeutic outcomes and prognosis as well as increase the satisfaction rate among patients.Keywords: covid-19, stroke, virtual reality, rehabilitation
Procedia PDF Downloads 143869 The Ideal Memory Substitute for Computer Memory Hierarchy
Authors: Kayode A. Olaniyi, Olabanji F. Omotoye, Adeola A. Ogunleye
Abstract:
Computer system components such as the CPU, the controllers, and the operating system work together as a team, and storage or memory is an essential part of this team apart from the processor. The memory and storage system, including processor caches, main memory, and storage, forms the basic storage component of a computer system. The characteristics of the different types of storage are inherent in the design and the technology employed in manufacturing. These memory characteristics define the speed, compatibility, cost, volatility, and density of the various storage types. Most computers rely on a hierarchy of storage devices for performance. The effective and efficient use of the memory hierarchy of the computer system is therefore the single most important aspect of computer system design and use. The memory hierarchy is becoming a fundamental performance and energy bottleneck due to the widening gap between the increasing demands of modern computer applications and the limited performance and energy efficiency provided by traditional memory technologies. With the dramatic development of computer systems, computer storage has had a difficult time keeping up with processor speed. Computer architects are therefore facing constant challenges in developing high-speed, high-performance computer storage that is energy-efficient, cost-effective, and reliable enough to intercept processor requests. It is very clear that substantial advancements in redesigning the existing physical and logical memory structures to keep up with the latest processor potential are crucial. This research work investigates the importance of the computer memory (storage) hierarchy in the design of computer systems. The constituent storage types of today's hierarchy were investigated by looking at the design technologies and how these technologies affect the memory characteristics: speed, density, stability, and cost. The investigation considered how these characteristics could best be harnessed for the overall efficiency of the computer system. The research revealed that the best single type of storage, which we refer to as ideal memory, is a logically single physical memory that would combine the best attributes of each memory type that makes up the memory hierarchy. It is a single memory with access speed as high as that found in CPU registers, combined with the highest storage capacity, offering excellent stability in the presence or absence of power as found in magnetic and optical disks, as opposed to volatile DRAM, and yet offering a cost-effectiveness far from that of expensive SRAM. The research work suggests that overcoming these barriers may require memory manufacturing to deviate entirely from present technologies and adopt one that overcomes the challenges associated with the traditional memory technologies.Keywords: cache, memory-hierarchy, memory, registers, storage
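The performance trade-off across hierarchy levels described above is often summarized with the average memory access time (AMAT); the short Python sketch below computes it for a hypothetical two-level cache backed by DRAM, with all latency and miss-rate figures assumed for illustration rather than taken from the paper.

```python
# Minimal sketch: average memory access time (AMAT) for a hypothetical
# L1 / L2 / DRAM hierarchy. AMAT = hit_time + miss_rate * miss_penalty,
# applied recursively from the innermost level outward.
# All latencies (ns) and miss rates are illustrative assumptions.

levels = [
    # (name, hit_time_ns, miss_rate)
    ("L1 cache", 1.0, 0.05),
    ("L2 cache", 5.0, 0.20),
]
dram_latency_ns = 80.0   # penalty paid when the last cache level misses

def amat(levels, backing_latency):
    """Compute AMAT by folding the hierarchy from the outermost level inward."""
    penalty = backing_latency
    for name, hit_time, miss_rate in reversed(levels):
        penalty = hit_time + miss_rate * penalty
    return penalty

print(f"AMAT = {amat(levels, dram_latency_ns):.2f} ns")
# With these assumed numbers: L2 level = 5 + 0.20*80 = 21 ns,
# so the L1 level sees 1 + 0.05*21 = 2.05 ns average per access.
```

An "ideal memory" in the sense of the abstract would collapse this calculation to a single register-speed hit time with no miss penalty at all.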
Procedia PDF Downloads 167868 Development of a Test Plant for Parabolic Trough Solar Collectors Characterization
Authors: Nelson Ponce Jr., Jonas R. Gazoli, Alessandro Sete, Roberto M. G. Velásquez, Valério L. Borges, Moacir A. S. de Andrade
Abstract:
The search for increased efficiency in generation systems has been of great importance in recent years to reduce the impact of greenhouse gas emissions and global warming. For clean energy sources, such as generation systems that use concentrated solar power technology, this efficiency improvement translates into a lower investment per kW, improving the project's viability. For the specific case of parabolic trough solar concentrators, their performance is strongly linked to the geometric precision of their assembly and to the individual efficiencies of their main components, such as the parabolic mirrors and receiver tubes. Thus, an accurate efficiency analysis should be conducted empirically, under mounting and operating conditions like those observed in the field. The Brazilian power generation and distribution company Eletrobras Furnas, through the R&D program of the National Agency of Electrical Energy, has developed a plant for testing parabolic trough concentrators located in Aparecida de Goiânia, in the state of Goiás, Brazil. The main objective of this test plant is the characterization of the prototype concentrator that is being developed by the company itself in partnership with Eudora Energia, seeking to optimize it to obtain the same or better efficiency than the concentrators of this type already known commercially. The test plant is a closed pipe system in which a pump circulates a heat transfer fluid, also called HTF, through the concentrator being characterized. A flow meter and two temperature transmitters, installed at the inlet and outlet of the concentrator, record the parameters necessary to determine the power absorbed by the system and then calculate its efficiency based on the direct solar irradiation available during the test period. After the HTF gains heat in the concentrator, it flows through heat exchangers that allow the acquired energy to be dissipated to the environment. The goal is to keep the concentrator inlet temperature constant throughout the desired test period. The developed plant performs the tests autonomously: the operator enters the HTF flow rate, the desired concentrator inlet temperature, and the test time into the control system. This paper presents the methodology employed for design and operation, as well as the instrumentation needed for the development of a parabolic trough test plant, serving as a guideline for standardizing such facilities.Keywords: parabolic trough, concentrated solar power, CSP, solar power, test plant, energy efficiency, performance characterization, renewable energy
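From the measurements listed above (flow rate, inlet and outlet temperatures, and direct normal irradiance), the thermal efficiency of the concentrator can be estimated as the absorbed power divided by the solar power on the aperture; the Python sketch below shows that calculation with invented example values and an assumed constant HTF specific heat, so it is an illustration rather than the plant's actual data-reduction procedure.

```python
# Minimal sketch: thermal efficiency of a parabolic trough from test data.
# eta = m_dot * cp * (T_out - T_in) / (DNI * A_aperture)
# All numerical values are illustrative assumptions, not plant data.

m_dot = 0.5                  # HTF mass flow rate, kg/s
cp = 2300.0                  # assumed HTF specific heat, J/(kg*K)
t_in, t_out = 150.0, 165.0   # inlet / outlet temperatures, deg C
dni = 850.0                  # direct normal irradiance, W/m^2
aperture_area = 30.0         # collector aperture area, m^2

absorbed_power = m_dot * cp * (t_out - t_in)       # W
incident_power = dni * aperture_area               # W
efficiency = absorbed_power / incident_power

print(f"absorbed power: {absorbed_power/1000:.1f} kW")
print(f"thermal efficiency: {efficiency:.1%}")
```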
Procedia PDF Downloads 119867 Variability of the X-Ray Sun during Descending Period of Solar Cycle 23
Authors: Zavkiddin Mirtoshev, Mirabbos Mirkamalov
Abstract:
We have analyzed the time series of full-disk integrated soft X-ray (SXR) and hard X-ray (HXR) emission from the solar corona from 2004 January 1 to 2009 December 31, covering the descending phase of solar cycle 23. We employed the daily X-ray index (DXI) derived from X-ray observations of the Solar X-ray Spectrometer (SOXS) mission in four different energy bands: 4-5.5 and 5.5-7.5 keV (SXR), and 15-20 and 20-25 keV (HXR). The application of the Lomb-Scargle periodogram technique to the DXI time series observed by the silicon detector in these energy bands reveals several short and intermediate periodicities of the X-ray corona. The DXI explicitly shows periods of 13.6 days, 26.7 days, 128.5 days, 151 days, 180 days, 220 days, 270 days, 1.24 years, and 1.54 years in the SXR as well as the HXR energy bands. Although all periods are above the 70% confidence level in all energy bands, they show stronger power in HXR emission in comparison to SXR emission. These periods are distinctly clear in three bands but not unambiguously clear in the 5.5-7.5 keV band. This might be due to the presence of Ferrum (Fe) and Ferrum/Niccolum (Fe/Ni) line features, which frequently vary with small-scale flares such as micro-flares. The regular 27-day rotation period and the 13.5-day period of sunspots on the invisible side of the Sun are found to be stronger in the HXR band relative to the SXR band. However, the flare-activity Rieger periods (150 and 180 days) and the near-Rieger period of 220 days are very strong in HXR emission, which is very much expected. On the other hand, our current study reveals a strong 270-day periodicity in SXR emission, which may be connected with the tachocline, similar to a fundamental rotation period of the Sun. The 1.24-year and 1.54-year periodicities reported in the present research work are clearly observable in both the SXR and HXR channels. These long-term periodicities must also be connected with the tachocline and should be regarded as a consequence of variation in rotational modulation over long time scales. The 1.24-year and 1.54-year periods are also considered to be of great importance and significance for the formation and evolution of life on the Earth, and therefore they also have great astrobiological importance. We gratefully acknowledge support by the Indian Centre for Space Science and Technology Education in Asia and the Pacific (CSSTEAP; the Centre is affiliated to the United Nations) and the Physical Research Laboratory (PRL) at Ahmedabad, India. This work was done under the supervision of Prof. Rajmal Jain, and the paper consists of material from the pilot project and the research part of the M.Tech program carried out during the Space and Atmospheric Science Course.Keywords: corona, flares, solar activity, X-ray emission
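A hedged sketch of the period-search step described above is shown below using scipy's Lomb-Scargle implementation on a synthetic daily index; the injected 27-day and 151-day signals, the noise level, the observing gaps, and the period grid are assumptions for illustration, not the SOXS DXI data.

```python
# Minimal sketch: Lomb-Scargle periodogram of a synthetic daily X-ray index
# with injected 27-day and 151-day periodicities (illustrative only).
import numpy as np
from scipy.signal import lombscargle

rng = np.random.default_rng(7)
t = np.arange(0, 2191.0)                              # ~6 years of daily samples (days)
t = np.sort(rng.choice(t, size=1800, replace=False))  # mimic gaps in coverage
signal = (np.sin(2 * np.pi * t / 27.0)
          + 0.5 * np.sin(2 * np.pi * t / 151.0)
          + 0.3 * rng.normal(size=t.size))

periods = np.linspace(5, 400, 4000)            # trial periods in days
ang_freqs = 2 * np.pi / periods                # lombscargle expects angular frequencies
power = lombscargle(t, signal - signal.mean(), ang_freqs, normalize=True)

best = periods[np.argmax(power)]
print(f"strongest periodicity near {best:.1f} days (27-day signal injected)")
```

The Lomb-Scargle method is preferred here because, unlike a plain FFT, it handles the unevenly sampled and gapped daily coverage directly.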
Procedia PDF Downloads 345866 Parallel Fuzzy Rough Support Vector Machine for Data Classification in Cloud Environment
Authors: Arindam Chaudhuri
Abstract:
Classification of data has been actively used as one of the most effective and efficient means of conveying knowledge and information to users. The primary focus has always been on techniques for extracting useful knowledge from data such that returns are maximized. With the emergence of huge datasets, the existing classification techniques often fail to produce desirable results. The challenge lies in analyzing and understanding the characteristics of massive data sets by retrieving useful geometric and statistical patterns. We propose a supervised parallel fuzzy rough support vector machine (PFRSVM) for data classification in a cloud environment. The classification is performed by PFRSVM using a hyperbolic tangent kernel. The fuzzy rough set model accounts for the sensitivity to noisy samples and handles imprecision in the training samples, bringing robustness to the results. The membership function is a function of the center and radius of each class in feature space and is represented with the kernel. It plays an important role in sampling the decision surface. The success of PFRSVM is governed by choosing appropriate parameter values. The training samples are either linearly or nonlinearly separable. The different input points make unique contributions to the decision surface. The algorithm is parallelized with a view to reducing training time. The system is built on a support vector machine library using the Hadoop implementation of MapReduce. The algorithm is tested on large data sets to check its feasibility and convergence. The performance of the classifier is also assessed in terms of the number of support vectors. The challenges encountered in implementing big data classification in machine learning frameworks are also discussed. The experiments were done in the cloud environment available at the University of Technology and Management, India. The results are illustrated for Gaussian RBF and Bayesian kernels. The effect of variability in prediction and generalization of PFRSVM is examined with respect to values of the parameter C. It effectively resolves the effects of outliers, class imbalance, and overlapping class problems, generalizes to unseen data, and relaxes the dependency between features and labels. The average classification accuracy of PFRSVM is better than that of other classifiers for both Gaussian RBF and Bayesian kernels. The experimental results on both synthetic and real data sets clearly demonstrate the superiority of the proposed technique.Keywords: FRSVM, Hadoop, MapReduce, PFRSVM
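As an illustration of the fuzzy-membership idea described above (weighting each training sample by its closeness to its class center before training an SVM with a hyperbolic tangent kernel), the following Python sketch uses scikit-learn's sigmoid-kernel SVC with per-sample weights; the membership formula, data, and parameters are assumptions for a single-machine sketch and do not reproduce the parallel Hadoop/MapReduce implementation.

```python
# Minimal single-machine sketch of a fuzzy-weighted SVM with a hyperbolic
# tangent (sigmoid) kernel. Membership formula and data are illustrative.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.svm import SVC

X, y = make_classification(n_samples=600, n_features=10, n_informative=6,
                           random_state=0)

# Fuzzy membership: samples far from their class center get smaller weights,
# which reduces the influence of noisy points on the decision surface.
weights = np.empty(len(y))
for label in np.unique(y):
    idx = np.where(y == label)[0]
    center = X[idx].mean(axis=0)
    dist = np.linalg.norm(X[idx] - center, axis=1)
    radius = dist.max()
    weights[idx] = 1.0 - dist / (radius + 1e-3)   # membership in (0, 1]

clf = SVC(kernel="sigmoid", C=1.0, gamma="scale")
clf.fit(X, y, sample_weight=weights)

print("number of support vectors per class:", clf.n_support_)
```

In the parallel setting described in the abstract, this per-partition training would be distributed over MapReduce tasks and the resulting support vectors combined, but that orchestration is outside the scope of this sketch.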
Procedia PDF Downloads 491865 Deep Learning for Qualitative and Quantitative Grain Quality Analysis Using Hyperspectral Imaging
Authors: Ole-Christian Galbo Engstrøm, Erik Schou Dreier, Birthe Møller Jespersen, Kim Steenstrup Pedersen
Abstract:
Grain quality analysis is a multi-parameterized problem that includes a variety of qualitative and quantitative parameters such as grain type classification, damage type classification, and nutrient regression. Currently, these parameters require human inspection, a multitude of instruments employing a variety of sensor technologies and predictive model types, or destructive and slow chemical analysis. This paper investigates the feasibility of applying near-infrared hyperspectral imaging (NIR-HSI) to grain quality analysis. For this study, two datasets of NIR hyperspectral images in the wavelength range of 900 nm - 1700 nm have been used. Both datasets contain images of sparsely and densely packed grain kernels. The first dataset contains ~87,000 image crops of bulk wheat samples from 63 harvests, where the protein value has been determined by the FOSS Infratec NOVA, which is the industry gold standard for protein content estimation in bulk samples of cereal grain. The second dataset consists of ~28,000 image crops of bulk grain kernels from seven different wheat varieties and a single rye variety. In the first dataset, protein regression is the problem to solve, while variety classification is the problem to solve in the second dataset. Deep convolutional neural networks (CNNs) have the potential to utilize spatio-spectral correlations within a hyperspectral image to simultaneously estimate the qualitative and quantitative parameters. CNNs can autonomously derive meaningful representations of the input data, reducing the need for the advanced preprocessing techniques required for classical chemometric model types such as artificial neural networks (ANNs) and partial least-squares regression (PLS-R). A comparison between different CNN architectures utilizing 2D and 3D convolution is conducted. These results are compared to the performance of ANNs and PLS-R. Additionally, a variety of preprocessing techniques from image analysis and chemometrics are tested. These include centering, scaling, standard normal variate (SNV), Savitzky-Golay (SG) filtering, and detrending. The results indicate that the combination of NIR-HSI and CNNs has the potential to be the foundation for an automatic system unifying qualitative and quantitative grain quality analysis within a single sensor technology and predictive model type.Keywords: deep learning, grain analysis, hyperspectral imaging, preprocessing techniques
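Two of the spectral preprocessing steps named above, standard normal variate (SNV) and Savitzky-Golay (SG) filtering, can be sketched as follows in Python; the synthetic spectra, window length, and polynomial order are assumptions for illustration and are not the study's actual preprocessing pipeline.

```python
# Minimal sketch: SNV and Savitzky-Golay preprocessing of NIR spectra.
# Each row of `spectra` is one spectrum; values here are synthetic.
import numpy as np
from scipy.signal import savgol_filter

rng = np.random.default_rng(3)
wavelengths = np.linspace(900, 1700, 256)                 # nm
spectra = (np.exp(-((wavelengths - 1200) / 150) ** 2)     # broad absorption band
           + 0.05 * rng.normal(size=(100, 256)))          # 100 noisy spectra

# Standard normal variate: center and scale each spectrum individually
snv = (spectra - spectra.mean(axis=1, keepdims=True)) / spectra.std(axis=1, keepdims=True)

# Savitzky-Golay smoothing (window of 11 points, 2nd-order polynomial);
# deriv=1 would instead give first-derivative spectra often used in chemometrics
sg = savgol_filter(snv, window_length=11, polyorder=2, axis=1)

print("preprocessed shape:", sg.shape)   # (100, 256)
```

In the CNN setting described in the abstract, such preprocessing is optional, since the network can learn comparable normalizations from the raw hypercube; for PLS-R and ANN baselines it typically matters more.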
Procedia PDF Downloads 100