Search results for: technology complexity
6453 Transfer of Constraints or Constraints on Transfer? Syntactic Islands in Danish L2 English
Authors: Anne Mette Nyvad, Ken Ramshøj Christensen
Abstract:
In the syntax literature, it has standardly been assumed that relative clauses and complement wh-clauses are islands for extraction in English, and that constraints on extraction from syntactic islands are universal. However, the Mainland Scandinavian languages have long been known to provide counterexamples. Previous research on Danish has shown that neither relative clauses nor embedded questions are strong islands in Danish. Instead, extraction from this type of syntactic environment is degraded due to structural complexity, and it interacts with nonstructural factors such as the frequency of occurrence of the matrix verb, the possibility of temporary misanalysis leading to semantic incongruity, and exposure over time. We argue that these facts can be accounted for with parametric variation in the availability of CP-recursion, resulting in the patterns observed, as Danish would then “suspend” the ban on movement out of relative clauses and embedded questions. Given that Danish does not seem to adhere to allegedly universal syntactic constraints, such as the Complex NP Constraint and the Wh-Island Constraint, what happens in L2 English? We present results from a study investigating how native Danish speakers judge extractions from island structures in L2 English. Our findings suggest that Danes transfer their native-language parameter setting when asked to judge island constructions in English. This is compatible with the Full Transfer Full Access Hypothesis, as the latter predicts that Danish speakers would have difficulty resetting their [+/- CP-recursion] parameter in English because they are not exposed to negative evidence.
Keywords: syntax, islands, second language acquisition, Danish
Procedia PDF Downloads 128
6452 Information and Communication Technology Application in the Face of COVID-19 Pandemic in Effective Service Delivery in Schools
Authors: Odigie Veronica
Abstract:
The paper focused on the application of Information and Communication Technology (ICT) in effective service delivery in view of the ongoing COVID-19 experience. It adopted the exploratory research method, with three research objectives: to ascertain the meaning of online education, to understand the concept of COVID-19, and to determine the relevance of online education to effective service delivery in institutions of learning. It is evident from the findings that, through ICT, an online mode of learning can be adopted in schools, which helps greatly in promoting continual education. Online education brings both the teacher and learners from different places together without any physical boundary or contact (at least 75%), and it has helped greatly in human development in countries where it has been practiced. It is also a welcome development owing to its many benefits, such as exposure to digital learning; access to the works of great teachers and educationists such as Socrates, Plato, Dewey, R. S. Peters, J. J. Rousseau, Nnamdi Azikiwe, Carol Gilligan, J. I. Omoregbe, Jane Roland Martin, and Jean Piaget, among others; and the facilitation of uninterrupted learning for class promotion and graduation of students. Developing learners all-round is part of the human development that helps in developing a nation. These and many more are some of the benefits online education offers, which make ICT very relevant in our contemporary society.
Keywords: online education, COVID-19 pandemic, effective service delivery, human development
Procedia PDF Downloads 102
6451 Proposal of a Rectenna Built by Using Paper as a Dielectric Substrate for Electromagnetic Energy Harvesting
Authors: Ursula D. C. Resende, Yan G. Santos, Lucas M. de O. Andrade
Abstract:
The recent and fast development of the internet, wireless and telecommunication technologies and low-power electronic devices has led to an expressive amount of electromagnetic energy available in the environment and to the expansion of smart applications technology. These applications have been used in Internet of Things devices and in 4G and 5G solutions. The main feature of this technology is the use of wireless sensors. Although these sensors are low-power loads, their use imposes huge challenges in terms of an efficient and reliable power supply that avoids the traditional battery. Radio-frequency-based energy harvesting technology is especially suitable for wirelessly powering sensors by using a rectenna, since it can be completely integrated into the structure of the distributed hosting sensors, reducing cost, maintenance and environmental impact. A rectenna is a device composed of an antenna and a rectifier circuit. The antenna's function is to collect as much radio frequency radiation as possible and transfer it to the rectifier, a nonlinear circuit that converts the very low input radio frequency energy into direct current voltage. In this work, a set of rectennas, mounted on a paper substrate, which can be used for the inner coating of buildings while simultaneously harvesting electromagnetic energy from the environment, is proposed. Each proposed individual rectenna is composed of a 2.45 GHz patch antenna and a voltage doubler rectifier circuit, built on the same paper substrate. The antenna contains a rectangular radiator element and a microstrip transmission line that was designed and optimized using the CST simulation software in order to obtain values of the S11 parameter below -10 dB at 2.45 GHz. In order to increase the amount of harvested power, eight individual rectennas, incorporating metamaterial cells, were connected in parallel, forming a system denominated Electromagnetic Wall (EW).
In order to evaluate the EW performance, it was positioned at a variable distance from an internet router and used to feed a 27 kΩ resistive load. The results obtained showed that if more than one rectenna is associated in parallel, a power level sufficient to feed very-low-consumption sensors can be achieved. The 0.12 m² EW proposed in this work was able to harvest 0.6 mW from the environment. It was also observed that the use of metamaterial structures provides an expressive growth in the amount of electromagnetic energy harvested, which was increased from 0.2 mW to 0.6 mW.
Keywords: electromagnetic energy harvesting, metamaterial, rectenna, rectifier circuit
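The power available to such a rectenna can be estimated with the free-space Friis transmission equation. The sketch below is illustrative only; the transmit power, antenna gains and distance are assumed values, not measurements from this work:

```python
import math

def friis_received_power(p_t, g_t, g_r, freq_hz, dist_m):
    """Free-space received power (W) from the Friis transmission equation."""
    wavelength = 3e8 / freq_hz
    return p_t * g_t * g_r * (wavelength / (4 * math.pi * dist_m)) ** 2

# Hypothetical numbers: a 100 mW (20 dBm) Wi-Fi router at 2.45 GHz,
# a unity-gain transmit antenna, a 6 dBi (about 4x) patch antenna, 1 m away.
p_r = friis_received_power(0.1, 1.0, 4.0, 2.45e9, 1.0)
print(f"{p_r * 1e6:.1f} uW available at the antenna")
```

Even at short range, the power captured by a single patch is tens of microwatts, which is why the abstract connects several rectennas in parallel before feeding a load.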
Procedia PDF Downloads 169
6450 Identifying and Ranking Environmental Risks of Oil and Gas Projects Using the VIKOR Method for Multi-Criteria Decision Making
Authors: Sasan Aryaee, Mahdi Ravanshadnia
Abstract:
Naturally, any activity is associated with risk; humans have understood this concept since very long ago and seek to identify its factors and sources. Poor risk management can cause problems such as delays and unforeseen costs in development projects, temporary or permanent loss of services, loss or theft of information, complexity and limitations in processes, unreliable information caused by rework, holes in the systems, and many other such problems. In the present study, a model has been presented to rank the environmental risks of oil and gas projects. The statistical population of the study consists of all executives active in the oil and gas fields, from which the statistical sample was selected randomly. In the framework of the proposed method, environmental risks of oil and gas projects were first extracted; then a questionnaire based on these indicators was designed on a Likert scale and distributed among the statistical sample. After assessing the validity and reliability of the questionnaire, the environmental risks of oil and gas projects were ranked using the VIKOR method of multiple-criteria decision-making. The results showed that the best options for HSE planning of oil and gas projects, namely those that most reduce risks, personal injury and casualties while costing the project less and adding less time to its implementation than the other options, are the release of dye into the environment when painting the generator pond and the presence of the rigger near the crane.
Keywords: ranking, multi-criteria decision making, oil and gas projects, HSE management, environmental risks
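For readers unfamiliar with the ranking step, a minimal sketch of the VIKOR procedure is given below. The decision matrix, criteria and weights are hypothetical illustrations, not the questionnaire data of this study:

```python
import numpy as np

def vikor(X, w, v=0.5):
    """Rank alternatives with the VIKOR method.

    X : (m, n) decision matrix, rows = alternatives, columns = criteria
        (all criteria assumed benefit-type: larger is better).
    w : (n,) criterion weights summing to 1.
    v : weight of the "majority rule" strategy (0.5 = consensus).
    Returns Q scores; the lowest Q is the best-ranked alternative.
    """
    X = np.asarray(X, dtype=float)
    f_best = X.max(axis=0)          # ideal value per criterion
    f_worst = X.min(axis=0)         # anti-ideal value per criterion
    # Weighted normalized distance of each alternative from the ideal
    d = w * (f_best - X) / (f_best - f_worst)
    S = d.sum(axis=1)               # group utility
    R = d.max(axis=1)               # individual regret
    Q = (v * (S - S.min()) / (S.max() - S.min())
         + (1 - v) * (R - R.min()) / (R.max() - R.min()))
    return Q

# Three hypothetical risk-mitigation options scored against three criteria
# (risk reduction, cost saving, schedule impact), equal weights.
X = [[7, 9, 6],
     [4, 5, 8],
     [9, 8, 9]]
Q = vikor(X, np.array([1 / 3, 1 / 3, 1 / 3]))
print(Q.argmin())  # index of the top-ranked alternative
```

In practice the matrix entries come from the aggregated Likert-scale questionnaire scores, and the final VIKOR ranking additionally checks the "acceptable advantage" and "acceptable stability" conditions, omitted here for brevity.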
Procedia PDF Downloads 159
6449 Reduce the Impact of Wildfires by Identifying Them Early from Space and Sending Location Directly to Closest First Responders
Authors: Gregory Sullivan
Abstract:
The evolution of global warming has escalated the number and complexity of forest fires around the world. As an example, the United States and Brazil combined generated more than 30,000 forest fires last year. The impact on our environment, structures and individuals is incalculable. The world has learned to try to take this in stride, trying multiple ways to contain fires. Some countries are trying to use cameras in limited areas. There are discussions of using hundreds of low-earth-orbit satellites, linking them together and interfacing them through ground networks. These are all truly noble attempts to defeat the forest fire phenomenon. But there is a better, simpler answer. A bigger piece of the solutions puzzle is to see the fires while they are very small, soon after initiation, and report their location (latitude and longitude) to local first responders. This is done by placing a sensor in geostationary orbit (GEO, about 22,000 miles above the earth). By placing this small satellite in GEO, we can “stare” at the earth and sense temperature changes. We do not “see” fires, but “measure” temperature changes. This has already been demonstrated on an experimental scale: fires were seen close to initiation, and the information was forwarded to first responders. The system was the first to identify the fires 7 out of 8 times. The goal is to have a small independent satellite in GEO orbit focused only on forest fire initiation. Thus, with one small satellite focused only on forest fire initiation, we hope to greatly decrease the impact on persons, property and the environment.
Keywords: space detection, wildfire early warning, demonstration wildfire detection and action from space, space detection to first responders
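The detection principle described, measuring temperature changes rather than imaging fires, can be sketched as a simple grid-differencing routine. The grid, threshold, and coordinate mapping below are assumptions for illustration, not the actual sensor processing chain:

```python
import numpy as np

def detect_hotspots(prev, curr, lat0, lon0, cell_deg, threshold=10.0):
    """Flag grid cells whose brightness temperature rose by more than
    `threshold` kelvin between two scans and return their coordinates.
    Grid cell (i, j) maps to (lat0 - i*cell_deg, lon0 + j*cell_deg)."""
    delta = np.asarray(curr, float) - np.asarray(prev, float)
    rows, cols = np.nonzero(delta > threshold)
    return [(lat0 - i * cell_deg, lon0 + j * cell_deg)
            for i, j in zip(rows, cols)]

# Toy 3x3 scene: one cell heats up by 25 K between consecutive scans.
prev = np.full((3, 3), 300.0)
curr = prev.copy()
curr[1, 2] += 25.0
coords = detect_hotspots(prev, curr, lat0=40.0, lon0=-120.0, cell_deg=0.01)
print(coords)
```

The returned latitude/longitude pairs are exactly what would be forwarded to the closest first responders.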
Procedia PDF Downloads 71
6448 A Multi-Scale Approach for the Analysis of Fiber-Reinforced Composites
Authors: Azeez Shaik, Amit Salvi, B. P. Gautham
Abstract:
Fiber-reinforced polymer resin composite materials are finding a wide variety of applications in the automotive and aerospace industries because of their high specific stiffness and specific strength when compared to metals. New classes of 2D and 3D textile and woven fabric composites offer excellent fracture toughness, as they bridge the cracks formed during fracture. Due to the complexity of their fiber architectures and the resulting composite microstructures, optimized design and analysis of these structures is very complicated. A traditional homogenization approach is typically used to analyze structures made up of these materials. This approach usually fails to predict damage initiation as well as damage propagation and the ultimate failure of structures made up of woven and textile composites. This study demonstrates a methodology to analyze woven and textile composites using a multi-level, multi-scale modelling approach. In this approach, a geometric repetitive unit cell (RUC) is developed with all its constituents to form a representative volume element (RVE), with all constituents and their interactions modeled correctly. The structure is modeled based on the RUC/RVE and analyzed at different length scales with the desired levels of fidelity, incorporating damage and failure. The results are passed across (up and down) the scales qualitatively as well as quantitatively from the perspective of material, configuration and architecture.
Keywords: cohesive zone, multi-scale modeling, rate dependency, RUC, woven textiles
Procedia PDF Downloads 363
6447 The Legal and Regulatory Gaps of Blockchain-Enabled Energy Prosumerism
Authors: Karisma Karisma, Pardis Moslemzadeh Tehrani
Abstract:
This study aims to conduct a high-level strategic dialogue on the lack of consensus, consistency, and legal certainty regarding blockchain-based energy prosumerism so that appropriate institutional and governance structures can be put in place to address the inadequacies and gaps in the legal and regulatory framework. The drive to achieve national and global decarbonization targets is a driving force behind climate goals and policies under the Paris Agreement. In recent years, efforts to ‘demonopolize’ and ‘decentralize’ energy generation and distribution have driven the energy transition toward decentralized systems, invoking concepts such as ownership, sovereignty, and autonomy of RE sources. The emergence of individual and collective forms of prosumerism and the rapid diffusion of blockchain is expected to play a critical role in the decarbonization and democratization of energy systems. However, there is a ‘regulatory void’ relating to individual and collective forms of prosumerism that could prevent the rapid deployment of blockchain systems and potentially stagnate the operationalization of blockchain-enabled energy sharing and trading activities. The application of broad and facile regulatory fixes may be insufficient to address the major regulatory gaps. First, to the authors’ best knowledge, the concepts and elements circumjacent to individual and collective forms of prosumerism have not been adequately described in the legal frameworks of many countries. Second, there is a lack of legal certainty regarding the creation and adaptation of business models in a highly regulated and centralized energy system, which inhibits the emergence of prosumer-driven niche markets. There are also current and prospective challenges relating to the legal status of blockchain-based platforms for facilitating energy transactions, anticipated with the diffusion of blockchain technology. 
With the rise of prosumerism in the energy sector, the areas of (a) network charges, (b) energy market access, (c) incentive schemes, (d) taxes and levies, and (e) licensing requirements are still uncharted territories in many countries. The uncertainties emanating from this area pose a significant hurdle to the widespread adoption of blockchain technology, a complementary technology that offers added value and competitive advantages for energy systems. The authors undertake a conceptual and theoretical investigation to elucidate the lack of consensus, consistency, and legal certainty in the study of blockchain-based prosumerism. In addition, the authors set an exploratory tone to the discussion by taking an analytically eclectic approach that builds on multiple sources and theories to delve deeper into this topic. As an interdisciplinary study, this research accounts for the convergence of regulation, technology, and the energy sector. The study primarily adopts desk research, which examines regulatory frameworks and conceptual models for crucial policies at the international level to foster an all-inclusive discussion. With their reflections and insights into the interaction of blockchain and prosumerism in the energy sector, the authors do not aim to develop definitive regulatory models or instrument designs, but to contribute to the theoretical dialogue to navigate seminal issues and explore different nuances and pathways. Given the emergence of blockchain-based energy prosumerism, identifying the challenges, gaps and fragmentation of governance regimes is key to facilitating global regulatory transitions.
Keywords: blockchain technology, energy sector, prosumer, legal and regulatory
Procedia PDF Downloads 181
6446 Hand Movements and the Effect of Using Smart Teaching Aids: Quality of Writing Styles Outcomes of Pupils with Dysgraphia
Authors: Sadeq Al Yaari, Muhammad Alkhunayn, Sajedah Al Yaari, Adham Al Yaari, Ayman Al Yaari, Montaha Al Yaari, Ayah Al Yaari, Fatehi Eissa
Abstract:
Dysgraphia is a neurological disorder of written expression that impairs writing ability and fine motor skills, resulting primarily in problems relating not only to handwriting but also to writing coherence and cohesion. We investigate the properties of smart writing technology to highlight some unique features of the effects it has on the academic performance of pupils with dysgraphia. In Amis, pupils with dysgraphia experience problems expressing their ideas in writing when ordinary writing aids are used as the default strategy. The Amis data suggest a possible connection between the available writing aids and pupils' writing improvement, and therefore the expression and comprehension of texts. A group of thirteen pupils with dysgraphia was placed in a regular primary school classroom, with twenty-one pupils recruited for the study as a control group. To ensure the validity, reliability and accountability of the research, both groups studied writing courses for two semesters, of which the first was equipped with smart writing aids while the second took place in an ordinary classroom. Two pre-tests were undertaken at the beginning of the first two semesters, and two post-tests were administered at the end of both semesters. The tests examined pupils' ability to write coherent, cohesive and expressive texts. The dysgraphic group, which received the treatment of a writing course with smart technology in the first semester, produced significantly greater increases in written expression than in an ordinary classroom, and its performance was better than that of the control group in the second semester. The current study concludes that using smart teaching aids is a 'MUST', both for teaching and learning in the context of dysgraphia. Furthermore, it is demonstrated that for young pupils with dysgraphia, expressive tasks are more challenging than coherence and cohesion tasks.
The study, therefore, supports the literature suggesting a role for smart educational aids in writing, and indicates that smart writing techniques may be an efficient addition to regular educational practices, notably in special educational institutions and speech-language therapy facilities. However, further research is needed on prompting adults with dysgraphia more often than is done with older adults without dysgraphia, in order to get them to complete the other productive and/or written-skills tasks.
Keywords: smart technology, writing aids, pupils with dysgraphia, hand movements
Procedia PDF Downloads 41
6445 A Unified Approach for Digital Forensics Analysis
Authors: Ali Alshumrani, Nathan Clarke, Bogdan Ghite, Stavros Shiaeles
Abstract:
Digital forensics has become an essential tool in the investigation of cyber and computer-assisted crime. Arguably, given the prevalence of technology and the digital footprints that consequently exist, it could have a significant role across almost all crimes. However, the variety of technology platforms (such as computers, mobiles, Closed-Circuit Television (CCTV), Internet of Things (IoT) devices, databases, drones, and cloud computing services), the heterogeneity and volume of data, forensic tool capability, and the investigative cost make investigations both technically challenging and prohibitively expensive. Forensic tools also tend to be siloed into specific technologies, e.g., File System Forensic Analysis Tools (FS-FAT) and Network Forensic Analysis Tools (N-FAT), and a good deal of data sources have little to no specialist forensic tools. Increasingly, it also becomes essential to compare and correlate evidence across data sources, and to do so in an efficient and effective manner that enables an investigator to answer high-level questions of the data in a timely manner without having to trawl through the data and perform the correlation manually. This paper proposes a Unified Forensic Analysis Tool (U-FAT), which aims to establish a common language for electronic information and permit multi-source forensic analysis. Core to this approach is the identification and development of forensic analyses that automate complex data correlations, enabling investigators to investigate cases more efficiently. The paper presents a systematic analysis of major crime categories and identifies which forensic analyses could be used.
For example, in a child abduction, an investigation team might have evidence from a range of sources, including computing devices (mobile phone, PC), CCTV (potentially a large number of cameras), ISP records, and mobile network cell tower data, in addition to third-party databases such as the National Sex Offender registry and tax records, with the desire to auto-correlate across sources and visualize the results in a cognitively effective manner. U-FAT provides a holistic, flexible, and extensible approach to digital forensics in a technology-, application-, and data-agnostic manner, providing powerful and automated forensic analysis.
Keywords: digital forensics, evidence correlation, heterogeneous data, forensic tools
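The cross-source auto-correlation described above can be illustrated with a toy timeline merge. The record format, sources, and five-minute grouping window are hypothetical illustrations; U-FAT's actual analyses are more sophisticated:

```python
from datetime import datetime, timedelta

def correlate(events, window=timedelta(minutes=5)):
    """Merge timestamped records from heterogeneous sources into one
    timeline and group events that fall within `window` of each other."""
    timeline = sorted(events, key=lambda e: e["time"])
    groups, current = [], [timeline[0]]
    for ev in timeline[1:]:
        if ev["time"] - current[-1]["time"] <= window:
            current.append(ev)      # same correlated cluster
        else:
            groups.append(current)  # gap too large: start a new cluster
            current = [ev]
    groups.append(current)
    return groups

events = [
    {"source": "CCTV", "time": datetime(2021, 5, 1, 14, 2), "detail": "vehicle sighted"},
    {"source": "cell", "time": datetime(2021, 5, 1, 14, 4), "detail": "phone pinged tower 17"},
    {"source": "ISP",  "time": datetime(2021, 5, 1, 18, 30), "detail": "login from home IP"},
]
groups = correlate(events)
print(len(groups))  # -> 2: the CCTV and cell events correlate; the ISP login stands alone
```

Real correlation engines would match on many more attributes (identifiers, locations, device fingerprints) rather than timestamps alone; the point is that a common record schema makes such joins automatic.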
Procedia PDF Downloads 198
6444 IoT-Based Process Model for Heart Monitoring Process
Authors: Dalyah Y. Al-Jamal, Maryam H. Eshtaiwi, Liyakathunisa Syed
Abstract:
Connecting health services with technology is in huge demand as people's health situations become worse day by day. In fact, engaging new technologies such as the Internet of Things (IoT) in medical services can enhance patient care. Specifically, patients suffering from chronic diseases, such as cardiac patients, need special care and monitoring. In reality, some efforts were previously made to automate and improve patient monitoring systems. However, the previous efforts have some limitations and lack the real-time feature needed for chronic diseases. In this paper, an improved process model for a patient monitoring system specialized for cardiac patients is presented. A survey was distributed and interviews were conducted to gather the requirements needed to improve the cardiac patient monitoring system. The Business Process Model and Notation (BPMN) language was used to model the proposed process. In fact, the proposed system uses IoT technology to assist doctors in remotely monitoring and following up with their heart patients in real time. In order to validate the effectiveness of the proposed solution, simulation analysis was performed using the Bizagi Modeler tool. The analysis results show performance improvements in the heart monitoring process. For the future, the authors suggest enhancing the proposed system to cover all chronic diseases.
Keywords: IoT, process model, remote patient monitoring system, smart watch
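The real-time monitoring rule at the core of such a process can be sketched as a simple threshold check on a stream of smartwatch readings. The thresholds and readings below are illustrative assumptions, not part of the proposed BPMN model:

```python
def check_reading(bpm, low=50, high=120):
    """Classify a single heart-rate reading; 'alert' would trigger an
    immediate notification to the doctor in the monitoring process."""
    if bpm < low or bpm > high:
        return "alert"
    return "normal"

def monitor(stream):
    """Yield (reading, status) pairs for a stream of smartwatch readings."""
    for bpm in stream:
        yield bpm, check_reading(bpm)

readings = [72, 80, 135, 64, 45]
statuses = [status for _, status in monitor(readings)]
print(statuses)  # -> ['normal', 'normal', 'alert', 'normal', 'alert']
```

In the modeled process, the "alert" branch corresponds to the BPMN path that notifies the physician in real time, while "normal" readings are simply logged.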
Procedia PDF Downloads 334
6443 Shared Versus Pooled Automated Vehicles: Exploring Behavioral Intentions Towards On-Demand Automated Vehicles
Authors: Samira Hamiditehrani
Abstract:
Automated vehicles (AVs) are emerging technologies that could offer a wide range of opportunities and challenges for the transportation sector. The advent of AV technology has also resulted in new business models in shared mobility services, where many ride-hailing and car-sharing companies are developing on-demand AVs, including shared automated vehicles (SAVs) and pooled automated vehicles (Pooled AVs). SAVs and Pooled AVs could provide alternative shared mobility services that encourage sustainable transport systems, mitigate traffic congestion, and reduce automobile dependency. However, the success of on-demand AVs in addressing major transportation policy issues depends on whether and how the public adopts them as regular travel modes. To identify the conditions under which individuals may adopt on-demand AVs, previous studies have applied human behavior and technology acceptance theories, among which the Theory of Planned Behavior (TPB) has been validated and is among the most tested in on-demand AV research. In this respect, this study has three objectives: (a) to propose and validate a theoretical model of behavioral intention to use SAVs and Pooled AVs by extending the original TPB model; (b) to identify the characteristics of early adopters of SAVs, who prefer a shorter and private ride, versus prospective users of Pooled AVs, who choose more affordable but longer and shared trips; and (c) to investigate Canadians' intentions to adopt on-demand AVs for regular trips. Toward this end, this study uses data from an online survey (n = 3,622) of workers and adult students (18 to 75 years old) conducted in October and November 2021 in six major Canadian metropolitan areas: Toronto, Vancouver, Ottawa, Montreal, Calgary, and Hamilton.
To accomplish the goals of this study, a base bivariate ordered probit model, in which both SAV and Pooled AV adoption are estimated as ordered dependent variables, is estimated alongside a full structural equation modeling (SEM) system. The findings of this study indicate that affective motivations, such as attitude towards AV technology, perceived privacy, and subjective norms, matter more than sociodemographic and travel behavior characteristics in the adoption of on-demand AVs. The results for the second objective also provide evidence that although a few affective motivations, such as subjective norms and having ample knowledge, are common between early adopters of SAVs and Pooled AVs, many of the examined motivations differ between SAV and Pooled AV adoption factors. In other words, the motivations influencing intention to use on-demand AVs differ by service type. Likewise, depending on the type of on-demand AV, the sociodemographic characteristics of early adopters differ significantly. In general, the findings paint a complex picture with respect to the application of constructs from common technology adoption models to the study of on-demand AVs. Findings from the final objective suggest that policymakers, planners, the vehicle and technology industries, and the public at large should moderate their expectations that on-demand AVs may suddenly transform the entire transportation sector. Instead, this study suggests that SAVs and Pooled AVs (when they enter the Canadian market) are likely to be adopted as supplementary mobility tools rather than substitutes for current travel modes.
Keywords: automated vehicles, Canadian perception, theory of planned behavior, on-demand AVs
Procedia PDF Downloads 74
6442 Finite-Sum Optimization: Adaptivity to Smoothness and Loopless Variance Reduction
Authors: Bastien Batardière, Joon Kwon
Abstract:
For finite-sum optimization, variance-reduced (VR) gradient methods compute at each iteration the gradient of a single function (or of a mini-batch), and yet achieve faster convergence than SGD thanks to a carefully crafted lower-variance stochastic gradient estimator that reuses past gradients. Another important line of research of the past decade in continuous optimization is adaptive algorithms such as AdaGrad, which dynamically adjust the (possibly coordinate-wise) learning rate based on past gradients and thereby adapt to the geometry of the objective function. Variants such as RMSprop and Adam demonstrate the outstanding practical performance that has contributed to the success of deep learning. In this work, we present AdaLVR, which combines the AdaGrad algorithm with loopless variance-reduced gradient estimators such as SAGA or L-SVRG, and which benefits from a straightforward construction and a streamlined analysis. We show that AdaLVR inherits both the good convergence properties of VR methods and the adaptive nature of AdaGrad: in the case of L-smooth convex functions, we establish a gradient complexity of O(n + (L + √(nL))/ε) without prior knowledge of L. Numerical experiments demonstrate the superiority of AdaLVR over state-of-the-art methods. Moreover, we empirically show that the RMSprop and Adam algorithms combined with variance-reduced gradient estimators achieve even faster convergence.
Keywords: convex optimization, variance reduction, adaptive algorithms, loopless
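The combination described, a loopless variance-reduced estimator inside an AdaGrad-style update, can be sketched on a toy least-squares problem. This is a schematic reading of the abstract, not the authors' reference implementation; the step size, snapshot probability, and problem data are assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

# Finite-sum least-squares problem: f(x) = (1/n) sum_i (a_i.x - b_i)^2 / 2
n, d = 200, 5
A = rng.standard_normal((n, d))
x_star = rng.standard_normal(d)
b = A @ x_star                         # noiseless, so x_star is the minimizer

def grad_i(x, i):                      # gradient of the i-th summand
    return (A[i] @ x - b[i]) * A[i]

def full_grad(x):
    return A.T @ (A @ x - b) / n

# Sketched AdaLVR-style iteration: L-SVRG estimator + AdaGrad step sizes.
x = np.zeros(d)
snapshot = x.copy()
mu = full_grad(snapshot)               # full gradient stored at the snapshot
G = np.zeros(d)                        # AdaGrad accumulator
eta, p, eps = 0.5, 0.05, 1e-8
for t in range(3000):
    i = rng.integers(n)
    g = grad_i(x, i) - grad_i(snapshot, i) + mu   # loopless VR estimator
    G += g * g
    x -= eta * g / np.sqrt(G + eps)    # coordinate-wise adaptive step
    if rng.random() < p:               # loopless snapshot update (no inner loop)
        snapshot = x.copy()
        mu = full_grad(snapshot)

print(np.linalg.norm(x - x_star))      # distance to the minimizer
```

The "loopless" part is the probabilistic snapshot refresh: instead of SVRG's fixed-length inner loop, the reference point is updated with probability p at each step, which is what makes the analysis streamlined.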
Procedia PDF Downloads 72
6441 An Energy Efficient Spectrum Shaping Scheme for Substrate Integrated Waveguides Based on Spread Reshaping Code
Authors: Yu Zhao, Rainer Gruenheid, Gerhard Bauch
Abstract:
In the microwave and millimeter-wave transmission region, the substrate-integrated waveguide (SIW) is a very promising candidate for the development of circuits and components. It facilitates transmission at data rates in excess of 200 Gbit/s. An SIW mimics a rectangular waveguide by approximating the closed sidewalls with a via fence. This structure suppresses the low-frequency components and makes the channel of the SIW a bandpass or high-pass filter. This channel characteristic impedes conventional baseband transmission using the non-return-to-zero (NRZ) pulse-shaping scheme. Therefore, mixers are commonly proposed as carrier modulators and demodulators in order to facilitate passband transmission. However, carrier modulation is not an energy-efficient solution, because modulation and demodulation at high frequencies consume a lot of energy. For the first time to our knowledge, this paper proposes a low-complexity spectrum shaping scheme for the SIW channel, namely the spread reshaping code. It aims at matching the spectrum of the transmit signal to the channel frequency response. It facilitates transmission through the SIW channel while avoiding carrier modulation. In some cases, it does not even need equalization. Simulations reveal a good performance of this scheme: as a result, eye opening is achieved without any equalization or modulation for the respective transmission channels.
Keywords: bandpass channel, eye opening, switching frequency, substrate-integrated waveguide, spectrum shaping scheme, spread reshaping code
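The spectrum-matching idea can be illustrated with a classic DC-free line code. The abstract does not specify the construction of the spread reshaping code, so the sketch below uses Manchester coding merely to show how reshaping the transmit spectrum removes the DC content that a bandpass or high-pass SIW channel would block:

```python
import numpy as np

rng = np.random.default_rng(1)
bits = rng.integers(0, 2, 4096)

# NRZ: one +/-1 symbol per bit. Manchester: each bit becomes a (+1, -1)
# or (-1, +1) pair, which forces the running sum, and hence the DC
# component, to exactly zero.
nrz = 2.0 * bits - 1.0
manchester = np.empty(2 * bits.size)
manchester[0::2] = 2.0 * bits - 1.0
manchester[1::2] = -(2.0 * bits - 1.0)

def dc_fraction(signal):
    """Fraction of the signal's spectral energy sitting in the DC bin."""
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    return spectrum[0] / spectrum.sum()

print(dc_fraction(nrz), dc_fraction(manchester))  # Manchester DC = 0
```

Random NRZ data carries energy near DC that a high-pass SIW channel rejects, distorting the eye; a code whose spectrum already matches the channel avoids that distortion without a mixer.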
Procedia PDF Downloads 162
6440 Using Genetic Algorithms to Outline Crop Rotations and a Cropping-System Model
Authors: Nicolae Bold, Daniel Nijloveanu
Abstract:
Cropping-system planning is a method used by farmers. It is an environmentally friendly method, protecting natural resources (soil, water, air, nutritive substances) while increasing production at the same time, taking into account particularities of the crops. Combining this powerful method with genetic algorithms makes it possible to generate sequences of crops that form a rotation. Algorithms of this type have been efficient at solving optimization problems, and their polynomial complexity allows them to be used for more difficult and varied problems. In our case, the optimization consists in finding the most profitable rotation of cultures. One of the expected results is to optimize the usage of resources in order to minimize costs and maximize profit. In order to achieve these goals, a genetic algorithm was designed. This algorithm finds several optimized candidate cropping systems that have the highest profit and thus minimize costs. The algorithm uses genetic-based methods (mutation, crossover) and structures (genes, chromosomes). A candidate cropping system is considered a chromosome, and a crop within the rotation is a gene within a chromosome. Results on the efficiency of this method will be presented in a dedicated section. The implementation of this method would benefit the activity of farmers by giving them hints and helping them to use resources efficiently.
Keywords: chromosomes, cropping, genetic algorithm, genes
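A minimal sketch of such a genetic algorithm is shown below. The crop list, profit table, and penalty for consecutive identical crops are hypothetical placeholders for the paper's actual fitness model:

```python
import random

random.seed(42)

CROPS = ["wheat", "maize", "soy", "clover"]        # hypothetical crop list
PROFIT = {"wheat": 5, "maize": 7, "soy": 6, "clover": 2}
LENGTH, POP, GENS = 6, 30, 80

def fitness(rotation):
    """Total profit, penalized when the same crop is grown twice in a row
    (a simple stand-in for agronomic rotation constraints)."""
    score = sum(PROFIT[c] for c in rotation)
    score -= sum(4 for a, b in zip(rotation, rotation[1:]) if a == b)
    return score

def crossover(a, b):
    cut = random.randrange(1, LENGTH)              # single-point crossover
    return a[:cut] + b[cut:]

def mutate(rotation, rate=0.1):
    return [random.choice(CROPS) if random.random() < rate else c
            for c in rotation]

# A chromosome is a rotation (list of crops); each crop is a gene.
population = [[random.choice(CROPS) for _ in range(LENGTH)]
              for _ in range(POP)]
for _ in range(GENS):
    population.sort(key=fitness, reverse=True)
    parents = population[:POP // 2]                # truncation selection
    children = [mutate(crossover(random.choice(parents),
                                 random.choice(parents)))
                for _ in range(POP - len(parents))]
    population = parents + children

best = max(population, key=fitness)
print(best, fitness(best))
```

Because the top half of each generation is carried over unchanged, the best rotation found never degrades; the surviving population is the set of "several optimized solutions" the abstract mentions.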
Procedia PDF Downloads 429
6439 Screening and Optimization of Pretreatments for Rice Straw and Their Utilization for Bioethanol Production Using Developed Yeast Strain
Authors: Ganesh Dattatraya Saratale, Min Kyu Oh
Abstract:
Rice straw is one of the most abundant lignocellulosic waste materials, with an annual world production of about 731 Mt. This study addresses the effective utilization of this waste biomass for biofuel production. We present a comparative assessment of numerous pretreatment strategies for rice straw, comprising the major physical, chemical and physicochemical methods. Among the different methods employed, alkaline pretreatment in combination with sodium chlorite/acetic acid delignification was found to be an efficient pretreatment, with significant improvement in the enzymatic digestibility of rice straw. A cellulase dose of 20 filter paper units (FPU) released a maximum of 63.21 g/L of reducing sugar, with a 94.45% hydrolysis yield and a 64.64% glucose yield from rice straw. The effects of the different pretreatment methods on biomass structure and complexity were investigated by FTIR, XRD and SEM analytical techniques. Finally, the enzymatic hydrolysate of rice straw was used for ethanol production with the developed Saccharomyces cerevisiae SR8 strain. The developed yeast strain enabled efficient fermentation of xylose and glucose and achieved higher ethanol production. The development of bioethanol production from lignocellulosic waste biomass is thus a generic, applicable methodology and has great implications for using 'green raw materials' and producing 'green products', much needed today.
Keywords: rice straw, pretreatment, enzymatic hydrolysis, FPU, Saccharomyces cerevisiae SR8, ethanol fermentation
Procedia PDF Downloads 540
6438 Power Iteration Clustering Based on Deflation Technique on Large Scale Graphs
Authors: Taysir Soliman
Abstract:
One of the currently popular clustering techniques is Spectral Clustering (SC), because of its advantages over conventional approaches such as hierarchical clustering and k-means. However, one of the disadvantages of SC is that it is time-consuming, because it requires computing eigenvectors. A number of attempts have been proposed to overcome this disadvantage, such as the Power Iteration Clustering (PIC) technique, a variant of SC. Some of PIC's advantages are: 1) its scalability and efficiency, 2) finding one pseudo-eigenvector instead of computing the eigenvectors, and 3) forming a linear combination of the eigenvectors in linear time. However, its worst disadvantage is the inter-class collision problem, because a single pseudo-eigenvector is not enough. Previous researchers developed Deflation-based Power Iteration Clustering (DPIC) to overcome the inter-class collision problem of PIC while retaining PIC's efficiency. In this paper, we developed Parallel DPIC (PDPIC) to improve the time and memory complexity; it runs on the Apache Spark framework using sparse matrices. To test the performance of PDPIC, we compared it to the SC, ESCG and ESCALG algorithms on four small and nine large graph benchmark datasets, where PDPIC achieved higher accuracy and shorter running time than the compared algorithms.
Keywords: spectral clustering, power iteration clustering, deflation-based power iteration clustering, Apache spark, large graph
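The early-stopped power iteration at the heart of PIC can be sketched in a few lines. The toy two-community affinity matrix below is an illustrative stand-in; the paper's deflation step and Spark parallelization are not shown:

```python
import numpy as np

def pic_pseudo_eigenvector(A, iters=30):
    """Minimal PIC sketch: iterate v <- W v on the row-normalized
    affinity matrix W = D^-1 A and stop early, so the iterate (a
    'pseudo-eigenvector') still carries cluster structure instead of
    collapsing to the trivial dominant eigenvector."""
    W = A / A.sum(axis=1, keepdims=True)
    v = A.sum(axis=1) / A.sum()          # degree-based initial vector
    for _ in range(iters):
        v = W @ v
        v /= np.abs(v).sum()             # renormalize each step
    return v

def block_affinity(n1=3, n2=3, w1=1.0, w2=0.8, eps=0.01):
    """Toy affinity: two communities with weak coupling between them."""
    n = n1 + n2
    A = np.full((n, n), eps)
    A[:n1, :n1] = w1
    A[n1:, n1:] = w2
    np.fill_diagonal(A, 1.0)
    return A

A = block_affinity()
v = pic_pseudo_eigenvector(A)
labels = (v > v.mean()).astype(int)      # split the 1-D embedding
print(labels)
```

Within each community the iterate equalizes quickly, while the gap between communities decays slowly (it is governed by the weak inter-community coupling), which is why stopping early exposes the clusters.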
Procedia PDF Downloads 192
6437 Mathematics as the Foundation for the STEM Disciplines: Different Pedagogical Strategies Addressed
Authors: Marion G. Ben-Jacob, David Wang
Abstract:
There is a mathematics requirement for entry-level college and university students, especially those who plan to study STEM (Science, Technology, Engineering and Mathematics). Most of them take College Algebra, and to continue their studies, they need to succeed in this course. Different pedagogical strategies are employed to promote the success of our students. There is, of course, the traditional method of teaching: lecture, examples, and problems for students to solve. The Emporium Model, another pedagogical approach, replaces traditional lectures with a learning resource center model featuring interactive software and on-demand personalized assistance. This presentation will compare these two pedagogical methods and report a study, with its results, on this comparison. Math is the foundation for science, technology, and engineering. It is generally used in STEM to find patterns in data. These patterns can be used to test relationships, draw general conclusions about data, and model the real world. In STEM, solutions to problems are analyzed, reasoned about, and interpreted using math abilities in an assortment of real-world scenarios. This presentation will examine specific examples of how math is used in the different STEM disciplines. Math becomes practical in science when it is used to model natural and artificial experiments to identify a problem and develop a solution for it. As we analyze data, we are using math to find the statistical correlation between a cause and its effect. Professionals who use math in this way include data scientists, biologists, and geologists. Without math, most technology would not be possible. Math is the basis of binary, and without programming, you just have the hardware. Addition, subtraction, multiplication, and division are also used in almost every program written. Mathematical algorithms are inherent in software as well.
Mechanical engineers analyze scientific data to design robots by applying math and using software. Electrical engineers use math to help design and test electrical equipment; they also use math when creating computer simulations and designing new products. Chemical engineers often use mathematics in the lab, where advanced computer software aids their research and production processes by modeling theoretical synthesis techniques and properties of chemical compounds. Mathematics mastery is crucial for success in the STEM disciplines. Pedagogical research on formative strategies and the necessary topics to be covered is essential.
Keywords: emporium model, mathematics, pedagogy, STEM
Procedia PDF Downloads 76
6436 GeoWeb at the Service of Household Waste Collection in Urban Areas
Authors: Abdessalam Hijab, Eric Henry, Hafida Boulekbache
Abstract:
The complexity of the city makes sustainable management of the urban environment more difficult. Managers are required to make significant human and technical investments, particularly in household waste collection (the focus of our research). The aim of this communication is to propose a collaborative geographic multi-actor device (MGCD) based on the link between information and communication technologies (ICT) and geo-web tools, in order to involve urban residents in household waste collection processes. Our method is based on a collaborative/motivational concept between the city and its residents. It is a geographic collaboration dedicated to the general public (citizens, residents, and any other participant), based on real-time allocation and geographic location of topological, geographic, and multimedia data in the form of local geo-alerts (location-specific problems) related to household waste in an urban environment. This contribution allows us to understand the extent to which residents can assist and contribute to the development of household waste collection processes for a better protected urban environment. It also gives a good idea of how residents can contribute to the data bank for future uses and will help transform the population into smart inhabitants, an essential component of a smart city. The proposed model will be tested in the Lamkansa sampling district of Casablanca, Morocco.
Keywords: information and communication technologies, ICTs, GeoWeb, geo-collaboration, city, inhabitant, waste, collection, environment
Procedia PDF Downloads 129
6435 Advancing Circular Economy Principles: Integrating AI Technology in Street Sanitation for Sustainable Urban Development
Authors: Xukai Fu
Abstract:
The concept of circular economy is interdisciplinary, intersecting environmental engineering, information technology, business, and social science domains. Over the course of its 15-year tenure in the sanitation industry, Jinkai has concentrated its efforts in the past five years on integrating artificial intelligence (AI) technology with street sanitation apparatus and systems. This endeavor has led to the development of various innovations, including the Intelligent Identification Sweeper Truck (Intelligent Waste Recognition and Energy-saving Control System), the Intelligent Identification Water Truck (Intelligent Flushing Control System), the intelligent food waste treatment machine, and the Intelligent City Road Sanitation Surveillance Platform. This study will commence with an examination of prevalent global challenges, elucidating how Jinkai effectively addresses each within the framework of circular economy principles. Utilizing a review and analysis of pertinent environmental management data, we will elucidate Jinkai's strategic approach. Following this, we will investigate how Jinkai utilizes the advantages of circular economy principles to guide the design of street sanitation machinery, with a focus on digitalization integration. Moreover, we will scrutinize Jinkai's sustainable practices throughout the invention and operation phases of street sanitation machinery, aligning with the triple bottom line theory. Finally, we will delve into the significance and enduring impact of corporate social responsibility (CSR) and environmental, social, and governance (ESG) initiatives. Special emphasis will be placed on Jinkai's contributions to community stakeholders, with a particular emphasis on human rights. Despite the widespread adoption of circular economy principles across various industries, achieving a harmonious equilibrium between environmental justice and social justice remains a formidable task. 
Jinkai acknowledges that the mere development of energy-saving technologies is insufficient for authentic circular economy implementation; rather, such technologies serve as instrumental tools. To earnestly promote and embody circular economy principles, companies must consistently prioritize the UN Sustainable Development Goals and adapt their technologies to address the evolving exigencies of our world.
Keywords: circular economy, core principles, benefits, the triple bottom line, CSR, ESG, social justice, human rights, Jinkai
Procedia PDF Downloads 50
6434 Technoscience in the Information Society
Authors: A. P. Moiseeva, Z. S. Zavyalova
Abstract:
This paper focuses on the Technoscience phenomenon and its role in modern society. It gives a review of the latest research on Technoscience. Based on the works of Paul Forman, Bernadette Bensaude-Vincent, Bruno Latour, Maria Caramez Carlotto and others, the authors consider the concept of Technoscience, its specific character and prospects of its development.Keywords: technoscience, information society, transdisciplinarity, European Technology Platforms
Procedia PDF Downloads 666
6433 An Integrated Architecture of E-Learning System to Digitize the Learning Method
Authors: M. Touhidul Islam Sarker, Mohammod Abul Kashem
Abstract:
The purpose of this paper is to improve the e-learning system and digitize the learning method in the educational sector. The learner logs into the e-learning platform, easily accesses the digital content, can download it, and takes an assessment for evaluation. Learners can access these digital resources using a tablet, computer, or smartphone. An e-learning system can be defined as teaching and learning with the help of multimedia technologies and the internet, through access to digital content. E-learning is replacing the traditional education system through information and communication technology-based learning. This paper has designed and implemented an integrated e-learning system architecture with a University Management System. Moodle (Modular Object-Oriented Dynamic Learning Environment) is a leading e-learning system, but its problem is that it includes no school or university management system. In this research, we have not considered school students, because they lack internet facilities; we considered university students, who have internet access and use these technologies. The University Management System covers different types of activities, such as student registration, account management, teacher information, semester registration, and staff information. If we integrate these activities or modules with Moodle, then we can overcome this limitation of Moodle, and it will enhance the e-learning system architecture, making effective use of technology. This architecture will let the learner easily access the resources of the e-learning platform anytime and anywhere, which digitizes the learning method.
Keywords: database, e-learning, LMS, Moodle
Procedia PDF Downloads 189
6432 Future Research on the Resilience of Tehran’s Urban Areas Against Pandemic Crises Horizon 2050
Authors: Farzaneh Sasanpour, Saeed Amini Varaki
Abstract:
Resilience is an important goal for cities, as urban areas face an increasing range of challenges in the 21st century. Given the characteristics of these risks, adopting an approach that responds to sensitive conditions in the risk management process is essential to the resilience of cities. Meanwhile, most resilience assessments have dealt with natural hazards, and less attention has been paid to pandemics. In the COVID-19 pandemic, Iran, and especially the metropolis of Tehran, was not immune from the crisis caused by its effects and consequences and faced many challenges. One of the methods that can increase the resilience of the Tehran metropolis against possible future crises is futures studies. This research is applied in type. The general pattern of the research is descriptive-analytical, and since it tries to relate the components of urban resilience to pandemic crises, provide resilience indicators, and explain scenarios, its futures-studies method is exploratory. In order to extract and determine the key factors and driving forces affecting the resilience of Tehran's urban areas against pandemic crises (COVID-19), the method of structural analysis of mutual effects and the MICMAC software were used. Based on this mutual effects analysis, the primary factors and variables affecting the resilience of Tehran's urban areas were grouped into 5 main factors, including physical-infrastructural (transportation, spatial and physical organization, streets and roads, multi-purpose development), with 39 variables. Finally, the key factors and variables were categorized into five main areas: managerial-institutional, with five variables; technology (intelligence), with 3 variables; economic, with 2 variables; socio-cultural, with 3 variables; and physical-infrastructural, with 7 variables.
These factors and variables have been used as the key factors and effective driving forces on the resilience of Tehran's urban areas against pandemic crises (COVID-19) in explaining and developing the scenarios. To develop the scenarios, intuitive logic, scenario planning as one of the futures research methods, and the Global Business Network (GBN) model were used. Finally, four scenarios were drawn and selected with a creative method using the metaphor of weather conditions, indicative of the general outline of the conditions of the Tehran metropolis in each situation: 1- the solar scenario (optimal governance and management, leader in smart technology); 2- the cloud scenario (optimal governance and management, follower in smart technology); 3- the dark scenario (unfavorable governance and management, leader in smart technology); 4- the storm scenario (unfavorable governance and management, follower in smart technology). The solar scenario shows the best situation and the storm scenario the worst situation for the Tehran metropolis. According to the findings of this research, in order to achieve a better tomorrow for the metropolis of Tehran, city managers can use futures research methods to form a coherent picture, with the long-term horizon of 2050, across all the factors and components of urban resilience against pandemic crises: providing a path for urban resilience, platforms for upgrading and increasing the capacity to deal with crises, and the conditions necessary for the realization, development and evolution of the urban areas of Tehran in a way that guarantees long-term balance and stability in all dimensions and at all levels.
Keywords: future research, resilience, crisis, pandemic, covid-19, Tehran
Procedia PDF Downloads 71
6431 Workforce Optimization: Fair Workload Balance and Near-Optimal Task Execution Order
Authors: Alvaro Javier Ortega
Abstract:
A large number of companies face the challenge of matching highly skilled professionals to high-end positions through human resource deployment professionals. However, when the list of professionals and tasks to be matched is larger than a few dozen, the result of this process is far from optimal and takes a long time to produce. Therefore, an automated assignment algorithm for this workforce management problem is needed. The majority of companies are divided into several sectors or departments, where trained employees with different experience levels deal with a large number of tasks daily. Also, the execution order of all tasks is of material consequence, because some tasks can only be run once the result of another task is available. Thus, a wrong execution order leads to long waiting times between consecutive tasks. The desired goal is, therefore, creating accurate matches and a near-optimal execution order that maximizes the number of tasks performed and minimizes the idle time of the expensive skilled employees. The problem described above can be modeled as a mixed-integer non-linear program (MINLP), as will be shown in detail in this paper. A large number of MINLP algorithms have been proposed in the literature. Here, genetic algorithm solutions are considered, and a comparison between two different mutation approaches is presented. The simulated results, considering different complexity levels of assignment decisions, show the appropriateness of the proposed model.
Keywords: employees, genetic algorithm, industry management, workforce
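The precedence-ordering half of the problem can be illustrated with a simple list-scheduling heuristic: a greedy baseline, not the paper's MINLP/genetic formulation. The tasks, dependencies, and durations below are made up:

```python
from collections import defaultdict, deque

def schedule(tasks, deps, durations, n_workers):
    """Run tasks in a precedence-respecting order (Kahn's algorithm),
    always giving the next ready task to the earliest-free worker, so
    that worker idle time stays low."""
    indeg = {t: 0 for t in tasks}
    children = defaultdict(list)
    for task, requires in deps.items():
        for r in requires:
            children[r].append(task)
            indeg[task] += 1
    ready = deque(t for t in tasks if indeg[t] == 0)
    finish = {}                         # task -> completion time
    workers = [0.0] * n_workers         # next free time per worker
    while ready:
        t = ready.popleft()
        i = min(range(n_workers), key=workers.__getitem__)
        # A task cannot start before all its prerequisites finish.
        earliest = max([finish[r] for r in deps.get(t, [])], default=0.0)
        start = max(workers[i], earliest)
        workers[i] = finish[t] = start + durations[t]
        for c in children[t]:
            indeg[c] -= 1
            if indeg[c] == 0:
                ready.append(c)
    return finish, max(finish.values())

tasks = ["A", "B", "C", "D"]
deps = {"C": ["A"], "D": ["A", "B"]}    # C needs A; D needs A and B
durations = {"A": 2.0, "B": 1.0, "C": 2.0, "D": 3.0}
finish, makespan = schedule(tasks, deps, durations, n_workers=2)
print(finish, makespan)
```

A genetic algorithm such as the one the abstract describes would search over assignments and orderings instead of committing to this greedy choice, but the same makespan evaluation can serve as its fitness function.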
Procedia PDF Downloads 169
6430 A Case Study of Clinicians’ Perceptions of Enterprise Content Management at Tygerberg Hospital
Authors: Temitope O. Tokosi
Abstract:
Healthcare is a human right. The sensitivity of health issues has necessitated the introduction of Enterprise Content Management (ECM) at district hospitals in the Western Cape Province of South Africa. The objective is to understand clinicians’ perception of ECM at their workplace. The study is a descriptive case study design within a constructivist paradigm. It employed a phenomenological data analysis method using a pattern-matching, deductive analytical procedure. Purposive and snowball sampling techniques were applied in selecting participants. Clinicians expressed concerns and frustrations in using ECM, such as: non-integration with other hospital systems; inadequate access points to ECM; incorrect labelling of notes and bar-coding, which wastes time in finding information; system features and/or functions (such as search and edit) that are not available; hospital management and clinicians not constantly interacting and discussing; and unacceptably lengthy information turnaround time. Resolving these problems would involve a positive working relationship between hospital management and clinicians. In addition, prioritising the problems faced by clinicians by relevance can ensure problem-solving that meets clinicians’ expectations and the hospital's objectives. Clinicians’ perception should invoke attention from hospital management with regard to technology use. The study’s results can be generalised across clinician groupings exposed to ECM at various district hospitals because of professional and hospital homogeneity.
Keywords: clinician, electronic content management, hospital, perception, technology
Procedia PDF Downloads 235
6429 Leadership Development of Professional Ethiopian Women in Science, Technology, Engineering, and Mathematics: Insights Gained through an Onsite Culturally Embedded Workshop
Authors: Araceli Martinez Ortiz, Gillian Bayne, Solomon Abraham
Abstract:
This paper describes research led by faculty from three American universities and four Ethiopian universities on the delivery of professional leadership development for early-career female Ethiopian university instructors in the Science, Technology, Engineering, and Mathematics (STEM) fields. The objective was to carry out a case study focused on the impact of an innovative intervention program designed to assist in the empowerment and leadership development of female instructors, related to teaching effectiveness, scholarly activity participation, and professional service participation. This research was conducted utilizing a case study methodology for the weeklong intervention and a survey to capture the voices of the leadership program participants. Data providing insights into the challenges and opportunities for women in these fields are presented. The project expands upon existing linkages between universities to support professional development and research efforts in this region of the world. Findings indicate the positive reception of this kind of professional development by the participating women. Survey data also reflect the particular cultural challenges professional women in STEM education face in Ethiopia, as well as the global challenge of balancing family expectations with career development.
Keywords: Ethiopian women, STEM leadership, professional development, gender equity
Procedia PDF Downloads 113
6428 Identification of Hepatocellular Carcinoma Using Supervised Learning Algorithms
Authors: Sagri Sharma
Abstract:
Analysis of diseases integrating multiple factors increases the complexity of the problem; therefore, the development of frameworks for the analysis of diseases is currently a topic of intense research. Due to the inter-dependence of the various parameters, the use of traditional methodologies has not been very effective, and newer methodologies are being sought. Supervised learning algorithms are commonly used for performing predictions on previously unseen data, in applications ranging from image analysis to protein structure and function prediction; they are trained on a known dataset to produce a predictor model that generates reasonable predictions for the response to new data. Gene expression profiles generated by DNA analysis experiments can be quite complex, since these experiments can involve hypotheses spanning entire genomes. The well-known machine learning algorithm Support Vector Machine is therefore applied to analyze the expression levels of thousands of genes simultaneously in a timely, automated and cost-effective way. The objectives of the presented work are the development of a methodology to identify genes relevant to Hepatocellular Carcinoma (HCC) from gene expression datasets utilizing supervised learning algorithms and statistical evaluations, along with the development of a predictive framework that can perform classification tasks on new, unseen data.
Keywords: artificial intelligence, biomarker, gene expression datasets, hepatocellular carcinoma, machine learning, supervised learning algorithms, support vector machine
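As a self-contained illustration of the supervised-learning setup (not the study's actual HCC data or pipeline), the sketch below trains a linear SVM by subgradient descent on the hinge loss over a synthetic "expression matrix" in which only the first three genes carry the class signal:

```python
import numpy as np

def train_linear_svm(X, y, lam=0.01, epochs=200, lr=0.1, seed=0):
    """Pegasos-style subgradient descent on the regularized hinge loss:
    a minimal linear SVM sketch (labels y must be in {-1, +1})."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(epochs):
        for i in rng.permutation(n):
            margin = y[i] * (X[i] @ w)
            # Hinge-loss subgradient plus L2 weight decay.
            grad = lam * w - (y[i] * X[i] if margin < 1 else 0)
            w -= lr * grad
    return w

# Synthetic stand-in for a gene expression matrix: 80 samples x 50
# "genes"; the class label depends only on the first 3 genes.
rng = np.random.default_rng(1)
X = rng.normal(size=(80, 50))
y = np.where(X[:, :3].sum(axis=1) > 0, 1, -1)

w = train_linear_svm(X, y)
acc = np.mean(np.sign(X @ w) == y)          # training accuracy
top = np.argsort(-np.abs(w))[:3]            # largest-weight "marker genes"
print(acc, sorted(int(i) for i in top))
```

For a linear decision function, ranking genes by the magnitude of their weights gives a simple biomarker screen: the informative genes should dominate the learned weight vector.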
Procedia PDF Downloads 430
6427 Advanced Combinatorial Method for Solving Complex Fault Trees
Authors: José de Jesús Rivero Oliva, Jesús Salomón Llanes, Manuel Perdomo Ojeda, Antonio Torres Valle
Abstract:
Combinatorial explosion is a problem common to both predominant methods for solving fault trees: the Minimal Cut Set (MCS) approach and the Binary Decision Diagram (BDD). High memory consumption impedes the complete solution of very complex fault trees. Only approximate, non-conservative solutions are possible in these cases, using truncation or other simplification techniques. The paper proposes a method (CSolv+) for solving complex fault trees without any possibility of combinatorial explosion. Each individual MCS is immediately discarded after its contribution to the basic events' importance measures and the Top gate Upper Bound Probability (TUBP) has been accounted for. An estimation of the Top gate Exact Probability (TEP) is also provided. Therefore, running in a computer cluster, CSolv+ will guarantee the complete solution of complex fault trees. It was successfully applied to 40 fault trees from the Aralia fault-tree database, performing the evaluation of the top gate probability, the 1000 most significant MCSs (SMCS), and the Fussell-Vesely, RRW and RAW importance measures for all basic events. The highly complex fault tree nus9601 was solved with truncation probabilities from 10⁻²¹ to 10⁻²⁷, just to limit the execution time. The solution corresponding to 10⁻²⁷ evaluated 3,530,592,796 MCSs in 3 hours and 15 minutes.
Keywords: system reliability analysis, probabilistic risk assessment, fault tree analysis, basic events importance measures
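The streaming upper-bound computation that allows each MCS to be discarded after its contribution is accounted for can be sketched as follows; the three-cut-set tree and probabilities are a toy example, not one of the Aralia models:

```python
from math import prod

def mcs_probability(mcs, p):
    """Probability of a minimal cut set = product of its basic-event
    probabilities (assuming independent basic events)."""
    return prod(p[e] for e in mcs)

def top_upper_bound(cut_sets, p):
    """Top gate Upper Bound Probability (TUBP), the standard MCS upper
    bound 1 - prod_i (1 - P(MCS_i)). Written as a streaming fold: each
    cut set's factor is multiplied in and the set can then be dropped,
    which is the memory-saving idea the CSolv+ method builds on."""
    q = 1.0
    for mcs in cut_sets:
        q *= 1.0 - mcs_probability(mcs, p)
    return 1.0 - q

def fussell_vesely(event, cut_sets, p):
    """Fussell-Vesely importance: fraction of the top probability
    contributed by cut sets containing the event (approximate)."""
    total = top_upper_bound(cut_sets, p)
    with_e = top_upper_bound([m for m in cut_sets if event in m], p)
    return with_e / total

# Toy tree: TOP = (A AND B) OR (A AND C) OR D -> MCSs {A,B}, {A,C}, {D}
p = {"A": 0.1, "B": 0.2, "C": 0.05, "D": 0.001}
cut_sets = [{"A", "B"}, {"A", "C"}, {"D"}]
print(top_upper_bound(cut_sets, p))
```

Here the bound is 1 − (1 − 0.02)(1 − 0.005)(1 − 0.001) ≈ 0.0259, and event A carries most of the importance because it appears in two of the three cut sets.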
Procedia PDF Downloads 47
6426 Artificial Intelligence for Generative Modelling
Authors: Shryas Bhurat, Aryan Vashistha, Sampreet Dinakar Nayak, Ayush Gupta
Abstract:
As technology advances toward high computational resources, there is a paradigm shift in the usage of these resources to optimize the design process. This paper discusses the usage of generative design using artificial intelligence to build better models that apply operations like selection, mutation, and crossover to generate results. The human mind thinks of the simplest approach while designing an object, but the intelligence learns from the past and designs complex, optimized CAD models. Generative design takes the boundary conditions and comes up with multiple solutions, iterating toward a sturdy design with the most optimal parameters, saving huge amounts of time and resources. The new production techniques at our disposal allow us to use additive manufacturing, 3D printing, and other innovative manufacturing techniques to save resources and design artistically engineered CAD models. This paper also discusses the genetic algorithm and the non-domination technique for choosing the right results, drawing on biomimicry, which has evolved for current habitation over millions of years. The computer uses parametric models to generate newer models with an iterative approach and uses cloud computing to store these iterative designs. The latter part of the paper compares generative design with the topology optimization technology previously used to generate CAD models. Finally, this paper shows the performance of the algorithms and how they help in designing resource-efficient models.
Keywords: genetic algorithm, bio mimicry, generative modeling, non-dominant techniques
Procedia PDF Downloads 152
6425 Entrepreneurial Leadership in Malaysian Public University: Competency and Behavior in the Face of Institutional Adversity
Authors: Noorlizawati Abd Rahim, Zainai Mohamed, Zaidatun Tasir, Astuty Amrin, Haliyana Khalid, Nina Diana Nawi
Abstract:
Entrepreneurial leaders have been sought as in-demand talents to lead profit-driven organizations during turbulent and unprecedented times. However, research regarding the pertinence of their roles in the public sector has been limited. This paper examined the characteristics of the challenging experiences encountered by senior leaders in public universities that require them to embrace entrepreneurialism in their leadership. Through a focus group interview with five Malaysian university top senior leaders with experience being Vice-Chancellor, we explored and developed a framework of institutional adversity characteristics and exemplary entrepreneurial leadership competency in the face of adversity. Complexity of diverse stakeholders, multiplicity of academic disciplines, unfamiliarity to lead different and broader roles, leading new directions, and creating change in high velocity and uncertain environment are among the dimensions that characterise institutional adversities. Our findings revealed that learning agility, opportunity recognition capacity, and bridging capability are among the characteristics of entrepreneurial university leaders. The findings reinforced that the presence of specific attributes in institutional adversity and experiences in overcoming those challenges may contribute to the development of entrepreneurial leadership capabilities.Keywords: bridging capability, entrepreneurial leadership, leadership development, learning agility, opportunity recognition, university leaders
Procedia PDF Downloads 112
6424 Multi-Objective Optimization in Carbon Abatement Technology Cycles (CAT) and Related Areas: Survey, Developments and Prospects
Authors: Hameed Rukayat Opeyemi, Pericles Pilidis, Pagone Emanuele
Abstract:
An infinitesimal increase in performance can yield an immense reduction in the operating and capital expenses of a power generation system. Therefore, studies are constantly being carried out to improve both conventional and novel power cycles. Globally, power producers are constantly researching ways to minimize emissions and to collectively downsize the total cost rate of power plants. A substantial spurt of developmental technologies for low-carbon cycles has been suggested and studied; however, they all have their limitations and financial implications. In the area of carbon abatement in power plants, three major objectives conflict: the cost rate of the plant, the power output, and the environmental impact, since an increase in one of these parameters directly affects the others. This poses a multi-objective problem. It is paramount to be able to discern the point where improving one objective affects another, hence the need for a Pareto-based optimization algorithm. A Pareto-based optimization algorithm finds those points where improving one objective begins to influence another objective negatively, and stops there; this helps the user, operator, or designer make an informed decision. This paper sheds more light on the areas in which multi-objective optimization has been applied to carbon abatement technologies in the last five years, along with developments and prospects.
Keywords: gas turbine, low carbon technology, pareto optimal, multi-objective optimization
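The non-domination test at the core of any Pareto-based method fits in a few lines. The plant-design objective vectors below are hypothetical numbers for illustration, with cost rate and emissions minimized and power output negated so it can be minimized too:

```python
def dominates(a, b):
    """a dominates b if a is no worse in every objective and strictly
    better in at least one (all objectives minimized here)."""
    return (all(x <= y for x, y in zip(a, b))
            and any(x < y for x, y in zip(a, b)))

def pareto_front(points):
    """Return the non-dominated subset of a list of objective vectors."""
    return [p for p in points if not any(dominates(q, p) for q in points)]

# Hypothetical designs scored as (cost rate, -power output, emissions).
designs = [
    (10.0, -50.0, 5.0),
    (12.0, -55.0, 4.0),   # more power and cleaner, but costlier
    (11.0, -48.0, 6.0),   # dominated: worse than the first on every axis
    ( 9.0, -40.0, 7.0),   # cheapest, but least power and dirtiest
]
print(pareto_front(designs))
```

Only the third design drops out: every other design involves a genuine trade-off, which is exactly the set of points a decision-maker must choose among.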
Procedia PDF Downloads 792