Search results for: mapping methodologies

1671 Advancing Spatial Mapping and Monitoring of Illegal Landfills for Deprived Urban Areas in Romania

Authors: Șercăianu Mihai, Aldea Mihaela, Iacoboaea Cristina, Luca Oana, Nenciu Ioana

Abstract:

The emergence and neutralization of illegal waste dumps are a global concern for waste management systems, with a particularly pronounced impact on disadvantaged communities. All over the world, and in Romania in particular, a significant number of people live in houses lacking any legal documentation, such as land ownership documents or building permits. These areas are referred to as “informal settlements”. An increasing number of regions and cities in Romania are struggling to manage their waste dumps, especially in the context of increasing poverty and the lack of regulation of informal settlements. An example of such an informal settlement can be found at the terminus of Bistra Street in Câlnic, which falls under the jurisdiction of the Municipality of Reșița in Caraș-Severin County. The article presents a case study that employs remote sensing techniques and spatial data to monitor and map illegal waste practices, with subsequent integration into a geographic information system tailored for the Reșița community. In addition, the paper outlines the steps involved in devising strategies aimed at enhancing waste management practices in disadvantaged areas, in line with the shift toward a circular economy. The results presented in the paper comprise a spatial mapping and visualization methodology, calibrated with in situ data collection, applicable to identifying illegal landfills. Such approaches, which prove effective where conventional solutions have failed, need to be replicated and adopted more widely.
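As a rough illustration of the kind of GIS integration described above, the sketch below overlays imagery-derived landfill candidates on informal-settlement boundaries with geopandas; the file names, the settlement_name attribute, and the choice of the Romanian national projection are assumptions, not details from the paper.

```python
# Minimal sketch (hypothetical file names): overlay remotely sensed landfill
# candidates with informal-settlement boundaries and report affected areas.
import geopandas as gpd

# Candidate landfill polygons digitized/classified from satellite imagery
candidates = gpd.read_file("landfill_candidates.geojson")
# Informal-settlement boundaries maintained by the municipality
settlements = gpd.read_file("informal_settlements.geojson")

# Work in a metric projection so areas come out in square metres
candidates = candidates.to_crs(epsg=3844)   # EPSG:3844, Romanian national grid (assumed)
settlements = settlements.to_crs(epsg=3844)

# Keep only candidates that intersect a settlement, tagging the settlement name
hits = gpd.overlay(candidates, settlements, how="intersection")
hits["area_m2"] = hits.geometry.area

print(hits.groupby("settlement_name")["area_m2"].sum().sort_values(ascending=False))
```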

Keywords: waste dumps, waste management, monitoring, GIS, informal settlements

Procedia PDF Downloads 44
1670 The Adoption of Leagility in Healthcare Services

Authors: Ana L. Martins, Luis Orfão

Abstract:

Healthcare systems have been subject to various research efforts aiming at process improvement under a lean approach. Another perspective, agility, has also been used, though on a smaller scale, to analyse the ability of different hospital services to adapt to demand uncertainties. Both perspectives have a common denominator: the improvement of the effectiveness and efficiency of services in a healthcare setting. Mixing the two approaches allows, on the one hand, streamlining of the processes and, on the other hand, the flexibility required to deal with demand uncertainty in terms of both volume and variety. The present research aims to analyse the impact of combining both perspectives on the effectiveness and efficiency of a hospital service. The adopted methodology is based on a case study of the ambulatory surgery process of Hospital de Lamego. Data were collected from direct observations, formal interviews, and informal conversations. The analysed process was selected according to three criteria: relevance of the process to the hospital, presence of human resources, and presence of waste. The customer of the process was identified, as well as their perception of value. The process was mapped using a flow chart, from a process modelling perspective, as well as through Value Stream Mapping (VSM) and Process Activity Mapping. The Spaghetti Diagram was also used to assess flow intensity. The use of the lean tools enabled the identification of three main types of waste: movement, resource inefficiencies, and process inefficiencies. From the use of the lean tools, improvement suggestions were produced. The results point out that leagility cannot be applied to the process, but the application of lean and agility in specific areas of the process would bring benefits in both efficiency and effectiveness and contribute to value creation if improvements are introduced in the hospital's human resources and facilities management.
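To make the lean quantification concrete, here is a minimal, purely illustrative Process Activity Mapping style calculation with invented activity times; it is not data from the Hospital de Lamego case.

```python
# Illustrative only: activities tagged value-adding (VA) or non-value-adding (NVA);
# process cycle efficiency is VA time over total lead time.
activities = [
    ("patient check-in",         5,  "NVA"),
    ("pre-operative assessment", 20, "VA"),
    ("waiting for theatre",      35, "NVA"),
    ("ambulatory surgery",       45, "VA"),
    ("recovery and discharge",   60, "VA"),
]

total = sum(t for _, t, _ in activities)
value_added = sum(t for _, t, tag in activities if tag == "VA")

print(f"lead time: {total} min, value-added: {value_added} min, "
      f"process cycle efficiency: {value_added / total:.0%}")
```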

Keywords: case study, healthcare systems, leagility, lean management

Procedia PDF Downloads 181
1669 Investigation of Different Machine Learning Algorithms in Large-Scale Land Cover Mapping within the Google Earth Engine

Authors: Amin Naboureh, Ainong Li, Jinhu Bian, Guangbin Lei, Hamid Ebrahimy

Abstract:

Large-scale land cover mapping has become a new challenge in the land change and remote sensing fields because it involves a large volume of data. Moreover, selecting the right classification method is quite difficult, especially when there are different types of landscapes in the study area. This paper compares the performance of different machine learning (ML) algorithms for generating a land cover map of the China-Central Asia-West Asia Corridor, which is considered one of the main parts of the Belt and Road Initiative (BRI). The cloud-based Google Earth Engine (GEE) platform was used to generate a land cover map of the study area from Landsat-8 images (2017) by applying three frequently used ML algorithms: random forest (RF), support vector machine (SVM), and artificial neural network (ANN). The selected ML algorithms (RF, SVM, and ANN) were trained and tested using reference data obtained from the MODIS yearly land cover product and very high-resolution satellite images. The findings of the study showed that, among the three algorithms, RF, with 91% overall accuracy, produced the best land cover map for the China-Central Asia-West Asia Corridor, whereas ANN showed the worst result, with 85% overall accuracy. The strong performance of GEE in applying different ML algorithms and handling huge volumes of remotely sensed data in the present study shows that it could also help researchers generate reliable long-term land cover change maps. The findings of this research are of great importance for decision-makers and the BRI's authorities in strategic land use planning.
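The paper runs the comparison inside Google Earth Engine; the sketch below reproduces the same RF / SVM / ANN comparison locally with scikit-learn on an exported table of reference samples. The CSV name, band columns, and hyperparameters are assumptions.

```python
# A minimal, outside-GEE sketch of the RF / SVM / ANN comparison using
# scikit-learn on a hypothetical CSV: one row per reference pixel, with
# Landsat-8 band values and a 'class' label.
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.ensemble import RandomForestClassifier
from sklearn.svm import SVC
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import accuracy_score

samples = pd.read_csv("landsat8_training_samples.csv")
X = samples[["B2", "B3", "B4", "B5", "B6", "B7"]]  # optical bands
y = samples["class"]

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, stratify=y, random_state=0)

models = {
    "RF": RandomForestClassifier(n_estimators=200, random_state=0),
    "SVM": SVC(kernel="rbf", C=10, gamma="scale"),
    "ANN": MLPClassifier(hidden_layer_sizes=(64, 32), max_iter=1000, random_state=0),
}
for name, model in models.items():
    model.fit(X_train, y_train)
    print(name, "overall accuracy:", round(accuracy_score(y_test, model.predict(X_test)), 3))
```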

Keywords: land cover, google earth engine, machine learning, remote sensing

Procedia PDF Downloads 96
1668 Landcover Mapping Using Lidar Data and Aerial Image and Soil Fertility Degradation Assessment for Rice Production Area in Quezon, Nueva Ecija, Philippines

Authors: Eliza E. Camaso, Guiller B. Damian, Miguelito F. Isip, Ronaldo T. Alberto

Abstract:

Land-cover maps are important for many scientific, ecological, and land management purposes, and during the last decades a rapid decrease in soil fertility has been observed, attributed to land use practices such as rice cultivation. High-precision land-cover maps are not yet available for the area, although they are important for economic management. To assure accurate land-cover mapping, remote sensing is a very suitable tool, enabling automatic land use and land cover detection. The study not only provided high-precision land-cover maps but also produced estimates of the rice production area that had undergone chemical degradation due to fertility decline. Land cover was delineated and classified into pre-defined classes to achieve proper detection of features. After generating the land-cover map, which showed a high intensity of rice cultivation, a soil fertility degradation assessment of the rice production area was carried out to assess the impact of soils used in agricultural production. Using simple spatial analysis functions in ArcGIS, the land-cover map of the Municipality of Quezon in Nueva Ecija, Philippines was overlaid on the fertility decline maps from the Land Degradation Assessment Philippines-Bureau of Soils and Water Management (LADA-Philippines-BSWM) to determine the rice-cropped areas where nitrogen, phosphorus, zinc, and sulfur deficiencies were most likely induced by high dosages of urea and imbalanced N:P fertilization. The results showed that 80.00% of the fallow area and 99.81% of the rice production area exhibit high soil fertility decline.
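A hedged sketch of the overlay step, done with rasterio instead of ArcGIS: cross-tabulating a land-cover raster with a fertility-decline raster to estimate how much of each class falls in the high-decline zone. The file names and class codes are invented.

```python
# Hedged sketch: cross-tabulate a land-cover raster with a fertility-decline
# raster; both rasters are assumed co-registered (same grid, same metric CRS).
import numpy as np
import rasterio

with rasterio.open("quezon_landcover.tif") as lc, rasterio.open("fertility_decline.tif") as fd:
    landcover = lc.read(1)
    decline = fd.read(1)
    pixel_area_ha = abs(lc.transform.a * lc.transform.e) / 10_000.0  # m^2 -> ha

RICE, FALLOW, HIGH_DECLINE = 1, 2, 3   # assumed raster class codes

for name, code in [("rice", RICE), ("fallow", FALLOW)]:
    total = np.count_nonzero(landcover == code)
    degraded = np.count_nonzero((landcover == code) & (decline == HIGH_DECLINE))
    print(f"{name}: {total * pixel_area_ha:.1f} ha, "
          f"{100.0 * degraded / max(total, 1):.2f}% in high fertility decline")
```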

Keywords: aerial image, landcover, LiDAR, soil fertility degradation

Procedia PDF Downloads 236
1667 (Re)connecting to the Spirit of the Language: Decolonizing from Eurocentric Indigenous Language Revitalization Methodologies

Authors: Lana Whiskeyjack, Kyle Napier

Abstract:

The spirit of the language embodies the motivation for Indigenous people to connect with the Indigenous language of their lineage. While the concept of the spirit of the language is often woven into discussion by Indigenous language revitalizationists, particularly those who are Indigenous, there are few tangible terms in academic research conceptually actualizing the term. Through collaborative work with Indigenous language speakers, elders, and learners, this research sets out to identify the spirit of the language, the catalysts of disconnection from the spirit of the language, and the sources of reconnection to the spirit of the language. This work fundamentally addresses the terms of engagement around collaboration with Indigenous communities, itself inviting a decolonial approach to community outreach and individual relationships. As Indigenous researchers, this means beginning, maintaining, and closing this work in ceremony while being transparent with community members about this work and the related publishing throughout the project's duration. Decolonizing this approach also requires maintaining explicit, ongoing consent from the elders, knowledge keepers, and community members when handling their ancestral and Indigenous knowledge. The handling of this knowledge is regarded in this work as stewardship, both of the digital materials and of the ancestral Indigenous knowledge itself. This work observes recorded conversations in both nêhiyawêwin and English, resulting from 10 semi-structured interviews with fluent nêhiyawêwin speakers as well as three structured dialogue circles with fluent and emerging speakers. The words were transcribed by a speaker fluent in both nêhiyawêwin and English. The results of those interviews were categorized thematically to conceptually actualize the spirit of the language, the catalysts of disconnection from the spirit of the language, and community-voiced methods of reconnection to the spirit of the language. The interviews overwhelmingly indicate that the spirit of the language is drawn from the land. Although nêhiyawêwin is the focus of this work, Indigenous languages are by nature inherently related to the land. This is further reaffirmed by the Indigenous language learners and speakers who expressed having ancestries and lineages from multiple Indigenous communities. Several other key elements embody this spirit of the language, including ceremony and spirituality, as well as the semantic worldviews tied to the polysynthetic, verb-oriented morphophonemics most often found in Indigenous languages, and in nêhiyawêwin in particular. The catalysts of disconnection from the spirit of the language are those whose histories have severed connections between Indigenous Peoples and the spirit of their languages, or those that have affected relationships with the land, ceremony, and ways of thinking. The results of this research and its literature review identify the three most ubiquitously damaging, interdependent catalysts of disconnection from the spirit of the language as colonization, capitalism, and Christianity. As voiced by the Indigenous language learners, this work necessitates addressing means of reconnecting to the spirit of the language. Interviewees mentioned that the process of reconnection involves a whole relationship with the land, the practice of reciprocal-relational methodologies for language learning, and Indigenous-protected and -governed learning. This work concludes in support of those reconnection methodologies.

Keywords: indigenous language acquisition, indigenous language reclamation, indigenous language revitalization, nêhiyawêwin, spirit of the language

Procedia PDF Downloads 125
1666 Comparison of Support Vector Machines and Artificial Neural Network Classifiers in Characterizing Threatened Tree Species Using Eight Bands of WorldView-2 Imagery in Dukuduku Landscape, South Africa

Authors: Galal Omer, Onisimo Mutanga, Elfatih M. Abdel-Rahman, Elhadi Adam

Abstract:

Threatened tree species (TTS) play a significant role in ecosystem functioning and services, land use dynamics, and other socio-economic aspects. Such aspects include ecological, economic, livelihood, security-based, and well-being benefits. The development of techniques for mapping and monitoring TTS is thus critical for understanding the functioning of ecosystems. The advent of advanced imaging systems and supervised learning algorithms has provided an opportunity to classify TTS over fragmenting landscapes. Recently, vegetation maps have been produced using advanced imaging systems such as WorldView-2 (WV-2) and robust classification algorithms such as support vector machines (SVM) and artificial neural networks (ANN). However, delineation of TTS in a fragmenting landscape using high-resolution imagery has largely remained elusive due to the complexity of the species structure and its distribution. Therefore, the objective of the current study was to examine the utility of the advanced WV-2 data for mapping TTS in the fragmenting Dukuduku indigenous forest of South Africa using the SVM and ANN classification algorithms. The results showed the robustness of the two machine learning algorithms, with an overall accuracy (OA) of 77.00% (total disagreement = 23.00%) for SVM and 75.00% (total disagreement = 25.00%) for ANN using all eight bands of WV-2 (8B). This study concludes that the SVM and ANN classification algorithms with WV-2 8B have the potential to classify TTS in the Dukuduku indigenous forest. This study offers relatively accurate information that is important for forest managers to make informed decisions regarding management and conservation protocols for TTS.
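As a small aside on the accuracy figures quoted above (77% OA corresponds to 23% total disagreement), the snippet below shows the arithmetic on an invented error matrix.

```python
# Overall accuracy and total disagreement from an error matrix
# (rows = reference classes, columns = mapped classes; values invented).
import numpy as np

error_matrix = np.array([
    [50,  6,  4],
    [ 7, 45,  8],
    [ 3,  5, 42],
])

overall_accuracy = np.trace(error_matrix) / error_matrix.sum()
total_disagreement = 1.0 - overall_accuracy   # total disagreement = 1 - overall agreement

print(f"OA = {overall_accuracy:.2%}, total disagreement = {total_disagreement:.2%}")
```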

Keywords: artificial neural network, threatened tree species, indigenous forest, support vector machines

Procedia PDF Downloads 491
1665 Buy-and-Hold versus Alternative Strategies: A Comparison of Market-Timing Techniques

Authors: Jonathan J. Burson

Abstract:

With the rise of virtually costless, mobile-based trading platforms, stock market trading activity has increased significantly over the past decade, particularly for the millennial generation. This increased stock market attention, combined with the recent market turmoil due to the economic upset caused by COVID-19, makes the topics of market-timing and forecasting particularly relevant. While the overall stock market saw an unprecedented, historically long bull market from March 2009 to February 2020, the end of that bull market reignited a search by investors for a way to reduce risk and increase return. Similar searches for outperformance occurred in the early and late 2000s, as the Dotcom bubble burst and the Great Recession led to years of negative returns for mean-variance index investors. Extensive research has been conducted on fundamental analysis, technical analysis, macroeconomic indicators, microeconomic indicators, and other techniques, all using different methodologies and investment periods, in pursuit of higher returns with lower risk. The enormous variety of timeframes, data, and methodologies used by the diverse forecasting methods makes it difficult to compare the outcome of each method directly with the others. This paper establishes a process to evaluate market-timing methods in an apples-to-apples manner based on simplicity, performance, and feasibility. Preliminary findings show that certain technical analysis models provide a higher return with lower risk when compared to the buy-and-hold method and to other market-timing strategies. Furthermore, technical analysis models tend to be easier for individual investors, both in terms of acquiring the data and in analyzing it, making technical analysis-based market-timing methods the preferred choice for retail investors.
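For a flavour of the apples-to-apples comparison described here, the sketch below pits buy-and-hold against one simple technical rule (a 200-day moving-average filter) on a hypothetical price CSV; it is a generic illustration, not the paper's models or data.

```python
# Generic illustration: compare buy-and-hold with a 200-day moving-average
# timing rule on a price series from a hypothetical CSV with 'date' and 'close'.
import pandas as pd

prices = pd.read_csv("index_prices.csv", parse_dates=["date"], index_col="date")["close"]
returns = prices.pct_change().fillna(0.0)

# In the market only when yesterday's close is above its 200-day moving average
signal = (prices > prices.rolling(200).mean()).shift(1, fill_value=False)

buy_and_hold = (1 + returns).prod() - 1
timed = (1 + returns.where(signal, 0.0)).prod() - 1
print(f"buy-and-hold: {buy_and_hold:.1%}, 200-day MA rule: {timed:.1%}")
```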

Keywords: buy-and-hold, forecast, market-timing, probit, technical analysis

Procedia PDF Downloads 77
1664 The Status of Precision Agricultural Technology Adoption on Row Crop Farms vs. Specialty Crop Farms

Authors: Shirin Ghatrehsamani

Abstract:

Higher efficiency and lower environmental impact are the consequences of using advanced technology in farming. They also help to decrease yield variability by diminishing the impact of weather variability, optimizing nutrient and pest management, and reducing competition from weeds. A better understanding of the pros and cons of applying technology, and of the main reasons preventing its utilization, has a significant impact on developing technology adoption among farmers and producers in the digital agriculture era. The results from two surveys carried out in 2019 and 2021 were used to investigate whether crop type had an impact on the willingness to utilize technology on farms. The main focus of the questionnaire was on the utilization of precision agriculture (PA) technologies among farmers in some parts of the United States. The collected data were analyzed to determine the practical application of various technologies. The survey results showed similarities between the two crop types in the main reasons for not using PA, but the current level of technology use in specialty crops is generally five times higher than in row crops. GPS receiver applications were reported to be similar for both types of crops. Lack of knowledge and the high cost of data handling were cited as the main problems. The most significant difference was in the use of variable rate technology, which was 43% for specialty crops while it was reported as 0% for row crops. Pest scouting and mapping were commonly used for specialty crops, while they were rarely applied for row crops. Survey respondents found yield mapping, soil sampling maps, and irrigation scheduling more valuable for specialty crops than for row crops in management decisions. About 50% of the respondents would like to share PA data for both types of crops. Almost 50% of respondents got their PA information from retailers in both categories, and, as the second source, extension agents were more commonly used by specialty crop growers than by row crop growers.

Keywords: precision agriculture, smart farming, digital agriculture, technology adoption

Procedia PDF Downloads 89
1663 Ancient Cities of Deltaic Bengal: Origin and Nature on the Riverine Bed of Ganges Valley

Authors: Sajid Bin Doza

Abstract:

A town or a city contributes a lot to humankind. A city evokes memory, ambition, frustration, and achievement. A city offers life, as its character shows, and holds a distinct image for its inhabitants. Time, place, and matter generate this vibe; the city celebrates with its inhabitants, who belong to it and care for each other. Although cities and settlements are continuously changing phenomena, the origin of the city in this delta land started with unique and strategic sequences. Religious belief, topography, availability of resources, and connection with commercial hubs shaped the potential of a settlement. The ancient cities of Bengal are no exception to these phenomena. From time immemorial, Bengal has been enriched with numerous cities and notable settlements. These cities and settlements were connected with other inland ports, and Bengal became an important trade route, traced by riverine connections. The deltaic landform is valued for its geographic situation, and as a consequence of this position, a new story or a new conception of the origin of an ancient city can be found. The objective of this research is to understand the origin and spirit of the ancient cities of Bengal; the research also tries to unfold the authentic and rational meaning of the soul of the city and to elaborate the soul of the ancient sites of the riverine delta. As rivers are the common feature of this landform, river-supported communities emerged as well. The river gives people wealth and sometimes plunges them into sorrow. The river provides commerce and trading, and it gives faith and religion. All these potentials have evolved from the riverine setting. The research therefore takes a thorough approach to justify riverine value as the soul of the ancient cities of Bengal. Cartographic information and illustration are the preferred language of this research; in particular, historic mapping forms the unique folio of this study.

Keywords: memory of the city, riverine network, ancient cities, cartographic mapping, settlement pattern

Procedia PDF Downloads 272
1662 A Method for Multimedia User Interface Design for Mobile Learning

Authors: Shimaa Nagro, Russell Campion

Abstract:

Mobile devices are becoming ever more widely available, with growing functionality, and are increasingly used as an enabling technology to give students access to educational material anytime and anywhere. However, the design of educational material user interfaces for mobile devices is beset by many unresolved research issues, such as those arising from emphasising the information concepts and then mapping this information to appropriate media (modelling information, then mapping media effectively). This report describes a multimedia user interface design method for mobile learning. The method covers specification of user requirements and information architecture, media selection to represent the information content, design for directing attention to important information, and interaction design to enhance user engagement, based on Human-Computer Interaction (HCI) design strategies. The method will be evaluated through three different case studies to prove that it is suitable for application to different areas: an application to teach major computer networking concepts, an application to deliver a history-based topic (after these two case studies have been completed, the method will be revised to remove deficiencies and then used to develop the third case study), and an application to teach mathematical principles. At that point, the method will again be revised into its final format. A usability evaluation will be carried out to measure the usefulness and effectiveness of the method. The investigation will combine qualitative and quantitative methods, including interviews and questionnaires for data collection and the three case studies for validating the method. The researcher has successfully produced the method, which is now undergoing validation and testing. From this point forward in the report, the researcher refers to the method using the abbreviation MDMLM, which stands for Multimedia Design Mobile Learning Method.

Keywords: human-computer interaction, interface design, mobile learning, education

Procedia PDF Downloads 222
1661 Adult Language Learning in the Institute of Technology Sector in the Republic of Ireland

Authors: Una Carthy

Abstract:

A recent study of third level institutions in Ireland reveals that both age and aptitude can be overcome by teaching methodologies that motivate second language learners. This PhD investigation gathered quantitative and qualitative data from 14 Institutes of Technology over a three-year period from 2011 to 2014. The fundamental research question was to establish the impact of institutional language policy on attitudes towards language learning. However, other related issues around second language acquisition arose in the course of the investigation. Data were collected from both lecturers and students, allowing interesting points of comparison to emerge from both datasets. Negative perceptions among lecturers regarding language provision were often associated with the view that language learning belongs at primary and secondary level and has no place in third level education. This perception was offset by substantial data showing positive attitudes towards adult language learning. Lenneberg's Critical Age Theory postulated that the optimum age for learning a second language is before puberty. More recently, scholars have challenged this theory in their studies, revealing that mature learners can and do succeed at learning languages. With regard to aptitude, a preoccupation among lecturers regarding poor literacy skills among students emerged and was often associated with resistance to second language acquisition. This was offset by a preponderance of qualitative data from students highlighting the crucial role that teaching approaches play in the learning process. Interestingly, the data collected regarding learning disabilities reveal that, given appropriate learning environments, individuals can be motivated to acquire second languages, and indeed succeed at learning them. These findings are in keeping with other recent studies regarding attitudes towards second language learning among students with learning disabilities. Both sets of findings reinforce the case for language policies in the Institutes of Technology (IoTs). Supportive and positive learning environments can be created in third level institutions to motivate adult learners, thereby overcoming perceived obstacles relating to age and aptitude.

Keywords: age, aptitude, second language acquisition, teaching methodologies

Procedia PDF Downloads 102
1660 Evaluation of Coupled CFD-FEA Simulation for Fire Determination

Authors: Daniel Martin Fellows, Sean P. Walton, Jennifer Thompson, Oubay Hassan, Ella Quigley, Kevin Tinkham

Abstract:

Fire performance is a crucial aspect to consider when designing cladding products, and testing this performance is extremely expensive. Appropriate use of numerical simulation of fire performance has the potential to reduce the total number of fire tests required when designing a product by eliminating poor-performing design ideas early in the design phase. Due to the complexity of fire and the large spectrum of failures it can cause, multi-disciplinary models are needed to capture the complex fire behavior and its structural effects on its surroundings. Working alongside Tata Steel U.K., the authors have focused on completing a coupled CFD-FEA simulation model suited to testing polyisocyanurate (PIR) based sandwich panel products, to gain confidence before costly experimental standards testing. The sandwich panels are part of a thermally insulating façade system intended primarily for large non-domestic buildings. The work presented in this paper compares two coupling methodologies applied to a replicated physical standards test, LPS 1181-1, carried out by Tata Steel U.K. The two coupling methodologies considered within this research are one-way and two-way. A one-way coupled analysis consists of importing thermal data from the CFD solver into the FEA solver. A two-way coupled analysis consists of continuously importing the updated changes in thermal data, due to the fire's behavior, into the FEA solver throughout the simulation. Likewise, the mechanical changes are also passed back to the CFD solver to include geometric changes in the solution. For the CFD calculations, a solver called Fire Dynamics Simulator (FDS) has been chosen due to its numerical scheme, adapted to focus solely on fire problems. Validation of FDS applicability has been achieved in past benchmark cases. In addition, an FEA solver called ABAQUS has been chosen to model the structural response to the fire, due to its crushable foam plasticity model, which can accurately model the compressibility of PIR foam. An open-source code called FDS-2-ABAQUS is used to couple the two solvers, using several Python modules to complete the process, including failure checks. The coupling methodologies and experimental data acquired from Tata Steel U.K. are compared using several variables. The comparison data include gas temperatures, surface temperatures, and mechanical deformation of the panels. Conclusions are drawn, noting improvements to be made to the current open-source coupling code FDS-2-ABAQUS to make it more applicable to Tata Steel U.K. sandwich panel products. Future directions for reducing the computational cost of the simulation are also considered.
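A minimal sketch of the one-way direction only, and not the FDS-2-ABAQUS tool itself: it reads temperatures logged by FDS devices (the CHID_devc.csv output) and writes them as an ABAQUS *AMPLITUDE table that a thermal step could reference. The file names, device column, and two-header-row CSV layout are assumptions.

```python
# Hedged sketch of one-way CFD -> FEA data transfer (assumed file formats).
import csv

def fds_devc_to_abaqus_amplitude(devc_csv, device_column, out_inp, amplitude_name="FIRE_TEMP"):
    with open(devc_csv, newline="") as f:
        rows = list(csv.reader(f))
    header = [h.strip() for h in rows[1]]   # assume row 0 = units, row 1 = device names
    col = header.index(device_column)

    times, temps = [], []
    for row in rows[2:]:
        times.append(float(row[0]))         # first column assumed to be Time
        temps.append(float(row[col]))

    with open(out_inp, "w") as out:
        out.write(f"*AMPLITUDE, NAME={amplitude_name}\n")
        for t, temp in zip(times, temps):
            out.write(f"{t:.3f}, {temp:.3f}\n")

fds_devc_to_abaqus_amplitude("panel_test_devc.csv", "TC_MID_PANEL", "fire_amplitude.inp")
```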

Keywords: fire engineering, numerical coupling, sandwich panels, thermo fluids

Procedia PDF Downloads 65
1659 Real-Space Mapping of Surface Trap States in CIGSe Nanocrystals Using 4D Electron Microscopy

Authors: Riya Bose, Ashok Bera, Manas R. Parida, Anirudhha Adhikari, Basamat S. Shaheen, Erkki Alarousu, Jingya Sun, Tom Wu, Osman M. Bakr, Omar F. Mohammed

Abstract:

This work reports the visualization of charge carrier dynamics on the surface of copper indium gallium selenide (CIGSe) nanocrystals in real space and time using four-dimensional scanning ultrafast electron microscopy (4D S-UEM) and correlates it with the optoelectronic properties of the nanocrystals. The surface of the nanocrystals plays a key role in controlling their applicability for light emitting and light harvesting purposes. Typically, for quaternary systems like CIGSe, which have many desirable attributes for optoelectronic applications, the relative abundance of surface trap states acting as non-radiative recombination centres for charge carriers remains a major bottleneck preventing further advancement and commercial exploitation of devices based on these nanocrystals. Though ultrafast spectroscopic techniques make it possible to determine the presence of picosecond carrier trapping channels, because of the relatively large penetration depth of the laser beam, the information obtained comes mainly from the bulk of the nanocrystals. Selective mapping of such ultrafast dynamical processes on the surfaces of nanocrystals remains a key challenge, so far out of reach of purely optical, time-resolved laser techniques. In S-UEM, the optical pulse generated from a femtosecond (fs) laser system is used to generate electron packets from the tip of the scanning electron microscope, instead of the continuous electron beam used in the conventional setup. This pulse is synchronized with another optical excitation pulse that initiates carrier dynamics in the sample. The principle of S-UEM is to detect the secondary electrons (SEs) generated in the sample, which are emitted from the first few nanometers of the top surface. Constructed at different time delays between the optical and electron pulses, these SE images give direct and precise information about the carrier dynamics on the surface of the material of interest. In this work, we report selective mapping of the surface dynamics of CIGSe nanocrystals in real space and time by applying 4D S-UEM. We show that the trap states can be considerably passivated by ZnS shelling of the nanocrystals, and that the carrier dynamics can be significantly slowed down. We also compare and discuss the S-UEM kinetics with the carrier dynamics obtained from conventional ultrafast time-resolved techniques. Additionally, a direct effect of the trap state removal can be observed in the enhanced photoresponse of the nanocrystals after shelling. Direct observation of surface dynamics will not only provide a profound understanding of the photo-physical mechanisms on nanocrystal surfaces but also make it possible to unlock their full potential for light emitting and harvesting applications.
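Purely as an illustration of how the delay-dependent SE contrast might be reduced to a carrier lifetime (this is not the authors' analysis code), the sketch below fits a single-exponential decay with SciPy; the input CSV is hypothetical.

```python
# Illustrative fit of S-UEM contrast vs. pump-probe delay (hypothetical data file).
import numpy as np
from scipy.optimize import curve_fit

def decay(t, a, tau, c):
    return a * np.exp(-t / tau) + c

data = np.loadtxt("suem_contrast_vs_delay.csv", delimiter=",", skiprows=1)
delays, contrast = data[:, 0], data[:, 1]   # delay in ps, SE image contrast

popt, _ = curve_fit(decay, delays, contrast, p0=(contrast.max(), 100.0, contrast.min()))
print(f"recovered carrier lifetime: {popt[1]:.1f} ps")
```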

Keywords: 4D scanning ultrafast microscopy, charge carrier dynamics, nanocrystals, optoelectronics, surface passivation, trap states

Procedia PDF Downloads 271
1658 Handwriting Recognition of Gurmukhi Script: A Survey of Online and Offline Techniques

Authors: Ravneet Kaur

Abstract:

Character recognition is a very interesting area of pattern recognition. Over the past few decades, intensive research on character recognition for Roman, Chinese, Japanese, and Indian scripts has been reported. In this paper, a review of handwritten character recognition work on the Indian script Gurmukhi is presented. Most of the published papers are summarized, the various methodologies are analysed, and their results are reported.

Keywords: Gurmukhi character recognition, online, offline, HCR survey

Procedia PDF Downloads 403
1657 Contribution to the Study of Automatic Epileptiform Pattern Recognition in Long Term EEG Signals

Authors: Christine F. Boos, Fernando M. Azevedo

Abstract:

Electroencephalogram (EEG) is a record of the electrical activity of the brain that has many applications, such as monitoring alertness, coma, and brain death; locating damaged areas of the brain after head injury, stroke, and tumor; monitoring anesthesia depth; researching physiology and sleep disorders; and researching epilepsy and localizing the seizure focus. Epilepsy is a chronic condition, or a group of diseases of high prevalence, still poorly explained by science and whose diagnosis is still predominantly clinical. The EEG recording is considered an important test for epilepsy investigation, and its visual analysis is very often applied for clinical confirmation of the epilepsy diagnosis. Moreover, this EEG analysis can also be used to help define the type of epileptic syndrome, determine the epileptiform zone, assist in the planning of drug treatment, and provide additional information about the feasibility of surgical intervention. In the context of diagnosis confirmation, the analysis is made using long-term EEG recordings that are at least 24 hours long and acquired with a minimum of 24 electrodes, in which the neurophysiologists perform a thorough visual evaluation of EEG screens in search of specific electrographic patterns called epileptiform discharges. Considering that the EEG screens usually display 10 seconds of the recording, the neurophysiologist has to evaluate 360 screens per hour of EEG, or a minimum of 8,640 screens per long-term EEG recording. Analyzing thousands of EEG screens in search of patterns that have a maximum duration of 200 ms is a very time-consuming, complex, and exhaustive task. Because of this, over the years several studies have proposed automated methodologies that could facilitate the neurophysiologists' task of identifying epileptiform discharges, and a large number of these methodologies used neural networks for the pattern classification. One of the differences among these methodologies is the type of input stimulus presented to the networks, i.e., how the EEG signal is introduced to the network. Five types of input stimuli are commonly found in the literature: the raw EEG signal, morphological descriptors (i.e., parameters related to the signal's morphology), the Fast Fourier Transform (FFT) spectrum, Short-Time Fourier Transform (STFT) spectrograms, and Wavelet Transform features. This study evaluates the application of these five types of input stimuli and compares the classification results of neural networks that were implemented using each of these inputs. The performance using the raw signal varied between 43 and 84% efficiency. The results for the FFT spectrum and the STFT spectrograms were quite similar, with average efficiencies of 73 and 77%, respectively. The efficiency of the Wavelet Transform features varied between 57 and 81%, while the descriptors presented efficiency values between 62 and 93%. After the simulations, we observed that the best results were achieved when either the morphological descriptors or the Wavelet features were used as input stimuli.
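The sketch below illustrates the five input-stimulus types on a single (here synthetic) EEG epoch using NumPy, SciPy, and PyWavelets; the sampling rate, window length, and wavelet choice are assumptions rather than the settings used in the studies reviewed.

```python
# Five candidate input representations computed for one EEG epoch.
import numpy as np
from scipy import signal
import pywt

fs = 256                                   # assumed sampling rate (Hz)
epoch = np.random.randn(fs)                # stand-in for a raw EEG segment

raw = epoch                                                # 1) raw signal
descriptors = [epoch.max() - epoch.min(),                  # 2) morphological descriptors
               np.argmax(epoch) / fs,                      #    (amplitude, peak latency,
               np.mean(np.abs(np.diff(epoch)))]            #     mean slope, ...)
fft_spectrum = np.abs(np.fft.rfft(epoch))                  # 3) FFT spectrum
_, _, stft_spec = signal.stft(epoch, fs=fs, nperseg=64)    # 4) STFT spectrogram
stft_features = np.abs(stft_spec).ravel()
wavelet_features = np.hstack(pywt.wavedec(epoch, "db4", level=4))  # 5) wavelet coefficients

print(len(raw), len(descriptors), len(fft_spectrum), len(stft_features), len(wavelet_features))
```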

Keywords: artificial neural network, electroencephalogram signal, pattern recognition, signal processing

Procedia PDF Downloads 503
1656 Identification and Classification of Stakeholders in the Transition to 3D Cadastre

Authors: Qiaowen Lin

Abstract:

The 3D cadastre is an inevitable choice to meet the needs of real-world cadastral management. Nowadays, more attention is given to the technical aspects of the 3D cadastre, resulting in an imbalance within this field. To fill this research gap, the stakeholders, who have been regarded as the determining factor in cadastral change, are studied. The Delphi method, Michael rating, and stakeholder mapping are used to identify and classify the stakeholders in the 3D cadastre. It is concluded that project managers should pay more attention to the interest appeals of the key stakeholders, and that different coping strategies should be adopted to facilitate the transition to the 3D cadastre.

Keywords: stakeholders, three dimensions, cadastre, transition

Procedia PDF Downloads 266
1655 Efficiency Enhancement in Solar Panel

Authors: R. S. Arun Raj

Abstract:

In today's climate of growing energy needs and increasing environmental concerns, alternatives to the use of non-renewable and polluting fossil fuels have to be investigated. One such alternative is solar energy. The Sun provides as much energy every hour as mankind consumes in one year. This paper explains solar panel design and the new models and methodologies that can be implemented for better utilization of solar energy. The innovative idea this work revolves around is the minimisation of the losses that occur in a solar panel as heat. Payback calculations for the implementation of solar panels are also presented.
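As a toy version of the payback calculations mentioned above, with invented cost and yield figures (the paper's actual numbers are not given in the abstract):

```python
# Simple payback estimate with purely illustrative figures.
system_cost = 4_500.0          # installed cost, in local currency
annual_yield_kwh = 6_200.0     # estimated annual generation
tariff_per_kwh = 0.12          # value of each generated kWh
annual_maintenance = 50.0

annual_saving = annual_yield_kwh * tariff_per_kwh - annual_maintenance
simple_payback_years = system_cost / annual_saving
print(f"simple payback: {simple_payback_years:.1f} years")
```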

Keywords: on-grid and off-grid systems, pyro-electric effect, pay-back calculations, solar panel

Procedia PDF Downloads 559
1654 Surveying Apps in Dam Excavation

Authors: Ali Mohammadi

Abstract:

Whenever there is a need to excavate the ground, the presence of a surveyor is required to check the work against the design map. In projects such as dams and tunnels, these controls are even more important because any mistake can increase the cost. Time is also of great importance in these projects, and one of the ways to reduce excavation time is to use techniques that reduce the surveying and mapping time. Nowadays, with mobile phones, we can design apps that perform calculations and drawings for us on the phone. Also, if we have a device that requires a computer to access its information, by designing an app we can transfer its information to the mobile phone and use it there, so we will not need to go back to the office.

Keywords: app, tunnel, excavation, dam

Procedia PDF Downloads 33
1653 Learning in the Virtual Laboratory via Design of Automation Process for Wooden Hammers Marking

Authors: A. Javorova, J. Oravcova, K. Velisek

Abstract:

The article summarizes experience with teaching methodologies for technical subjects, using a number of software products to solve the specific assigned task described in this paper. The task concerns problems of automation and mechanization in industry. Specifically, it focuses on introducing automation in the wood industry. The article describes the design of the automation process for marking wooden hammers. Similar problems are solved by students in the CA laboratory.

Keywords: CA system, education, simulation, subject

Procedia PDF Downloads 272
1652 Dual Biometrics Fusion Based Recognition System

Authors: Prakash, Vikash Kumar, Vinay Bansal, L. N. Das

Abstract:

Dual biometrics is a subpart of multimodal biometrics, which refers to the use of a variety of modalities, rather than just one, to identify and authenticate persons. By combining several modalities, we limit the risk of errors, and attackers have only a small chance of collecting the required information. Our goal is to collect the precise characteristics of the iris and the palmprint, produce a fusion of both methodologies, and ensure that authentication is only successful when the biometrics match a particular user. After combining the different modalities, we created an effective strategy with a mean DI and EER of 2.41 and 5.21, respectively. A biometric system based on this fusion has been proposed.
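The abstract does not state the fusion rule, so the sketch below shows one common choice for score-level fusion: a weighted sum of min-max normalised iris and palmprint scores, with the EER read off the ROC curve. The score distributions are toy data.

```python
# Hedged sketch of score-level fusion and EER computation (toy scores).
import numpy as np
from sklearn.metrics import roc_curve

def minmax(s):
    return (s - s.min()) / (s.max() - s.min())

def fuse(iris_scores, palm_scores, w=0.5):
    return w * minmax(iris_scores) + (1 - w) * minmax(palm_scores)

def equal_error_rate(labels, scores):
    fpr, tpr, _ = roc_curve(labels, scores)
    fnr = 1 - tpr
    idx = np.nanargmin(np.abs(fnr - fpr))
    return (fpr[idx] + fnr[idx]) / 2

# toy genuine/impostor match scores, purely to make the sketch runnable
rng = np.random.default_rng(0)
labels = np.r_[np.ones(200), np.zeros(200)]            # 1 = genuine, 0 = impostor
iris = np.r_[rng.normal(0.8, 0.1, 200), rng.normal(0.5, 0.1, 200)]
palm = np.r_[rng.normal(0.7, 0.1, 200), rng.normal(0.45, 0.1, 200)]

print("fused EER:", round(equal_error_rate(labels, fuse(iris, palm)), 3))
```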

Keywords: multimodal, fusion, palmprint, iris, EER, DI

Procedia PDF Downloads 123
1651 Brain Connectome of Glia, Axons, and Neurons: Cognitive Model of Analogy

Authors: Ozgu Hafizoglu

Abstract:

An analogy is an essential tool of human cognition that enables connecting diffuse and diverse systems through physical, behavioral, and principal relations that are essential to learning, discovery, and innovation. The Cognitive Model of Analogy (CMA) leads and creates patterns of pathways to transfer information within and between domains in science, just as happens in the brain. The connectome of the brain shows how the brain operates with mental leaps between domains and mental hops within domains, and how the analogical reasoning mechanism operates. This paper demonstrates the CMA as an evolutionary approach to science, technology, and life. The model puts forward the challenges of deep uncertainty about the future, emphasizing the need for flexibility of the system in order to enable the reasoning methodology to adapt to changing conditions in the new era, especially post-pandemic. In this paper, we reveal how to draw an analogy to scientific research to discover new systems that reveal the fractal schema of analogical reasoning within and between systems, just as within and between brain regions. The distinct phases of the problem-solving process are divided thusly: stimulus, encoding, mapping, inference, and response. Based on the brain research so far, the system is revealed to be relevant to brain activation in each of these phases, with an emphasis on achieving a better visualization of the brain's mechanism in the macro context (brain and spinal cord) and the micro context (glia and neurons), relative to the matching conditions of analogical reasoning and relational information, the encoding, mapping, inference, and response processes, and the verification of perceptual responses in four-term analogical reasoning. Finally, we relate all these terminologies to mental leaps, mental maps, mental hops, and mental loops to make the mental model of the CMA clear.

Keywords: analogy, analogical reasoning, brain connectome, cognitive model, neurons and glia, mental leaps, mental hops, mental loops

Procedia PDF Downloads 146
1650 Prospectivity Mapping of Orogenic Lode Gold Deposits Using Fuzzy Models: A Case Study of Saqqez Area, Northwestern Iran

Authors: Fanous Mohammadi, Majid H. Tangestani, Mohammad H. Tayebi

Abstract:

This research aims to evaluate and compare Geographical Information Systems (GIS)-based fuzzy models for producing orogenic gold prospectivity maps in the Saqqez area, NW of Iran. Gold occurrences are hosted in sericite schist and mafic to felsic meta-volcanic rocks in this area and are associated with hydrothermal alterations that extend over ductile to brittle shear zones. The predictor maps, which represent the Pre-(Source/Trigger/Pathway), syn-(deposition/physical/chemical traps) and post-mineralization (preservation/distribution of indicator minerals) subsystems for gold mineralization, were generated using empirical understandings of the specifications of known orogenic gold deposits and gold mineral systems and were then pre-processed and integrated to produce mineral prospectivity maps. Five fuzzy logic operators, including AND, OR, Fuzzy Algebraic Product (FAP), Fuzzy Algebraic Sum (FAS), and GAMMA, were applied to the predictor maps in order to find the most efficient prediction model. Prediction-Area (P-A) plots and field observations were used to assess and evaluate the accuracy of prediction models. Mineral prospectivity maps generated by AND, OR, FAP, and FAS operators were inaccurate and, therefore, unable to pinpoint the exact location of discovered gold occurrences. The GAMMA operator, on the other hand, produced acceptable results and identified potentially economic target sites. The P-A plot revealed that 68 percent of known orogenic gold deposits are found in high and very high potential regions. The GAMMA operator was shown to be useful in predicting and defining cost-effective target sites for orogenic gold deposits, as well as optimizing mineral deposit exploitation.
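For readers unfamiliar with the operators being compared, the snippet below applies all five to a stack of fuzzified predictor layers; the layer values and the gamma of 0.9 are invented for illustration, not taken from the Saqqez study.

```python
# The five fuzzy overlay operators on fuzzified predictor rasters
# (numpy arrays with membership values in [0, 1]; values invented).
import numpy as np

layers = np.stack([
    np.array([[0.8, 0.2], [0.6, 0.4]]),   # e.g. hydrothermal alteration evidence
    np.array([[0.7, 0.3], [0.5, 0.9]]),   # e.g. proximity to shear zones
    np.array([[0.9, 0.1], [0.4, 0.7]]),   # e.g. favourable host lithology
])

fuzzy_and = layers.min(axis=0)
fuzzy_or = layers.max(axis=0)
fap = layers.prod(axis=0)                         # fuzzy algebraic product
fas = 1 - (1 - layers).prod(axis=0)               # fuzzy algebraic sum
gamma = 0.9
fuzzy_gamma = fas**gamma * fap**(1 - gamma)       # GAMMA operator

print(fuzzy_gamma)
```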

Keywords: mineral prospectivity mapping, fuzzy logic, GIS, orogenic gold deposit, Saqqez, Iran

Procedia PDF Downloads 99
1649 Mapping of Forest Cover Change in the Democratic Republic of the Congo

Authors: Armand Okende, Benjamin Beaumont

Abstract:

Introduction: Deforestation is a change in the structure and composition of flora and fauna that leads to a loss of biodiversity and of the production of goods and services, and to an increase in fires. It particularly concerns vast territories in tropical zones; this is the case of the territory of Bolobo in the current province of Maï-Ndombe in the Democratic Republic of Congo. The overall objective of this study is to map and quantitatively analyze the important forest changes that occurred between 2001 and 2018 in this area, where significant deforestation is taking place. Methodology: Mapping and quantification are the methodological approaches that we put forward to assess deforestation and forest changes through satellite images and raster layers. These satellite data from Global Forest Watch are integrated into GIS software (GRASS GIS and Quantum GIS) to represent the loss of forest cover that has occurred and the various changes recorded (e.g., forest gain) in the territory of Bolobo. Results: The results obtained quantify deforestation for the periods 2001-2006, 2007-2012 and 2013-2018 as the loss of forest area in hectares each year. The different change maps produced for the study periods mentioned above show that the loss of forest area is gradually increasing. Conclusion: This study shows that knowledge of forest management and protection is a key challenge for ensuring good management of forest resources. To this end, it is wise to carry out more studies that would optimize the monitoring of forests to guarantee the ecological and economic functions they provide in the Congo Basin, particularly in the Democratic Republic of Congo. In addition, the cartographic approach, coupled with geographic information systems and the remote sensing products proposed by Global Forest Watch as raster layers, provides valuable information for explaining the loss of forest areas.
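A hedged sketch of the quantification step: tallying loss per period from a Global Forest Watch loss-year raster clipped to Bolobo, assuming pixel values 1-18 encode loss in 2001-2018 and that the raster is in a metric projection (the file name is invented).

```python
# Tally forest loss per period from a loss-year raster (assumed encoding).
import numpy as np
import rasterio

with rasterio.open("bolobo_lossyear.tif") as src:
    lossyear = src.read(1)
    pixel_ha = abs(src.transform.a * src.transform.e) / 10_000.0   # m^2 -> ha

periods = {"2001-2006": range(1, 7), "2007-2012": range(7, 13), "2013-2018": range(13, 19)}
for label, years in periods.items():
    pixels = np.count_nonzero(np.isin(lossyear, list(years)))
    print(f"{label}: {pixels * pixel_ha:,.0f} ha of forest loss")
```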

Keywords: deforestation, loss year, forest change, remote sensing, drivers of deforestation

Procedia PDF Downloads 113
1648 Estimation of Fragility Curves Using Proposed Ground Motion Selection and Scaling Procedure

Authors: Esra Zengin, Sinan Akkar

Abstract:

Reliable and accurate prediction of nonlinear structural response requires the specification of appropriate earthquake ground motions to be used in nonlinear time history analysis. Current research has mainly focused on the selection and manipulation of real earthquake records, which can be seen as the most critical step in the performance-based seismic design and assessment of structures. Utilizing amplitude-scaled ground motions that match the target spectra is a commonly used technique for the estimation of nonlinear structural response. Representative ground motion ensembles are selected to match a target spectrum such as a scenario-based spectrum derived from ground motion prediction equations, a Uniform Hazard Spectrum (UHS), a Conditional Mean Spectrum (CMS), or a Conditional Spectrum (CS). Different sets of criteria exist among the developed methodologies for selecting and scaling ground motions with the objective of obtaining a robust estimate of structural performance. This study presents a ground motion selection and scaling procedure that considers the spectral variability at the target demand together with the level of ground motion dispersion. The proposed methodology provides a set of ground motions whose response spectra match the target median and the corresponding variance within a specified period interval. An efficient and simple algorithm is used to assemble the ground motion sets. The scaling stage is based on the minimization of the error between the scaled median and the target spectrum, while the dispersion of the earthquake shaking is preserved over the period interval. The impact of spectral variability on the nonlinear response distribution is investigated at the level of inelastic single degree of freedom systems. In order to see the effect of different selection and scaling methodologies on fragility curve estimates, the results are compared with those obtained by the CMS-based scaling methodology. The variability in fragility curves due to the consideration of dispersion in the ground motion selection process is also examined.
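A minimal sketch of the scaling idea, not the authors' full selection algorithm: a single scale factor is chosen in log space to minimise the misfit between the scaled median spectrum of a record set and a target spectrum over a period interval, while the record-to-record dispersion is left untouched. All spectra here are stand-ins.

```python
# Least-squares optimal scale factor for a record set, in log space.
import numpy as np

periods = np.linspace(0.2, 2.0, 19)                 # period interval of interest (s)
target = 0.8 * np.exp(-periods / 1.5)               # stand-in target spectrum (g)
records = np.abs(np.random.default_rng(1).normal(0.5, 0.15, (11, periods.size)))

# geometric mean across records (the lognormal median estimate)
median_spectrum = np.exp(np.mean(np.log(records), axis=0))

scale = np.exp(np.mean(np.log(target) - np.log(median_spectrum)))
scaled_median = scale * median_spectrum

dispersion = np.std(np.log(records), axis=0)        # record-to-record variability (unchanged by scaling)
print(f"scale factor: {scale:.2f}, max |ln misfit|: "
      f"{np.max(np.abs(np.log(scaled_median) - np.log(target))):.3f}")
```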

Keywords: ground motion selection, scaling, uncertainty, fragility curve

Procedia PDF Downloads 564
1647 Biotechnological Interventions for Crop Improvement in Nutricereal Pearl Millet

Authors: Supriya Ambawat, Subaran Singh, C. Tara Satyavathi, B. S. Rajpurohit, Ummed Singh, Balraj Singh

Abstract:

Pearl millet [Pennisetum glaucum (L.) R. Br.] is an important staple food of the arid and semiarid tropical regions of Asia, Africa, and Latin America. It is rightly termed a nutricereal as it has high nutritional value and is a good source of carbohydrates, protein, fat, ash, dietary fiber, potassium, magnesium, iron, zinc, etc. Pearl millet has a low prolamin fraction and is gluten free, which is useful for people with a gluten allergy. It has several health benefits, such as reductions in blood pressure, thyroid disorders, diabetes, and cardiovascular and celiac diseases, but its direct consumption as food has significantly declined for several reasons. Keeping this in view, it is important to reorient efforts to generate demand through value addition and quality improvement and to create awareness of the nutritional merits of pearl millet. In India, through the Indian Council of Agricultural Research-All India Coordinated Research Project on Pearl Millet, multilocational coordinated trials of developed hybrids were conducted at various centers. The gene banks of pearl millet contain varieties with high levels of iron and zinc, which were used to produce new pearl millet varieties with elevated iron levels by breeding them with high-yielding varieties. Thus, using breeding approaches and biochemical analysis, a total of 167 hybrids and 61 varieties were identified and released for cultivation in different agro-ecological zones of the country, including some biofortified hybrids rich in Fe and Zn. Further, through several biotechnological interventions such as molecular markers, next-generation sequencing (NGS), association mapping, nested association mapping (NAM), MAGIC populations, genome editing, genotyping by sequencing (GBS), and genome-wide association studies (GWAS), advances in millet improvement have become possible by identifying and tagging the genes underlying a trait in the genome. Using DArT markers, very high-density linkage maps were constructed for pearl millet. Improved HHB67 has been released using marker-assisted selection (MAS) strategies, and genomic tools were used to identify Fe-Zn quantitative trait loci (QTL). The draft genome sequence of pearl millet has also opened various avenues for exploring the crop. Further, the genomic positions of simple sequence repeat (SSR) markers significantly associated with iron and zinc content are being identified in the consensus map, and research is in progress towards mapping QTLs for flour rancidity. The sequence information is being used to explore genes and enzymatic pathways responsible for the rancidity of flour. Thus, the development and application of several biotechnological approaches, along with biofortification, can accelerate the genetic gains targeted for pearl millet improvement and help improve its quality.

Keywords: biotechnological approaches, genomic tools, malnutrition, MAS, nutricereal, pearl millet, sequencing

Procedia PDF Downloads 147
1646 A Reading Attempt of the Urban Memory of Jordan University of Science and Technology Campus by Cognitive Mapping

Authors: Bsma Adel Bany Mohammad

Abstract:

University campuses are small cities containing basic city functions such as educational spaces, accommodation, services, and transportation. They are spaces of functional and social life, with different activities and different occupants. Campuses are designed and transformed like cities, so they are experienced and memorized in the same way. Campus memory is the ability of individuals to maintain and recall the spatial components of designed physical spaces, which form their understandings, experiences, and sensations of the environment as a whole. ‘Cognitive mapping’ is used to decode the physical interaction and emotional relationship between individuals and the city; cognitive maps are created graphically, using geometric and verbal elements on paper, by remembering images of the urban environment. In this study, to determine the emotional urban identity of the Jordan University of Science and Technology campus, architecture students were asked to identify the areas they interact with on the campus by drawing a cognitive map. ‘Campus memory items’ were identified by analyzing the cognitive maps of the campus, and the spatial identity then results from these data. The analysis is based on the five basic elements of Lynch: paths, districts, edges, nodes, and landmarks. As a result of this analysis, it was found that spatial identity is constructed from the shared elements of the maps. Most students' maps listed the gate structures, which are large, desirable structures located at the main entrances of the campus, as major landmarks, then the square spaces as nodes, in addition to both stairs and corridors as paths. Finally, the districts and edges of educational buildings and service spaces are listed correspondingly in the cognitive maps. The findings suggest that the spatial identity of the campus design is related mainly to the gate structures, squares, and stairs.

Keywords: cognitive maps, university campus, urban memory, identity

Procedia PDF Downloads 127
1645 Play-Based Early Education and Teachers’ Professional Development: Impact on Vulnerable Children

Authors: Chirine Dannaoui, Maya Antoun

Abstract:

This paper explores the intricate dynamics of play-based early childhood education (ECE) and the impact of professional development on teachers implementing play-based pedagogy, particularly in the context of vulnerable Syrian refugee children in Lebanon. By utilizing qualitative methodologies, including classroom observations and in-depth interviews with five early childhood educators and a field manager, this study delves into the challenges and transformations experienced by teachers in adopting play-based learning strategies. The research unveils the critical role of continuous and context-specific professional development in empowering teachers to implement play-based pedagogies effectively. It emphasizes how such educational approaches, when appropriately supported, significantly enhance children's cognitive, social, and emotional development in crisis-affected environments. Key findings indicate that despite diverse educational backgrounds, teachers show considerable growth in their pedagogical skills through targeted professional development. This growth is vital for fostering a learning environment where vulnerable children can thrive, particularly in humanitarian settings. The paper also addresses the challenges educators face, including adapting to play-based methodologies, resource limitations, and balancing curricular requirements with the need for holistic child development. This study contributes to the discourse on early childhood education in crisis contexts, emphasizing the need for sustainable, well-structured professional development programs. It underscores the potential of play-based learning to bridge educational gaps and contribute to the healing process of children facing calamity. The study highlights significant implications for policymakers, educators, schools, and not-for-profit organizations engaged in early childhood education in humanitarian contexts, stressing the importance of investing in teacher capacity and curriculum reform to enhance the quality of education for children in general and vulnerable ones in particular.

Keywords: play-based learning, professional development, vulnerable children, early childhood education

Procedia PDF Downloads 34
1644 Mining Scientific Literature to Discover Potential Research Data Sources: An Exploratory Study in the Field of Haemato-Oncology

Authors: A. Anastasiou, K. S. Tingay

Abstract:

Background: Discovering suitable datasets is an important part of health research, particularly for projects working with clinical data from patients organized in cohorts (cohort data), but with the proliferation of so many national and international initiatives, it is becoming increasingly difficult for research teams to locate the real world datasets that are most relevant to their project objectives. We present a method for identifying healthcare institutes in the European Union (EU) which may hold haemato-oncology (HO) data. A key enabler of this research was the bibInsight platform, a scientometric data management and analysis system developed by the authors at Swansea University. Method: A PubMed search was conducted using HO clinical terms taken from previous work. The resulting XML file was processed using the bibInsight platform, linking affiliations to the Global Research Identifier Database (GRID). GRID is an international, standardized list of institutions, including the city and country in which the institution exists, as well as a category of the main business type, e.g., Academic, Healthcare, Government, Company. Countries were limited to the 28 current EU members, and institute type to 'Healthcare'. An article was considered valid if at least one author was affiliated with an EU-based healthcare institute. Results: The PubMed search produced 21,310 articles, which yielded 9,885 distinct affiliations with a correspondence in GRID. Of these affiliations, 760 were from EU countries, and 390 of these were healthcare institutes. One affiliation was excluded as being a veterinary hospital. Two EU countries did not have any publications in our analysis dataset. The results were analysed by country and by individual healthcare institute. Networks both within the EU and internationally show institutional collaborations, which may suggest a willingness to share data for research purposes. Geographical mapping can ensure that data have broad population coverage. Collaborations with industry or government may exclude healthcare institutes that may have embargos or additional costs associated with data access. Conclusions: Data reuse is becoming increasingly important, both for ensuring the validity of results and for the economical use of available resources. The ability to identify potential, specific data sources from over twenty thousand articles in less than an hour could assist in improving knowledge of, and access to, data sources. As our method has not yet established whether these healthcare institutes are holding data, or merely publishing on the topic, future work will involve text mining of data-specific concordant terms to identify numbers of participants, demographics, study methodologies, and sub-topics of interest.
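A hedged sketch of the first step of such a pipeline (not bibInsight itself): querying the PubMed E-utilities for one haemato-oncology term and collecting affiliation strings for later matching against GRID. The search term is a placeholder, and no API key or rate limiting is shown.

```python
# Query PubMed E-utilities and extract affiliation strings from the XML records.
import requests
import xml.etree.ElementTree as ET

BASE = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils"
term = "multiple myeloma AND cohort"          # placeholder HO clinical term

ids_xml = requests.get(f"{BASE}/esearch.fcgi",
                       params={"db": "pubmed", "term": term, "retmax": 200}).text
pmids = [e.text for e in ET.fromstring(ids_xml).iter("Id")]

records_xml = requests.get(f"{BASE}/efetch.fcgi",
                           params={"db": "pubmed", "id": ",".join(pmids), "retmode": "xml"}).text
affiliations = {aff.text.strip()
                for aff in ET.fromstring(records_xml).iter("Affiliation") if aff.text}

print(len(pmids), "articles,", len(affiliations), "distinct affiliation strings")
```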

Keywords: data reuse, data discovery, data linkage, journal articles, text mining

Procedia PDF Downloads 96
1643 Application of Sentinel-2 Data to Evaluate the Role of Mangrove Conservation and Restoration on Aboveground Biomass

Authors: Raheleh Farzanmanesh, Christopher J. Weston

Abstract:

Mangroves are forest ecosystems located in the inter-tidal regions of tropical and subtropical coastlines that provide many valuable economic and ecological benefits for millions of people, such as preventing coastal erosion, providing breeding and feeding grounds, improving water quality, and supporting the well-being of local communities. In addition, mangroves capture and store large amounts of carbon in biomass and soils, playing an important role in combating climate change. The decline in mangrove area has prompted government and private sector interest in mangrove conservation and restoration projects to achieve multiple Sustainable Development Goals, from reducing poverty to improving life on land. Mangrove aboveground biomass plays an essential role in the global carbon cycle and in climate change mitigation and adaptation by reducing CO2 emissions. However, little information is available about the effectiveness of sustainable mangrove management with respect to mangrove area change and aboveground biomass (AGB). Here, we propose a method for mapping, modeling, and assessing mangrove area and AGB in two Global Environment Facility (GEF) blue forests projects, based on Sentinel-2 Level 1C imagery acquired during their conservation lifetimes. A support vector regression (SVR) model was used to estimate AGB in the Tahiry Honko project in Madagascar and the Abu Dhabi Blue Carbon Demonstration Project (Abu Dhabi Emirate). The results showed that mangrove forest area and AGB declined in the Tahiry Honko project, while they increased in the Abu Dhabi project after the conservation initiative was established. The results provide important information on the impact of mangrove conservation activities and contribute to the development of remote sensing applications for mapping and assessing mangrove forests in blue carbon initiatives.
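A minimal sketch of the SVR step, assuming a table of field plots with Sentinel-2 band values and measured biomass; the CSV layout, feature set, and hyperparameters are assumptions rather than the paper's settings.

```python
# SVR regression of aboveground biomass on Sentinel-2 features (hypothetical CSV).
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR
from sklearn.metrics import r2_score

plots = pd.read_csv("mangrove_plots.csv")
X = plots[["B2", "B3", "B4", "B8", "NDVI"]]      # Sentinel-2 features per plot
y = plots["agb_t_ha"]                            # field-measured AGB (t/ha)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=42)
model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=100, epsilon=1.0))
model.fit(X_train, y_train)

print("R2 on held-out plots:", round(r2_score(y_test, model.predict(X_test)), 2))
```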

Keywords: blue carbon, mangrove forest, REDD+, aboveground biomass, Sentinel-2

Procedia PDF Downloads 48
1642 Clinical Advice Services: Using Lean Chassis to Optimize Nurse-Driven Telephonic Triage of After-Hour Calls from Patients

Authors: Eric Lee G. Escobedo-Wu, Nidhi Rohatgi, Fouzel Dhebar

Abstract:

It is challenging for patients to navigate healthcare systems after hours. This leads to delays in care, patient/provider dissatisfaction, inappropriate resource utilization, readmissions, and higher costs. It is important to provide patients and providers with effective clinical decision-making tools that allow seamless connectivity and coordinated care. In August 2015, patient-centric Stanford Health Care established Clinical Advice Services (CAS) to provide clinical decision support after hours. CAS is founded on key Lean principles: value stream mapping, empathy mapping, waste walks, takt time calculations, standard work, plan-do-check-act cycles, and active daily management. At CAS, Clinical Assistants take the initial call and manage all non-clinical calls (e.g., appointments, directions, general information). If the patient has a clinical symptom, the CAS nurses take the call and utilize standardized clinical algorithms to triage the patient to home, clinic, urgent care, the emergency department, or 911. Nurses may also contact the on-call physician, based on the clinical algorithm, for further direction and consultation. Since August 2015, CAS has managed 228,990 calls from 26 clinical specialties. Reporting is built into the electronic health record for analysis and data collection. 65.3% of the after-hours calls are clinically related. The average clinical algorithm adherence rate has been 92%. An average of 9% of calls were escalated by CAS nurses to the physician on call. An average of 5% of patients were triaged to the Emergency Department by CAS. Key learnings indicate that a seamless connectivity vision, cascading, multidisciplinary ownership of the problem, and synergistic enterprise improvements have contributed to this success while striving for continuous improvement.

Keywords: after hours phone calls, clinical advice services, nurse triage, Stanford Health Care

Procedia PDF Downloads 151