Search results for: personalized mapping
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 1470

1230 The Routes of Human Suffering: How Point-Source and Destination-Source Mapping Can Help Victim Services Providers and Law Enforcement Agencies Effectively Combat Human Trafficking

Authors: Benjamin Thomas Greer, Grace Cotulla, Mandy Johnson

Abstract:

Human trafficking is one of the fastest growing international crimes and human rights violations in the world. The United States Department of State (State Department) approximates that some 800,000 to 900,000 people are trafficked across sovereign borders annually, with approximately 14,000 to 17,500 of these people coming into the United States. Today’s slavery is conducted by unscrupulous individuals who are often connected to organized criminal enterprises and transnational gangs, extracting huge monetary sums. According to the International Labour Organization (ILO), human traffickers collect approximately $32 billion worldwide annually. Surpassed only by narcotics dealing, trafficking of humans is tied with illegal arms sales as the second largest criminal industry in the world and is the fastest growing field in the 21st century. Perpetrators of this heinous crime abound. They are not limited to single or “sole practitioners” of human trafficking, but rather often include Transnational Criminal Organizations (TCOs), domestic street gangs, labor contractors, and otherwise seemingly ordinary citizens. Monetary gain is being elevated over territorial disputes, and street gangs are increasingly operating in a collaborative effort with TCOs to further disguise their criminal activity and to utilize their vast networks in an attempt to avoid detection. Traffickers rely on a network of clandestine routes to sell their commodities with impunity. As law enforcement agencies seek to retard the expansion of transnational criminal organizations’ entry into human trafficking, it is imperative that they develop reliable mapping of known exploitative routes. In a recent report given to the Mexican Congress, the Procuraduría General de la República (PGR) disclosed that from 2008 to 2010 it had identified at least 47 unique criminal networking routes used to traffic victims, and that Mexico’s domestic victims are estimated at some 800,000 adults and 20,000 children annually. Designing a reliable mapping system is a crucial step towards an effective law enforcement response and deploying a successful victim support system. Creating this mapping analytic is exceedingly difficult. Traffickers are constantly changing the way they traffic and exploit their victims. They swiftly adapt to local environmental factors and react remarkably well to market demands, exploiting limitations in the prevailing laws. This article will highlight how human trafficking has become one of the fastest growing and most high-profile human rights violations in the world today; compile current efforts to map and illustrate trafficking routes; and demonstrate how the proprietary analytical approach of point-source and destination-source mapping can help local law enforcement, governmental agencies, and victim services providers respond effectively to the type and nature of trafficking in their specific geographical locale. Trafficking transcends state and international borders. It demands effective and consistent cooperation between local, state, and federal authorities. Each region of the world has different impact factors which create distinct challenges for law enforcement and victim services. Our mapping system lays the groundwork for a targeted anti-trafficking response.

Keywords: human trafficking, mapping, routes, law enforcement intelligence

Procedia PDF Downloads 382
1229 Services-Oriented Model for the Regulation of Learning

Authors: Mohamed Bendahmane, Brahim Elfalaki, Mohammed Benattou

Abstract:

One of the major sources of learners' professional difficulties is their heterogeneity. Whether on the cognitive, social, cultural or emotional level, learners who are part of the same group differ in many ways. These differences do not allow the same learning process to be applied to all learners: an optimal learning path for one is not necessarily optimal for another. We present in this paper a service-oriented model that offers each learner a personalized learning path to acquire the targeted skills.
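As a rough illustration of the path-personalization idea (not the authors' service model), the sketch below orders only the skills still missing from a learner's trace before a target skill; the prerequisite graph, skill names, and the `personalized_path` helper are invented for illustration.

```python
# Minimal sketch (assumed data, not the paper's model): derive a personalized
# learning path by skipping skills the learner's trace already shows as mastered.
from graphlib import TopologicalSorter

# Prerequisite graph: skill -> set of skills required first (hypothetical)
prerequisites = {
    "loops": {"variables"},
    "functions": {"loops"},
    "recursion": {"functions"},
}

def personalized_path(target, mastered):
    """Order the skills needed for `target`, omitting those already mastered."""
    order = list(TopologicalSorter(prerequisites).static_order())
    needed = [s for s in order if s not in mastered]
    return needed[: needed.index(target) + 1] if target in needed else []

print(personalized_path("recursion", mastered={"variables", "loops"}))
# -> ['functions', 'recursion']
```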

Keywords: learning path, web service, trace analysis, personalization

Procedia PDF Downloads 357
1228 AI-Enhanced Self-Regulated Learning: Proposing a Comprehensive Model with 'Studium' to Meet a Student-Centric Perspective

Authors: Smita Singh

Abstract:

Objective: The Faculty of Chemistry Education at Humboldt University has developed ‘Studium’, a web application designed to enhance long-term self-regulated learning (SRL) and academic achievement. Leveraging advanced generative AI, ‘Studium’ offers a dynamic and adaptive educational experience tailored to individual learning preferences and languages. The application includes evolving tools for personalized notetaking from preferred sources, customizable presentation capabilities, and AI-assisted guidance from academic documents or textbooks. It also features workflow automation and seamless integration with collaborative platforms like Miro, powered by AI. This study aims to propose a model that combines generative AI with traditional features and customization options, empowering students to create personalized learning environments that effectively address the challenges of SRL. Method: The study included graduate and undergraduate students from diverse subject streams, with 15 participants each from Germany and India, ensuring a diverse educational background. An exploratory design was employed using a speed-dating method with enactment, in which different scenario sessions were created to allow participants to experience various features of ‘Studium’. Each session lasted 50 minutes, providing an in-depth exploration of the platform's capabilities. Participants interacted with Studium's features via Zoom conferencing and then took part in semi-structured interviews lasting 10-15 minutes to gain deeper insights into the effectiveness of ‘Studium’. Additionally, online questionnaire surveys were conducted before and after the session to gather feedback and evaluate satisfaction with self-regulated learning (SRL) after using ‘Studium’. The response rate of this survey was 100%. Results: The findings of this study indicate that students widely acknowledged the positive impact of ‘Studium’ on their learning experience, particularly its adaptability and intuitive design. They expressed a desire for more tools like ‘Studium’ to support self-regulated learning in the future. The application significantly fostered students' independence in organizing information and planning study workflows, which in turn enhanced their confidence in mastering complex concepts. Additionally, ‘Studium’ promoted strategic decision-making and helped students overcome various learning challenges, reinforcing their self-regulation, organization, and motivation skills. Conclusion: This proposed model emphasizes the need for effective integration of personalized AI tools into active learning and SRL environments. By addressing key research questions, our framework aims to demonstrate how AI-assisted platforms like ‘Studium’ can facilitate deeper understanding, maintain student motivation, and support the achievement of academic goals. Thus, our ideal model for AI-assisted educational platforms provides a strategic approach to enhancing students' learning experiences and promoting their development as self-regulated learners.

Keywords: self-regulated learning (SRL), generative AI, AI-assisted educational platforms

Procedia PDF Downloads 31
1227 Health and Wellbeing: Measuring and Mapping Diversity in India

Authors: Swati Rajput

Abstract:

Wellbeing is a multifaceted concept. Its definition has evolved to become more holistic over the years. The paper attempts to build up an understanding of the concept of wellbeing and marks the trajectory of its conceptual evolution. The paper will also elaborate on and analyse various indicators of socio-economic wellbeing in India at the state level. A ranking method has been applied to assess the situation of each state with respect to the selected variables and to wellbeing as a whole. Maps have been used to depict and illustrate the same. The data show that the socio-economic wellbeing level is higher in the states of Himachal Pradesh, Jammu and Kashmir, Punjab, Uttarakhand, Uttar Pradesh, Tamil Nadu, Bihar, and Lakshadweep. The level of wellbeing is much lower in Rajasthan, Madhya Pradesh, Telangana, Andhra Pradesh, Odisha, Assam, Arunachal Pradesh, and Tripura. Environment plays an important role in maintaining health, and environment and health are important indicators of wellbeing. The paper will further analyse some indicators of environment and health and examine how they change the resulting wellbeing levels of different states.
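As a toy illustration of the ranking step described above (not the study's indicators or data), the sketch below ranks states on each indicator and averages the ranks into a composite; the state names, indicator columns, and values are placeholders.

```python
# Illustrative sketch of a rank-based composite: each state is ranked on each
# indicator, and the ranks are averaged into a composite wellbeing rank.
# All numbers below are placeholder values, not the paper's data.
import pandas as pd

data = pd.DataFrame(
    {"literacy_rate": [82.8, 67.1, 80.1], "life_expectancy": [72.0, 68.0, 71.6]},
    index=["Himachal Pradesh", "Rajasthan", "Punjab"],
)

ranks = data.rank(ascending=False)           # rank 1 = best value on the indicator
data["composite_rank"] = ranks.mean(axis=1)  # lower composite = higher wellbeing
print(data.sort_values("composite_rank"))
```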

Keywords: socio economic factors, wellbeing index, health, mapping

Procedia PDF Downloads 159
1226 Digital Reconstruction of Museum's Statue Using 3D Scanner for Cultural Preservation in Indonesia

Authors: Ahmad Zaini, F. Muhammad Reza Hadafi, Surya Sumpeno, Muhtadin, Mochamad Hariadi

Abstract:

The lack of information about a museum's collection reduces the number of museum visits. Museum revitalization is therefore an urgent activity to increase the number of visits. The research roadmap is to build a web-based application that visualizes the museum in virtual form, including reconstruction of the museum's statues in 3D. This paper describes the implementation of a three-dimensional model reconstruction method based on a light-strip pattern on the museum statue using a 3D scanner. Noise removal, alignment, meshing, and model refinement processes are implemented to obtain a better 3D object reconstruction. The model's texture derives from surface texture mapping between the object's images and the reconstructed 3D model. The accuracy of the model's dimensions is assessed by calculating the relative error of the virtual model's dimensions against the original object. The result is a realistic, textured three-dimensional model with a relative error of around 4.3% to 5.8%.
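A minimal sketch of the accuracy test described above: relative error of virtual-model dimensions against measurements of the original statue. The dimension names and values are illustrative placeholders, not the paper's measurements.

```python
# Relative-error check: |model - original| / original, expressed as a percentage.
# Values below are invented for illustration only.

measured = {"height_cm": 152.0, "base_width_cm": 48.5}   # original object
modelled = {"height_cm": 145.4, "base_width_cm": 46.2}   # reconstructed 3D model

for name, true_value in measured.items():
    rel_err = abs(modelled[name] - true_value) / true_value * 100.0
    print(f"{name}: relative error = {rel_err:.1f}%")
```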

Keywords: 3D reconstruction, light pattern structure, texture mapping, museum

Procedia PDF Downloads 468
1225 Comparison of Classical Computer Vision vs. Convolutional Neural Networks Approaches for Weed Mapping in Aerial Images

Authors: Paulo Cesar Pereira Junior, Alexandre Monteiro, Rafael da Luz Ribeiro, Antonio Carlos Sobieranski, Aldo von Wangenheim

Abstract:

In this paper, we present a comparison between convolutional neural networks and classical computer vision approaches for the specific precision agriculture problem of weed mapping in aerial images of sugarcane fields. A systematic literature review was conducted to find which computer vision methods are being used for this specific problem. The most cited methods were implemented, as well as four models of convolutional neural networks. All implemented approaches were tested using the same dataset, and their results were quantitatively and qualitatively analyzed. The obtained results were compared against a ground truth produced by a human expert for validation. The results indicate that the convolutional neural networks achieve better precision and generalize better than the classical models.
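A sketch of how predicted weed masks can be scored against an expert ground truth in the quantitative comparison described above; the binary masks below are tiny invented arrays, and `precision_recall` is a generic pixel-wise metric, not the paper's evaluation code.

```python
# Pixel-wise precision/recall for a binary weed-segmentation mask (toy data).
import numpy as np

def precision_recall(pred: np.ndarray, truth: np.ndarray):
    """Compute precision and recall from boolean prediction/ground-truth masks."""
    tp = np.logical_and(pred, truth).sum()
    fp = np.logical_and(pred, ~truth).sum()
    fn = np.logical_and(~pred, truth).sum()
    return tp / (tp + fp + 1e-9), tp / (tp + fn + 1e-9)

pred = np.array([[1, 1, 0], [0, 1, 0]], dtype=bool)   # e.g. CNN output
truth = np.array([[1, 0, 0], [0, 1, 1]], dtype=bool)  # expert annotation
p, r = precision_recall(pred, truth)
print(f"precision={p:.2f}, recall={r:.2f}")
```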

Keywords: convolutional neural networks, deep learning, digital image processing, precision agriculture, semantic segmentation, unmanned aerial vehicles

Procedia PDF Downloads 261
1224 Urban Noise and Air Quality: Correlation between Air and Noise Pollution; Sensors, Data Collection, Analysis and Mapping in Urban Planning

Authors: Massimiliano Condotta, Paolo Ruggeri, Chiara Scanagatta, Giovanni Borga

Abstract:

Architects and urban planners, when designing and renewing cities, have to face a complex set of problems, including the issues of noise and air pollution, which are considered hot topics (i.e., the Clean Air Act of London and the Soundscape definition). It is usually taken for granted that these problems go together, because the noise pollution present in cities is often linked to traffic and industries, and these produce air pollutants as well. Traffic congestion can create both noise pollution and air pollution, because NO₂ is mostly created from the oxidation of NO, and these two are notoriously produced by processes of combustion at high temperatures (i.e., car engines or thermal power stations). We can see the same process for industrial plants as well. What has to be investigated – and this is the topic of this paper – is whether or not there really is a correlation between noise pollution and air pollution (taking into account NO₂) in urban areas. To evaluate whether there is a correlation, some low-cost methodologies will be used. For noise measurements, the OpeNoise app will be installed on an Android phone. The smartphone will be positioned inside a waterproof box, so that it can stay outdoors, with an external battery that allows it to collect data continuously. The box will have a small hole for an external microphone, connected to the smartphone, which will be calibrated to collect the most accurate data. For air pollution measurements, the AirMonitor device will be used: an Arduino board to which the sensors and all the other components are plugged. After assembling the sensors, they will be coupled (one noise and one air sensor) and placed in different critical locations in the area of Mestre (Venice) to map the existing situation. The sensors will collect data for a fixed period of time to have an input for both week and weekend days; in this way, it will be possible to see how the situation changes during the week. The novelty is that the data will be compared to check if there is a correlation between the two pollutants, using graphs that show the percentage of pollution instead of the raw values obtained with the sensors. To do so, the data will be converted to fit on a scale that goes up to 100% and will be shown through a mapping of the measurements using GIS methods. Another relevant aspect is that this comparison can help to choose the right mitigation solutions to be applied in the area of analysis, because it will make it possible to address both the noise and the air pollution problem with a single intervention. The mitigation solutions must consider not only the health aspect but also how to create a more livable space for citizens. The paper will describe in detail the methodology and the technical solutions adopted for the realization of the sensors, the data collection, and the noise and pollution mapping and analysis.
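A sketch of the comparison step described above: noise and NO₂ readings rescaled to a 0-100% range and correlated. The readings are invented placeholders standing in for the OpeNoise and AirMonitor logs, and `to_percent` is an assumed min-max rescaling, not the paper's exact normalization.

```python
# Rescale both pollutant series to 0-100 % and compute their correlation (toy data).
import numpy as np

noise_db = np.array([55.0, 62.0, 70.0, 66.0, 58.0])   # dB(A), hourly means
no2_ugm3 = np.array([18.0, 35.0, 52.0, 44.0, 25.0])   # NO2 in ug/m3

def to_percent(x):
    return (x - x.min()) / (x.max() - x.min()) * 100.0

noise_pct, no2_pct = to_percent(noise_db), to_percent(no2_ugm3)
r = np.corrcoef(noise_pct, no2_pct)[0, 1]   # Pearson correlation coefficient
print(f"noise %: {noise_pct.round(1)}, NO2 %: {no2_pct.round(1)}, r = {r:.2f}")
```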

Keywords: air quality, data analysis, data collection, NO₂, noise mapping, noise pollution, particulate matter

Procedia PDF Downloads 212
1223 Soil Degradati̇on Mapping Using Geographic Information System, Remote Sensing and Laboratory Analysis in the Oum Er Rbia High Basin, Middle Atlas, Morocco

Authors: Aafaf El Jazouli, Ahmed Barakat, Rida Khellouk

Abstract:

Mapping of soil degradation is derived from field observations, laboratory measurements, and remote sensing data, integrated with quantitative methods to map the spatial characteristics of soil properties at different spatial and temporal scales and to provide up-to-date information on the field. Since soil salinity, texture, and organic matter play a vital role in assessing topsoil characteristics and soil quality, remote sensing can be considered an effective method for studying these properties. The main objective of this research is to assess soil degradation by combining remote sensing data and laboratory analysis. To achieve this goal, soil samples were taken at 50 locations in the upper basin of Oum Er Rbia in the Middle Atlas in Morocco. These samples were dried, sieved to 2 mm, and analyzed in the laboratory. Landsat 8 OLI imagery was analyzed using physical or empirical methods to derive soil properties; in addition, remote sensing can serve as a supporting data source. Deterministic (spline and inverse distance weighting) and probabilistic (ordinary kriging and universal kriging) interpolation methods were used to produce maps of each grain size class and soil property using GIS software. As a result, a correlation was found between soil texture and soil organic matter content. The approach developed in this ongoing research will improve the prospects for the use of remote sensing data for mapping soil degradation in arid and semi-arid environments.
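A minimal sketch of inverse distance weighting (IDW), one of the deterministic interpolators named above; the sample coordinates, organic-matter values, and the power parameter are illustrative assumptions, not the study's data.

```python
# Inverse-distance-weighted estimate at an unsampled location (toy data).
import numpy as np

samples_xy = np.array([[0.0, 0.0], [100.0, 0.0], [0.0, 100.0], [100.0, 100.0]])
organic_matter = np.array([1.8, 2.4, 3.1, 2.0])   # e.g. % OM at each sample

def idw(target_xy, xy, values, power=2.0):
    d = np.linalg.norm(xy - target_xy, axis=1)
    if np.any(d == 0):                      # target coincides with a sample
        return float(values[np.argmin(d)])
    w = 1.0 / d**power
    return float(np.sum(w * values) / np.sum(w))

print(idw(np.array([40.0, 60.0]), samples_xy, organic_matter))
```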

Keywords: soil degradation, GIS, interpolation methods (spline, IDW, kriging), Landsat 8 OLI, Oum Er Rbia high basin

Procedia PDF Downloads 165
1222 Building Transparent Supply Chains through Digital Tracing

Authors: Penina Orenstein

Abstract:

In today’s world, particularly with COVID-19 a constant worldwide threat, organizations need greater visibility over their supply chains more than ever before, in order to find areas for improvement and greater efficiency, reduce the chances of disruption and stay competitive. The concept of supply chain mapping is one where every process and route is mapped in detail between each vendor and supplier. The simplest method of mapping involves sourcing publicly available data including news and financial information concerning relationships between suppliers. An additional layer of information would be disclosed by large, direct suppliers about their production and logistics sites. While this method has the advantage of not requiring any input from suppliers, it also doesn’t allow for much transparency beyond the first supplier tier and may generate irrelevant data—noise—that must be filtered out to find the actionable data. The primary goal of this research is to build data maps of supply chains by focusing on a layered approach. Using these maps, the secondary goal is to address the question as to whether the supply chain is re-engineered to make improvements, for example, to lower the carbon footprint. Using a drill-down approach, the end result is a comprehensive map detailing the linkages between tier-one, tier-two, and tier-three suppliers super-imposed on a geographical map. The driving force behind this idea is to be able to trace individual parts to the exact site where they’re manufactured. In this way, companies can ensure sustainability practices from the production of raw materials through the finished goods. The approach allows companies to identify and anticipate vulnerabilities in their supply chain. It unlocks predictive analytics capabilities and enables them to act proactively. The research is particularly compelling because it unites network science theory with empirical data and presents the results in a visual, intuitive manner.
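As a toy illustration of the tiered drill-down described above (not the study's data pipeline), the sketch below walks upstream from a focal company through an invented supplier edge list; all company names and the `supplier_tiers` helper are hypothetical.

```python
# Breadth-first walk upstream from a focal company to tier-1/2/3 suppliers.
from collections import deque

supplies_to = {                      # supplier -> customers it ships to (invented)
    "RawMetalCo": ["ForgeWorks"],
    "ForgeWorks": ["GearboxInc"],
    "ChipMaker": ["GearboxInc"],
    "GearboxInc": ["FocalOEM"],
}

def supplier_tiers(focal, edges, max_tier=3):
    """Return {supplier: tier} by breadth-first search upstream of `focal`."""
    upstream = {}
    for supplier, customers in edges.items():
        for customer in customers:
            upstream.setdefault(customer, []).append(supplier)
    tiers, queue = {}, deque([(focal, 0)])
    while queue:
        node, tier = queue.popleft()
        for supplier in upstream.get(node, []):
            if supplier not in tiers and tier < max_tier:
                tiers[supplier] = tier + 1
                queue.append((supplier, tier + 1))
    return tiers

print(supplier_tiers("FocalOEM", supplies_to))
# -> {'GearboxInc': 1, 'ForgeWorks': 2, 'ChipMaker': 2, 'RawMetalCo': 3}
```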

Keywords: data mining, supply chain, empirical research, data mapping

Procedia PDF Downloads 176
1221 High Altitude Glacier Surface Mapping in Dhauliganga Basin of Himalayan Environment Using Remote Sensing Technique

Authors: Aayushi Pandey, Manoj Kumar Pandey, Ashutosh Tiwari, Kireet Kumar

Abstract:

Glaciers play an important role in climate change and are sensitive phenomena in the global climate change scenario. Glaciers in the Himalayas are unique as they are predominantly valley-type and are located in tropical, high-altitude regions. These glaciers are often covered with debris, which greatly affects the ablation rate of glaciers and works as a sensitive indicator of glacier health. The aim of this study is to map the high-altitude glacier surface, with a focus on glacial lake and debris estimation, using different techniques on the Nagling glacier of the Dhauliganga basin in the Himalayan region. Different image classification techniques, i.e., thresholding on different band ratios and supervised classification using the maximum likelihood classifier (MLC), were used on high-resolution Sentinel-2A Level-1C satellite imagery of 14 October 2017. Here, the Near Infrared (NIR)/Shortwave Infrared (SWIR) ratio image was used to extract the glaciated classes (snow, ice, ice-mixed debris) from the other non-glaciated terrain classes. The SWIR/Blue ratio image was used to map valley rock and debris, while the Green/NIR ratio image was found most suitable for mapping the glacial lake. Accuracy assessment was performed using high-resolution (3 m) PlanetScope imagery and 60 stratified random points. The overall accuracy of MLC was 85%, while the accuracy of the band ratios was 96.66%. According to the band ratio technique, the total areal extent of the glaciated classes (snow, ice, IMD) in the Nagling glacier was 10.70 km², nearly 38.07% of the study area, comprising 30.87% snow-covered area, 3.93% ice, and 3.27% IMD-covered area. Non-glaciated classes (vegetation, glacial lake, debris, and valley rock) covered 61.93% of the total area, of which valley rock is dominant with 33.83% coverage, followed by debris covering 27.7% of the area in the Nagling glacier. The glacial lake and debris were accurately mapped using the band ratio technique. Hence, the band ratio approach appears to be useful for mapping debris-covered glaciers in the Himalayan region.
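A sketch of the band-ratio thresholding used above (a NIR/SWIR ratio separating glaciated from non-glaciated pixels). The reflectance arrays, the threshold of 2.0, and the 10 m pixel size are illustrative assumptions, not the study's scene or calibrated values.

```python
# NIR/SWIR band ratio with a simple threshold to flag glaciated pixels (toy data).
import numpy as np

nir = np.array([[0.62, 0.08], [0.55, 0.12]])    # NIR reflectance (toy values)
swir = np.array([[0.10, 0.20], [0.09, 0.25]])   # SWIR reflectance (toy values)

ratio = nir / swir
glaciated_mask = ratio > 2.0                    # threshold chosen per scene
area_per_pixel_km2 = (10 * 10) / 1e6            # assumed 10 m pixel size
print(glaciated_mask)
print("glaciated area:", glaciated_mask.sum() * area_per_pixel_km2, "km2")
```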

Keywords: band ratio, Dhauliganga basin, glacier mapping, Himalayan region, maximum likelihood classifier (MLC), Sentinel-2 satellite image

Procedia PDF Downloads 230
1220 Wearable Jacket for Game-Based Post-Stroke Arm Rehabilitation

Authors: A. Raj Kumar, A. Okunseinde, P. Raghavan, V. Kapila

Abstract:

Stroke is the leading cause of adult disability worldwide. With recent advances in immediate post-stroke care, there is an increasing number of young stroke survivors under the age of 65 years. While most stroke survivors will regain the ability to walk, they often experience long-term arm and hand motor impairments. Long-term upper limb rehabilitation is needed to restore movement and function and to prevent deterioration from complications such as learned non-use and learned bad-use. We have developed a novel virtual coach, a wearable instrumented rehabilitation jacket, to motivate individuals to participate in long-term skill re-learning that can be personalized to their impairment profile. The jacket can estimate the movements of an individual's arms using embedded off-the-shelf sensors (e.g., a 9-DOF IMU for inertial measurements and flex sensors for measuring the angular orientation of the fingers) and a Bluetooth Low Energy (BLE) powered microcontroller (e.g., RFduino) to non-intrusively extract data. The 9-DOF IMU sensors contain a 3-axis accelerometer, 3-axis gyroscope, and 3-axis magnetometer to compute quaternions, which are transmitted to a computer to compute the Euler angles and estimate the angular orientation of the arms. The data are used in a gaming environment to provide visual and/or haptic feedback for goal-based, augmented-reality training to facilitate re-learning in a cost-effective, evidence-based manner. The full paper will elaborate on the technical aspects of communication, the interactive gaming environment, and the physical aspects of the electronics necessary to achieve our stated goal. Moreover, the paper will suggest methods to utilize the proposed system as a cheaper, portable, and versatile alternative vis-à-vis existing instrumentation to facilitate personalized post-stroke arm rehabilitation.
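A sketch of the quaternion-to-Euler step described above, using the standard ZYX (roll/pitch/yaw) conversion; the sample quaternion is illustrative, and the convention is an assumption rather than the jacket's documented one.

```python
# Convert a unit quaternion (w, x, y, z) to roll/pitch/yaw in degrees (ZYX convention).
import math

def quaternion_to_euler(w, x, y, z):
    roll = math.atan2(2 * (w * x + y * z), 1 - 2 * (x * x + y * y))
    pitch = math.asin(max(-1.0, min(1.0, 2 * (w * y - z * x))))  # clamp for safety
    yaw = math.atan2(2 * (w * z + x * y), 1 - 2 * (y * y + z * z))
    return tuple(math.degrees(a) for a in (roll, pitch, yaw))

# 90-degree rotation about the x-axis as a quick sanity check
print(quaternion_to_euler(math.cos(math.pi / 4), math.sin(math.pi / 4), 0, 0))
# -> approximately (90.0, 0.0, 0.0)
```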

Keywords: feedback, gaming, Euler angles, rehabilitation, augmented reality

Procedia PDF Downloads 278
1219 Flood Mapping and Inundation on Weira River Watershed (in the Case of Hadiya Zone, Shashogo Woreda)

Authors: Alilu Getahun Sulito

Abstract:

Exceptional floods are now prevalent in many places in Ethiopia, resulting in a large number of human deaths and property destruction. The Lake Boyo watershed, in particular, has traditionally been vulnerable to flash floods. The goal of this research is to create flood and inundation maps for the Boyo catchment. The integration of geographic information system (GIS) technology and the hydraulic model HEC-RAS was utilized to attain this objective. The peak discharge was determined using the Fuller empirical method for return periods of 5, 10, 15, and 25 years, and the results were 103.2 m³/s, 158 m³/s, 222 m³/s, and 252 m³/s, respectively. River geometry, boundary conditions, Manning's n values for the varying land cover, and peak discharge at various return periods were all entered into HEC-RAS, and an unsteady flow analysis was then performed. The results of the unsteady flow analysis demonstrate that the water surface elevation in the longitudinal profile rises as the return period increases. The flood inundation maps clearly show that the areas on the right and left sides of the river with the greatest flood coverage were 15.418 km² and 5.29 km², respectively, for the 10-, 20-, 30-, and 50-year floods. High water depths typically occur along the main channel and progressively spread to the floodplains. The study also found that flood-prone areas were disproportionately affected on the river's right bank. As a result, combining GIS with hydraulic modelling to create a flood inundation map is a viable solution. The findings of this study can be used to protect the right bank of the Boyo River catchment near the Boyo Lake kebeles. Furthermore, it is critical to promote an early warning system in the kebeles so that people can be evacuated before a flood calamity happens.
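A hedged sketch of one common statement of Fuller's empirical formula, which relates the T-year peak discharge to the mean annual flood as Q_T = Q_av (1 + 0.8 log10 T). The mean-annual-flood value below is an invented placeholder; the study's actual coefficients and catchment data are not reproduced here, so the printed values will not match the abstract's discharges.

```python
# Fuller-type return-period scaling of peak discharge (illustrative inputs only).
import math

def fuller_peak(q_mean_annual_m3s, return_period_years):
    """One common form of Fuller's formula: Q_T = Q_av * (1 + 0.8 * log10(T))."""
    return q_mean_annual_m3s * (1 + 0.8 * math.log10(return_period_years))

q_av = 70.0   # assumed mean annual flood for illustration (m3/s)
for T in (5, 10, 15, 25):
    print(f"T = {T:>2} yr: Q = {fuller_peak(q_av, T):6.1f} m3/s")
```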

Keywords: Weira River, Boyo, GIS, HEC- GEORAS, HEC- RAS, Inundation Mapping

Procedia PDF Downloads 49
1218 A Paradigm Shift towards Personalized and Scalable Product Development and Lifecycle Management Systems in the Aerospace Industry

Authors: David E. Culler, Noah D. Anderson

Abstract:

Integrated systems for product design, manufacturing, and lifecycle management are difficult to implement and customize. Commercial software vendors, including CAD/CAM and third party PDM/PLM developers, create user interfaces and functionality that allow their products to be applied across many industries. The result is that systems become overloaded with functionality, difficult to navigate, and use terminology that is unfamiliar to engineers and production personnel. For example, manufacturers of automotive, aeronautical, electronics, and household products use similar but distinct methods and processes. Furthermore, each company tends to have their own preferred tools and programs for controlling work and information flow and that connect design, planning, and manufacturing processes to business applications. This paper presents a methodology and a case study that addresses these issues and suggests that in the future more companies will develop personalized applications that fit to the natural way that their business operates. A functioning system has been implemented at a highly competitive U.S. aerospace tooling and component supplier that works with many prominent airline manufacturers around the world including The Boeing Company, Airbus, Embraer, and Bombardier Aerospace. During the last three years, the program has produced significant benefits such as the automatic creation and management of component and assembly designs (parametric models and drawings), the extensive use of lightweight 3D data, and changes to the way projects are executed from beginning to end. CATIA (CAD/CAE/CAM) and a variety of programs developed in C#, VB.Net, HTML, and SQL make up the current system. The web-based platform is facilitating collaborative work across multiple sites around the world and improving communications with customers and suppliers. This work demonstrates that the creative use of Application Programming Interface (API) utilities, libraries, and methods is a key to automating many time-consuming tasks and linking applications together.
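A purely hypothetical sketch of the kind of API-driven automation discussed above: assembling a parametric component definition and handing it to a PLM layer. The field names, the `make_tool_component` and `publish_to_plm` helpers, and the endpoint URL are invented for illustration and do not represent the company's system or the CATIA/PDM APIs it actually uses.

```python
# Generate a parametric component payload and pass it to a (stubbed) PLM publisher.
import json

def make_tool_component(part_number, length_mm, hole_count):
    """Assemble a parameter set that a CAD template could consume (hypothetical schema)."""
    return {
        "part_number": part_number,
        "parameters": {"length_mm": length_mm, "hole_count": hole_count},
        "lightweight_3d": f"{part_number}.3dxml",   # assumed lightweight-data filename
    }

def publish_to_plm(component, endpoint="https://plm.example.internal/api/items"):
    """Stand-in for a REST/PLM call; here it only serializes and prints the payload."""
    payload = json.dumps(component, indent=2)
    print(f"POST {endpoint}\n{payload}")

publish_to_plm(make_tool_component("FIXTURE-1042", length_mm=320.0, hole_count=6))
```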

Keywords: PDM, PLM, collaboration, CAD/CAM, scalable systems

Procedia PDF Downloads 176
1217 Action Potential of Lateral Geniculate Neurons at Low Threshold Currents: Simulation Study

Authors: Faris Tarlochan, Siva Mahesh Tangutooru

Abstract:

The lateral geniculate nucleus (LGN) is the relay center in the visual pathway: it receives most of its input information from retinal ganglion cells (RGCs) and sends it to the visual cortex. Low-threshold calcium currents (IT) at the membrane are the unique indicator used to characterize the firing functionality that LGN neurons gain from RGC input. According to LGN functional requirements, such as the functional mapping of RGCs to the LGN, the morphologies of the LGN neurons were developed. During neurological disorders like glaucoma, the mapping between RGCs and the LGN is disconnected, and hence stimulating the LGN electrically using deep brain electrodes can restore the functionalities of the LGN. A computational model was developed for simulating LGN neurons with three predominant morphologies, each representing a different functional mapping of RGCs to the LGN. The firing of action potentials in the LGN neuron due to IT was characterized by varying the stimulation parameters, morphological parameters, and orientation. A wide range of stimulation parameters (stimulus amplitude, duration, and frequency) represents the various strengths of the electrical stimulation, combined with different morphological parameters (soma size, dendrite size, and structure). The orientation (0–180°) of the LGN neuron with respect to the stimulating electrode represents the angle at which the extracellular deep brain stimulation towards the LGN neuron is performed. A reduced dendrite structure, obtained using the Bush–Sejnowski algorithm, was used in the model to decrease the computational time while conserving the input resistance and total surface area. The major finding is that an input potential of 0.4 V is required to produce an action potential in an LGN neuron placed at a distance of 100 µm from the electrode. From this study, it can be concluded that the neuroprostheses under design would need to be capable of inducing at least 0.4 V to produce action potentials in the LGN.
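A hedged sketch of the point-source approximation often used for extracellular stimulation, V = I / (4 π σ r), evaluated at the 100 µm electrode distance mentioned above. The current values and tissue conductivity are illustrative assumptions, not the parameters of the simulation study, so the printed potentials are not meant to reproduce the 0.4 V result.

```python
# Extracellular potential of a point-source electrode in a homogeneous medium.
import math

def extracellular_potential(current_a, distance_m, sigma_s_per_m=0.3):
    """V = I / (4*pi*sigma*r); sigma is an assumed bulk tissue conductivity."""
    return current_a / (4 * math.pi * sigma_s_per_m * distance_m)

r = 100e-6                               # 100 micrometres, as in the abstract
for i_ua in (1.0, 5.0, 15.0):            # assumed stimulus currents in microamps
    v = extracellular_potential(i_ua * 1e-6, r)
    print(f"I = {i_ua:4.1f} uA -> V = {v:.3f} V at {r*1e6:.0f} um")
```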

Keywords: Lateral Geniculate Nucleus, visual cortex, finite element, glaucoma, neuroprostheses

Procedia PDF Downloads 279
1216 GNSS-Aided Photogrammetry for Digital Mapping

Authors: Muhammad Usman Akram

Abstract:

This research work is based on GNSS-aided photogrammetry for digital mapping. It focuses on the topographic survey of an area or site which is to be used in future planning and development (P&D) or for further examination, exploration, research, and inspection. Surveying and mapping in hard-to-access and hazardous areas are very difficult using traditional techniques and methodologies; they are also time consuming, labor intensive, and less precise, with limited data. In comparison, the advanced techniques save manpower and provide more precise output with a wide variety of data sets. In this experimentation, an aerial photogrammetry technique is used in which a UAV flies over an area, captures geocoded images, and a three-dimensional model (3D model) is built. The UAV operates on a user-specified path or area with various parameters: flight altitude, ground sampling distance (GSD), image overlap, camera angle, etc. For ground control, a network of points on the ground is observed as ground control points (GCPs) using a Differential Global Positioning System (DGPS) in PPK or RTK mode. Furthermore, the raw data collected by the UAV and DGPS are processed in various digital image processing programs and computer-aided design software. As an output, we obtain a dense point cloud, a digital elevation model (DEM), and an orthophoto. The imagery is converted into geospatial data by digitizing over the orthophoto, and the DEM is further converted into a digital terrain model (DTM) for contour generation or a digital surface. As a result, we obtain a digital map of the area to be surveyed. In conclusion, we compared the processed data with exact measurements taken on site. The error is accepted if it does not breach the survey accuracy limits set by the concerned institutions.
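A sketch of the final accuracy check described above: coordinates from the photogrammetric model are compared against DGPS check measurements and tested against an accuracy limit. All coordinates and the tolerance value are illustrative placeholders.

```python
# 3D RMSE between model coordinates and surveyed check points (toy data).
import math

checkpoints = [  # ((E, N, H) from model, (E, N, H) from DGPS survey)
    ((500012.42, 3621007.88, 512.31), (500012.39, 3621007.91, 512.27)),
    ((500105.10, 3621098.55, 514.02), (500105.16, 3621098.49, 514.10)),
]

def rmse_3d(pairs):
    sq = [sum((m - s) ** 2 for m, s in zip(model, survey)) for model, survey in pairs]
    return math.sqrt(sum(sq) / len(sq))

tolerance_m = 0.10   # assumed accuracy limit set by the concerned institution
error = rmse_3d(checkpoints)
print(f"3D RMSE = {error:.3f} m -> {'accepted' if error <= tolerance_m else 'rejected'}")
```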

Keywords: photogrammetry, post processing kinematics, real time kinematics, manual data inquiry

Procedia PDF Downloads 33
1215 Mapping Contested Sites - Permanence Of The Temporary Mouttalos Case Study

Authors: M. Hadjisoteriou, A. Kyriacou Petrou

Abstract:

This paper will discuss ideas of social sustainability in urban design and human behavior in multicultural contested sites. It will focus on the potential of re-reading the “site” through mapping that acts as a research methodology, and will discuss the chosen site of Mouttalos, Cyprus as a place of multiple identities. Through a methodology of mapping using a bottom-up approach, a process of disassembling derives that acts as a mechanism to re-examine space and place by searching for the invisible and the non-measurable, understanding the site through its detailed inhabitation patterns. The significance of this study lies in the use of mapping as an active form of thinking rather than a passive process of representation, allowing a new site to be discovered and giving multiple opportunities for adaptive urban strategies and socially engaged design approaches. We will discuss the above themes based on the chosen contested site of Mouttalos, a small Turkish Cypriot neighbourhood in the old centre of Paphos (Ktima), SW Cyprus. During the political unrest between the Greek and Turkish Cypriot communities in 1963, the area became an enclave for the Turkish Cypriots, excluding any contact with the rest of the area. Following the Turkish invasion of 1974, the residents left their homes, plots and workplaces, resettling in the north of Cyprus. Greek Cypriot refugees moved into the area. The presence of the Greek Cypriot refugees is still considered a temporary resettlement, and the buildings and the residents themselves exist in a state of uncertainty. The site is documented through a series of parallel investigations into its physical conditions and history. The research methodology uses the process of mapping to expose the complex and often invisible layers of information that coexist. By registering the site through the subjective experiences and everyday stories of inhabitants, a series of cartographic recordings reveals the space between happening and narrative, and especially the space between different cultures and religions. The research put specific emphasis on engaging the public, promoting social interaction, and identifying spatial patterns of occupation by previous inhabitants through social media. The findings exposed three main areas of interest. Firstly, we identified inter-dependent relationships between permanence and temporality, characterised by elements such as signage through layers of time, past events and periodical street festivals, unfolding memory and belonging. Secondly, issues of co-ownership and occupation, found through particular narratives of exchange between the two communities and through the appropriation of space. Finally, formal and informal inhabitation of space, revealed through the presence of informal shared back yards, alternative paths, porous street edges, and formal and informal landmarks. The importance of the above findings was in achieving a shift of focus from the built infrastructure to the soft network of multiple and complex relations of dependence and autonomy. Proposed interventions for this contested site were informed and led by a new multicultural identity where invisible qualities were revealed through the process of mapping, taking on issues of layers of time, formal and informal inhabitation, and the “permanence of the temporary”.

Keywords: contested sites, mapping, social sustainability, temporary urban strategies

Procedia PDF Downloads 423
1214 Sentiment Mapping through Social Media and Its Implications

Authors: G. C. Joshi, M. Paul, B. K. Kalita, V. Ranga, J. S. Rawat, P. S. Rawat

Abstract:

As part of the global village, every place has established connections through the strength and power of social media, which pierces through political boundaries. Social media is a digital platform where people across the world can interact; its advantages are that it is universal, anonymous, and easily accessible, and that it enables indirect interaction and the gathering and sharing of information. The power of social media lies in the intensity of sharing extreme opinions or feelings, in contrast to personal interactions, and this intensity can be easily mapped in the form of sentiment mapping. The easy access to social networking sites such as Facebook, Twitter and blogs has created unprecedented opportunities for citizens to voice their opinions, loaded with the dynamics of emotions. These further influence human thoughts, in which social media plays a very active role. A recent incident of public importance was selected as a case study to map the sentiments of people through Twitter. Understanding those dynamics through the eyes of ordinary people can be challenging. With the help of the R programming language and GIS techniques, sentiment maps have been produced. The emotions flowing worldwide in the form of tweets were extracted and analyzed. The number of tweets had diminished by 91% from 25/08/2017 to 31/08/2017. A boom of sentiments emerged near the origin of the case, i.e., Delhi, Haryana and Punjab, and the capital showed the maximum influence, resulting in a spillover effect near Delhi. After calculating the sentiment scores of the tweets, the prevailing trend of sentiments was neutral (45.37%), followed by negative (28.6%) and positive (21.6%). The results can also be used to understand the spatial distribution of digital penetration in India, where the highest concentration lies in Mumbai and the lowest in North East India and Jammu and Kashmir.
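A sketch (in Python, whereas the study used R) of a simple lexicon-based sentiment-scoring step: each tweet receives a score and a neutral/negative/positive label, and the label shares are tallied. The word lists, tweets, and `label_tweet` helper are toy examples, not the study's scoring method or data.

```python
# Lexicon-based sentiment labelling and share calculation (toy data).
positive_words = {"good", "support", "hope", "peace"}
negative_words = {"bad", "violence", "fear", "angry"}

def label_tweet(text):
    tokens = text.lower().split()
    score = sum(t in positive_words for t in tokens) - sum(t in negative_words for t in tokens)
    return "positive" if score > 0 else "negative" if score < 0 else "neutral"

tweets = ["Hope for peace in the region",
          "Angry crowds and violence reported",
          "Roads closed today"]
labels = [label_tweet(t) for t in tweets]
shares = {lab: 100 * labels.count(lab) / len(labels) for lab in ("neutral", "negative", "positive")}
print(labels, shares)
```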

Keywords: sentiment mapping, digital literacy, GIS, R statistical language, spatio-temporal

Procedia PDF Downloads 152
1213 Mapping the Turbulence Intensity and Excess Energy Available to Small Wind Systems over 4 Major UK Cities

Authors: Francis C. Emejeamara, Alison S. Tomlin, James Gooding

Abstract:

Due to the highly turbulent nature of urban air flows, and by virtue of the fact that turbines are likely to be located within the roughness sublayer of the urban boundary layer, proposed urban wind installations are faced with major challenges compared to rural installations. The challenge of operating within turbulent winds can however, be counteracted by the development of suitable gust tracking solutions. In order to assess the cost effectiveness of such controls, a detailed understanding of the urban wind resource, including its turbulent characteristics, is required. Estimating the ambient turbulence and total kinetic energy available at different control response times is essential in evaluating the potential performance of wind systems within the urban environment should effective control solutions be employed. However, high resolution wind measurements within the urban roughness sub-layer are uncommon, and detailed CFD modelling approaches are too computationally expensive to apply routinely on a city wide scale. This paper therefore presents an alternative semi-empirical methodology for estimating the excess energy content (EEC) present in the complex and gusty urban wind. An analytical methodology for predicting the total wind energy available at a potential turbine site is proposed by assessing the relationship between turbulence intensities and EEC, for different control response times. The semi-empirical model is then incorporated with an analytical methodology that was initially developed to predict mean wind speeds at various heights within the built environment based on detailed mapping of its aerodynamic characteristics. Based on the current methodology, additional estimates of turbulence intensities and EEC allow a more complete assessment of the available wind resource. The methodology is applied to 4 UK cities with results showing the potential of mapping turbulence intensities and the total wind energy available at different heights within each city. Considering the effect of ambient turbulence and choice of wind system, the wind resource over neighbourhood regions (of 250 m uniform resolution) and building rooftops within the 4 cities were assessed with results highlighting the promise of mapping potential turbine sites within each city.
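A sketch of two quantities the methodology above builds on: turbulence intensity (TI = σ_u / U_mean) and a simple proxy for the extra energy available in a gusty record, the gap between the mean of u³ and the cube of the mean wind speed. The wind samples are illustrative, and this proxy is an assumption standing in for, not reproducing, the authors' semi-empirical EEC model.

```python
# Turbulence intensity and a simple gust-energy proxy from a short wind record.
import numpy as np

u = np.array([4.2, 5.1, 3.8, 6.0, 4.9, 5.5, 3.9, 4.7])   # m/s, illustrative samples

ti = u.std(ddof=1) / u.mean()                          # turbulence intensity
excess_fraction = (np.mean(u**3) - u.mean()**3) / u.mean()**3
print(f"TI = {ti:.2f}, excess energy fraction = {excess_fraction:.2%}")
```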

Keywords: excess energy content, small-scale wind, turbulence intensity, urban wind energy, wind resource assessment

Procedia PDF Downloads 475
1212 Combining in vitro Protein Expression with AlphaLISA Technology to Study Protein-Protein Interaction

Authors: Shayli Varasteh Moradi, Wayne A. Johnston, Dejan Gagoski, Kirill Alexandrov

Abstract:

The demand for rapid and more efficient techniques to identify protein-protein interactions, particularly in the areas of therapeutics and diagnostics development, is growing. The method described here is a rapid in vitro protein-protein interaction analysis approach based on AlphaLISA technology combined with the Leishmania tarentolae cell-free protein production (LTE) system. Cell-free protein synthesis allows the rapid production of recombinant proteins in a multiplexed format. Among available in vitro expression systems, LTE offers several advantages over other eukaryotic cell-free systems: it is based on a fast-growing, fermentable organism that is inexpensive to cultivate and to prepare lysate from. The high integrity of proteins produced in this system and the ability to co-express multiple proteins make it a desirable method for screening protein interactions. Following the translation of protein pairs in the LTE system, the physical interaction between the proteins of interest is analysed by the AlphaLISA assay. The assay is performed using the unpurified in vitro translation reaction and can therefore be readily multiplexed. This approach can be used in various research applications such as epitope mapping, antigen-antibody analysis, and protein interaction network mapping. The intra-viral protein interaction network of Zika virus was studied using the developed technique. The viral proteins were co-expressed pairwise in LTE, and all possible interactions among the viral proteins were tested using AlphaLISA. The assay resulted in the identification of 54 intra-viral protein-protein interactions, of which 19 binary interactions were found to be novel. The presented technique provides a powerful tool for rapid analysis of protein-protein interactions with high sensitivity and throughput.

Keywords: AlphaLISA technology, cell-free protein expression, epitope mapping, Leishmania tarentolae, protein-protein interaction

Procedia PDF Downloads 239
1211 3D Electromagnetic Mapping of the Signal Strength in Long Term Evolution Technology in the Livestock Department of ESPOCH

Authors: Cinthia Campoverde, Mateo Benavidez, Victor Arias, Milton Torres

Abstract:

This article focuses on the 3D electromagnetic mapping of the intensity of the signal received by a mobile antenna within the open areas of the Department of Livestock of the Escuela Superior Politecnica de Chimborazo (ESPOCH), located in the city of Riobamba, Ecuador. The transmitting antenna belongs to the mobile telephone company "TUENTI" and is analyzed in the 2 GHz band, operating at a frequency of 1940 MHz using Long Term Evolution (LTE). Received signal power data in the area were measured empirically using the "Network Cell Info" application. A total of 170 samples were collected, distributed in 19 concentric circles around the base station. Three campaigns were carried out at the same time, with similar traffic, and average values were obtained at each point, varying between -65.33 dBm and -101.67 dBm. The two virtualization software packages used are Sketchup and Unreal. Finally, the virtualized environment was visualized through virtual reality using Oculus 3D glasses, where the power levels are displayed according to a range of powers.
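A sketch of the per-point averaging step: three campaign readings at one measurement point are combined, shown both as a plain dBm average and as an average taken in linear power (mW) and converted back to dBm. The readings are invented examples, and the choice between the two averaging conventions is an assumption rather than the paper's stated procedure.

```python
# Average repeated dBm readings at one point, in dBm and in linear power.
import math

readings_dbm = [-78.0, -81.5, -75.2]   # same point, three campaigns (toy values)

plain_avg = sum(readings_dbm) / len(readings_dbm)
linear_mw = [10 ** (p / 10) for p in readings_dbm]          # dBm -> mW
linear_avg_dbm = 10 * math.log10(sum(linear_mw) / len(linear_mw))
print(f"arithmetic dBm average: {plain_avg:.2f} dBm")
print(f"average in linear power: {linear_avg_dbm:.2f} dBm")
```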

Keywords: reception power, LTE technology, virtualization, virtual reality, power levels

Procedia PDF Downloads 93
1210 Climate Change and Landslide Risk Assessment in Thailand

Authors: Shotiros Protong

Abstract:

The incidence of sudden landslides in Thailand during the past decade has increased in frequency and severity. It is necessary to focus on the principal parameters used for analysis, such as land cover and land use, rainfall values, soil characteristics, and the digital elevation model (DEM). The combination of intense rainfall and severe monsoons is increasing due to global climate change. Landslide occurrences rapidly increase during intense rainfall, especially in the rainy season in Thailand, which usually starts around mid-May and ends in the middle of October. The rain-triggered landslide hazard analysis is the focus of this research. The combination of geotechnical and hydrological data is used to determine permeability, conductivity, bedding orientation, overburden, and the presence of loose blocks. The regional landslide hazard mapping is developed using the Stability Index Mapping (SINMAP) model supported in ArcGIS software version 10.1. Geological and land use data are used to define the probability of landslide occurrences in terms of geotechnical data. The geological data can indicate the shear strength and the angle of friction values for soils above given rock types, which leads to the general applicability of the approach for landslide hazard analysis. To address the research objectives, the following methods are described in this study: setup and calibration of the SINMAP model, sensitivity of the SINMAP model, geotechnical laboratory testing, landslide assessment under the present-day calibration, and landslide assessment under the future climate simulation scenarios A2 and B2. In terms of hydrological data, average rainfall in millimetres per twenty-four hours is used to assess the rain-triggered landslide hazard in the slope stability mapping. The period 1954-2012 is used as the baseline of rainfall data for the present-day calibration. For climate change in Thailand, future climate scenarios are simulated at spatial and temporal scales. The precipitation impact needs to be predicted for the future climate; the Statistical DownScaling Model (SDSM) version 4.2 is used to simulate the future-change scenario between latitude 16° 26' and 18° 37' north and between longitude 98° 52' and 103° 05' east. The research allows the mapping of risk parameters for landslide dynamics and indicates the spatial and temporal trends of landslide occurrences. Thus, regional landslide hazard mapping is carried out under present-day climatic conditions from 1954 to 2012 and under simulations of climate change based on GCM scenarios A2 and B2 from 2013 to 2099, related to the threshold rainfall values for the selected study area in Uttaradit province in the northern part of Thailand. Finally, the landslide hazard mapping for the present and for the future under climate simulation scenarios A2 and B2 will be compared by area (km²) in Uttaradit province.
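A hedged sketch of the standard infinite-slope factor of safety that stability-index approaches such as SINMAP build on, FS = [c' + (γz − γ_w z_w) cos²β tanφ'] / (γz sinβ cosβ). The soil parameters, slope angle, and water-table depths below are illustrative assumptions, not the study's calibrated values or SINMAP's dimensionless formulation.

```python
# Infinite-slope factor of safety for a dry vs. nearly saturated profile (toy inputs).
import math

def factor_of_safety(c_kpa, phi_deg, slope_deg, z_m, zw_m, gamma=18.0, gamma_w=9.81):
    """FS from cohesion c' (kPa), friction angle phi', slope beta, soil depth z,
    water-table height above the slip surface zw, and unit weights (kN/m3)."""
    beta, phi = math.radians(slope_deg), math.radians(phi_deg)
    resisting = c_kpa + (gamma * z_m - gamma_w * zw_m) * math.cos(beta) ** 2 * math.tan(phi)
    driving = gamma * z_m * math.sin(beta) * math.cos(beta)
    return resisting / driving

print(round(factor_of_safety(5.0, 32.0, 30.0, z_m=2.0, zw_m=0.0), 2))   # dry slope
print(round(factor_of_safety(5.0, 32.0, 30.0, z_m=2.0, zw_m=1.8), 2))   # wet slope, FS < 1
```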

Keywords: landslide hazard, GIS, slope stability index (SINMAP), landslides, Thailand

Procedia PDF Downloads 564
1209 Description of Decision Inconsistency in Intertemporal Choices and Representation of Impatience as a Reflection of Irrationality: Consequences in the Field of Personalized Behavioral Finance

Authors: Roberta Martino, Viviana Ventre

Abstract:

Empirical evidence has, over time, confirmed that the behavior of individuals is inconsistent with the descriptions provided by the Discounted Utility Model, an essential reference for calculating the utility of intertemporal prospects. The model assumes that individuals calculate the utility of intertemporal prospectuses by adding up the values of all outcomes obtained by multiplying the cardinal utility of the outcome by the discount function estimated at the time the outcome is received. The trend of the discount function is crucial for the preferences of the decision maker because it represents the perception of the future, and its trend causes temporally consistent or temporally inconsistent preferences. In particular, because different formulations of the discount function lead to various conclusions in predicting choice, the descriptive ability of models with a hyperbolic trend is greater than linear or exponential models. Suboptimal choices from any time point of view are the consequence of this mechanism, the psychological factors of which are encapsulated in the discount rate trend. In addition, analyzing the decision-making process from a psychological perspective, there is an equivalence between the selection of dominated prospects and a degree of impatience that decreases over time. The first part of the paper describes and investigates the anomalies of the discounted utility model by relating the cognitive distortions of the decision-maker to the emotional factors that are generated during the evaluation and selection of alternatives. Specifically, by studying the degree to which impatience decreases, it’s possible to quantify how the psychological and emotional mechanisms of the decision-maker result in a lack of decision persistence. In addition, this description presents inconsistency as the consequence of an inconsistent attitude towards time-delayed choices. The second part of the paper presents an experimental phase in which we show the relationship between inconsistency and impatience in different contexts. Analysis of the degree to which impatience decreases confirms the influence of the decision maker's emotional impulses for each anomaly in the utility model discussed in the first part of the paper. This work provides an application in the field of personalized behavioral finance. Indeed, the numerous behavioral diversities, evident even in the degrees of decrease in impatience in the experimental phase, support the idea that optimal strategies may not satisfy individuals in the same way. With the aim of homogenizing the categories of investors and to provide a personalized approach to advice, the results proven in the experimental phase are used in a complementary way with the information in the field of behavioral finance to implement the Analytical Hierarchy Process model in intertemporal choices, useful for strategic personalization. In the construction of the Analytic Hierarchy Process, the degree of decrease in impatience is understood as reflecting irrationality in decision-making and is therefore used for the construction of weights between anomalies and behavioral traits.
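A sketch of the discounting machinery discussed above: the Discounted Utility Model values a delayed outcome as D(t)·u(x), where an exponential D(t) = δ^t yields time-consistent preferences while a hyperbolic D(t) = 1/(1 + kt) permits reversals. The parameter values and payoffs are illustrative, chosen only to make the reversal visible.

```python
# Exponential vs. hyperbolic discounting and the resulting preference reversal.
def exponential(t, delta=0.9):
    return delta ** t

def hyperbolic(t, k=0.35):
    return 1.0 / (1.0 + k * t)

def prefers_larger_later(discount, small=100, t_small=0, large=120, t_large=1):
    """True if the discounted larger-later reward beats the smaller-sooner one."""
    return large * discount(t_large) > small * discount(t_small)

for shift in (0, 10):   # evaluate the same pair now vs. pushed 10 periods into the future
    choice_h = prefers_larger_later(hyperbolic, t_small=shift, t_large=shift + 1)
    choice_e = prefers_larger_later(exponential, t_small=shift, t_large=shift + 1)
    print(f"delay {shift:2d}: hyperbolic picks larger-later = {choice_h}, "
          f"exponential = {choice_e}")
# The hyperbolic agent switches from smaller-sooner to larger-later as the whole
# pair moves away in time -- the inconsistency the paper links to impatience.
```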

Keywords: analytic hierarchy process, behavioral finance, financial anomalies, impatience, time inconsistency

Procedia PDF Downloads 68
1208 Smart Lean Manufacturing in the Context of Industry 4.0: A Case Study

Authors: M. Ramadan, B. Salah

Abstract:

This paper introduces a framework to digitalize lean manufacturing tools to enhance smart lean-based manufacturing environments or Lean 4.0 manufacturing systems. The paper discusses the integration between lean tools and the powerful features of recent real-time data capturing systems with the help of Information and Communication Technologies (ICT) to develop an intelligent real-time monitoring and controlling system of production operations concerning lean targets. This integration is represented in the Lean 4.0 system called Dynamic Value Stream Mapping (DVSM). Moreover, the paper introduces the practice of Radio Frequency Identification (RFID) and ICT to smartly support lean tools and practices during daily production runs to keep the lean system alive and effective. This work introduces a practical description of how the lean method tools 5S, standardized work, and poka-yoke can be digitalized and smartly monitored and controlled through DVSM. A framework of the three tools has been discussed and put into practice in a German switchgear manufacturer.
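A sketch of the real-time control idea behind DVSM as described above: RFID timestamps at a station's entry and exit give cycle times that are checked against takt time. The station names, timestamps, tag IDs, and takt value are illustrative assumptions, not the switchgear manufacturer's implementation.

```python
# Derive cycle times from RFID in/out events and flag deviations from takt time.
from datetime import datetime

takt_seconds = 90.0
rfid_events = [  # (tag_id, station, timestamp) as they might stream in (toy data)
    ("TAG-01", "station_in", datetime(2024, 1, 8, 9, 0, 0)),
    ("TAG-01", "station_out", datetime(2024, 1, 8, 9, 1, 45)),
    ("TAG-02", "station_in", datetime(2024, 1, 8, 9, 2, 0)),
    ("TAG-02", "station_out", datetime(2024, 1, 8, 9, 3, 10)),
]

entries = {}
for tag, station, ts in rfid_events:
    if station == "station_in":
        entries[tag] = ts
    else:
        cycle = (ts - entries.pop(tag)).total_seconds()
        flag = "OK" if cycle <= takt_seconds else "ALERT: above takt"
        print(f"{tag}: cycle = {cycle:.0f} s ({flag})")
```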

Keywords: lean manufacturing, Industry 4.0, radio frequency identification, value stream mapping

Procedia PDF Downloads 229
1207 Incorporating Adult Learners’ Interests into Learning Styles: Enhancing Education for Lifelong Learners

Authors: Christie DeGregorio

Abstract:

In today's rapidly evolving educational landscape, adult learners are becoming an increasingly significant demographic. These individuals often possess a wealth of life experiences and diverse interests that can greatly influence their learning styles. Recognizing and incorporating these interests into educational practices can lead to enhanced engagement, motivation, and overall learning outcomes for adult learners. This essay aims to explore the significance of incorporating adult learners' interests into learning styles and provide an overview of the methodologies used in related studies. When investigating the incorporation of adult learners' interests into learning styles, researchers have employed various methodologies to gather valuable insights. These methodologies include surveys, interviews, case studies, and classroom observations. Surveys and interviews allow researchers to collect self-reported data directly from adult learners, providing valuable insights into their interests, preferences, and learning styles. Case studies offer an in-depth exploration of individual adult learners, highlighting how their interests can be integrated into personalized learning experiences. Classroom observations provide researchers with a firsthand understanding of the dynamics between adult learners' interests and their engagement within a learning environment. The major findings from studies exploring the incorporation of adult learners' interests into learning styles reveal the transformative impact of this approach. Firstly, aligning educational content with adult learners' interests increases their motivation and engagement in the learning process. By connecting new knowledge and skills to topics they are passionate about, adult learners become active participants in their own education. Secondly, integrating interests into learning styles fosters a sense of relevance and applicability. Adult learners can see the direct connection between the knowledge they acquire and its real-world applications, which enhances their ability to transfer learning to various contexts. Lastly, personalized learning experiences tailored to individual interests enable adult learners to take ownership of their educational journey, promoting lifelong learning habits and self-directedness.

Keywords: integration, personalization, transferability, learning style

Procedia PDF Downloads 75
1206 The Making of a Community: Perception versus Reality of Neighborhood Resources

Authors: Kirstie Smith

Abstract:

This paper elucidates the value of neighborhood perception as it contributes to the advancement of well-being for individuals and families within a neighborhood. Through in-depth interviews with city residents, this paper examines the degree to which key stakeholders (residents) evaluate their neighborhood and perceive its resources, and how they identify, access, and utilize local assets existing in the community. Additionally, the research objective included conducting a community inventory that qualified the community assets and resources of lower-income neighborhoods of a medium-sized industrial city. Analysis of the community's assets was compared with the interview results to allow for a better understanding of the community's condition. Community mapping revealed that the key informants' reflections on assets were only somewhat validated: in each neighborhood, there were more assets mapped than reported in the interviews. Another chief finding drawn from this study was the identification of key development partners and social networks that offer the potential to facilitate locally driven community development. Overall, the participants provided invaluable local knowledge of the perception of neighborhood assets, the well-being of residents, the condition of the community, and suggestions for responding to the challenges of the entire community in order to mobilize the present assets and networks.

Keywords: community mapping, family, resource allocation, social networks

Procedia PDF Downloads 354
1205 Real-Time Spatial Mapping of Metal Contamination in Environmental Waters for Sustainable Ecological Monitoring Using a Portable X-Ray Fluorescence Device

Authors: Mikhail Sandzhiev

Abstract:

The monitoring of metal pollution in environmental waters is crucial for the protection of ecosystems, human health, and agricultural activities. Traditional laboratory-based metal analysis methods are time-consuming and expensive, which often delays the availability of information. This study presents an approach to real-time water quality monitoring using portable X-ray fluorescence (p-XRF) technology coupled with geographic information systems (GIS). Using a custom Python script, p-XRF data are processed and formatted into a GIS-compatible format, facilitating spatial visualization of metal concentrations in QGIS. Field-usable filters, in particular bisphosphonate-functionalized thermally carbonized porous silicon (BP-TCPSi), preconcentrate metals such as Mn, Ni, Cu, Zn, and Pb, allowing direct detection in the field with the p-XRF device. Key objectives include robust data collection, spatial visualization, and validation processes to ensure accuracy and efficiency. This approach provides quick and efficient insight into metal contamination trends and supports proactive decision-making.
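
For illustration only, the following minimal Python sketch shows one way p-XRF field readings could be converted into a GeoJSON layer loadable in QGIS; the column names, metal list, and file paths are assumptions for demonstration and do not reproduce the authors' custom script.

# Hypothetical sketch: convert p-XRF field readings (a CSV with GPS coordinates
# and metal concentrations) into a GeoJSON layer that QGIS can load directly.
# Column names and file paths are illustrative assumptions.
import csv
import json

def pxrf_csv_to_geojson(csv_path: str, geojson_path: str) -> None:
    features = []
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            features.append({
                "type": "Feature",
                "geometry": {
                    "type": "Point",
                    # GeoJSON expects [longitude, latitude]
                    "coordinates": [float(row["lon"]), float(row["lat"])],
                },
                # Metal concentrations become attributes for styling in QGIS
                "properties": {m: float(row[m]) for m in ("Mn", "Ni", "Cu", "Zn", "Pb")},
            })
    with open(geojson_path, "w") as f:
        json.dump({"type": "FeatureCollection", "features": features}, f, indent=2)

if __name__ == "__main__":
    pxrf_csv_to_geojson("pxrf_readings.csv", "metal_concentrations.geojson")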

Keywords: metal concentrations, predictive mapping, environmental monitoring, environmental waters

Procedia PDF Downloads 2
1204 Hydrographic Mapping Based on the Concept of Fluvial-Geomorphological Auto-Classification

Authors: Jesús Horacio, Alfredo Ollero, Víctor Bouzas-Blanco, Augusto Pérez-Alberti

Abstract:

Rivers have traditionally been classified, assessed and managed in terms of hydrological, chemical and/or biological criteria. Geomorphological classifications played a secondary role in the past, although proposals such as the River Styles Framework, the Catchment Baseline Survey or the Stroud Rural Sustainable Drainage Project did incorporate geomorphology into management decision-making. In recent years, many studies have turned to the geomorphological component. The geomorphological processes and their associated forms determine the structure of a river system, and understanding these processes and forms is a critical component of the sustainable rehabilitation of aquatic ecosystems. The fluvial auto-classification approach suggests that a river is a self-built natural system whose processes and forms effectively preserve its ecological function (hydrological, sedimentological and biological regimes). Fluvial systems are formed by a wide range of elements with multiple non-linear interactions on different spatial and temporal scales. Moreover, the fluvial auto-classification concept is built from data of the river itself, so that each classification developed is specific to the river studied. The variables used in the classification are specific stream power and mean grain size. A discriminant analysis showed that these variables best characterize the processes and forms involved, and the statistical technique yields an individual discriminant equation for each geomorphological type. The geomorphological classification was developed using sites with a high degree of naturalness; each site is a control point of high ecological and geomorphological quality. Changes in the conditions of the control points are therefore quickly recognizable, making it possible to apply appropriate management measures to recover the geomorphological type. The study focused on Galicia (NW Spain), and the mapping was produced by analyzing 122 control points (sites) distributed over eight river basins. In sum, this study provides a method for fluvial geomorphological classification that works as an open and flexible tool based on the fluvial auto-classification concept. The hydrographic mapping is the visual expression of the results, such that each river has a particular map according to its geomorphological characteristics. Each geomorphological type is represented by a particular type of hydraulic geometry (channel width, width-depth ratio, hydraulic radius, etc.), and an alteration of this geometry is indicative of a geomorphological disturbance, whether natural or anthropogenic. Hydrographic mapping is also dynamic, because its meaning changes if the specific stream power and/or the mean grain size, and therefore the values of the discriminant equations, are modified. The researcher must check some of the control points annually; this procedure allows the geomorphological quality of the rivers to be monitored and any alterations to be detected. The maps are useful to researchers and managers, especially for conservation work and river restoration.
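
As a hedged illustration of the classification step described above, the short Python sketch below trains a linear discriminant classifier on the two variables, specific stream power and mean grain size, and assigns a new site to a geomorphological type; the training values and type labels are invented placeholders, not the 122 Galician control points.

# Minimal sketch of the classification idea: assign each river site to a
# geomorphological type from two variables via discriminant analysis.
# All numbers and labels below are illustrative placeholders.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

# Each row: [specific stream power (W/m^2), mean grain size (mm)]
X_train = np.array([[15.0, 2.0], [18.0, 3.5], [60.0, 25.0],
                    [75.0, 30.0], [160.0, 90.0], [180.0, 110.0]])
y_train = np.array(["type_A", "type_A", "type_B", "type_B", "type_C", "type_C"])

lda = LinearDiscriminantAnalysis()
lda.fit(X_train, y_train)

# A new control point measured in the field
new_site = np.array([[70.0, 28.0]])
print(lda.predict(new_site))        # predicted geomorphological type
print(lda.predict_proba(new_site))  # class membership probabilities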

Keywords: fluvial auto-classification concept, mapping, geomorphology, river

Procedia PDF Downloads 367
1203 A Method for Compression of Short Unicode Strings

Authors: Masoud Abedi, Abbas Malekpour, Peter Luksch, Mohammad Reza Mojtabaei

Abstract:

The use of short texts in communication has greatly increased in recent years. Using different languages in short texts makes Unicode strings compulsory. These strings need twice the space of common strings; hence, applying compression algorithms to accelerate transmission and reduce cost is worthwhile. However, general-purpose compression methods such as gzip, bzip2 or PAQ are not appropriate because of their high overhead relative to the data size. The Huffman algorithm is one of the rare algorithms effective in reducing the size of short Unicode strings. In this paper, an algorithm is proposed for the compression of very short Unicode strings. First, every new character to be sent to a destination is inserted into the proposed mapping table; at the beginning, every character is new. If a character has already been sent to the same destination, it is not considered a new character. Next, the new characters, together with the mapping values of the repeated characters, are arranged through a specific technique and specially formatted for transmission. The results obtained from an assessment on a set of short Persian and Arabic strings indicate that the proposed algorithm outperforms the Huffman algorithm in size reduction.
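
To make the mapping-table idea concrete, the following Python sketch replaces characters already sent to a destination with small indices kept in a per-destination table, while genuinely new characters are sent literally and added to the table; the token format shown is an assumption for illustration and does not reproduce the paper's bit-level encoding.

# Illustrative sketch of the per-destination mapping-table idea: repeated
# characters are encoded as indices, new characters are sent literally.
from typing import Dict, List, Tuple

def encode(text: str, table: Dict[str, int]) -> List[Tuple[str, object]]:
    tokens: List[Tuple[str, object]] = []
    for ch in text:
        if ch in table:                  # repeated character: send its index
            tokens.append(("idx", table[ch]))
        else:                            # new character: send literally, remember it
            table[ch] = len(table)
            tokens.append(("new", ch))
    return tokens

def decode(tokens: List[Tuple[str, object]], table: Dict[int, str]) -> str:
    out = []
    for kind, value in tokens:
        if kind == "new":
            table[len(table)] = value    # mirror the sender's table
            out.append(value)
        else:
            out.append(table[value])
    return "".join(out)

sender_table: Dict[str, int] = {}
receiver_table: Dict[int, str] = {}
msg = encode("سلام سلام", sender_table)   # short Persian sample string
print(decode(msg, receiver_table))        # -> "سلام سلام"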

Keywords: algorithms, data compression, decoding, encoding, Huffman codes, text communication

Procedia PDF Downloads 349
1202 Accelerating Personalization Using Digital Tools to Drive Circular Fashion

Authors: Shamini Dhana, G. Subrahmanya VRK Rao

Abstract:

The fashion industry is advancing towards a mindset of zero waste, personalization, creativity, and circularity. The trend of upcycling clothing and materials into personalized fashion is being demanded by the next generation, and a digital tool is needed to accelerate the process towards mass customization. Dhana’s D/Sphere fashion technology platform uses digital tools to accelerate upcycling. In essence, advanced fashion garments can be designed and developed through reuse, repurposing, and recreation activities, using existing fabric and circulating materials. The D/Sphere platform has the following objectives: to provide (1) an opportunity to develop modern fashion using existing, finished materials and clothing without chemicals or water consumption; (2) the potential for everyday customers and designers to use the medium of fashion for creative expression; (3) a solution to address the global textile waste generated by pre- and post-consumer fashion; (4) a solution to reduce carbon emissions and water and energy consumption with the participation of all stakeholders; (5) an opportunity for brands, manufacturers, and retailers to work towards zero-waste designs and an alternative revenue stream. Other benefits of this approach include sustainability metrics, trend prediction, facilitation of disassembly and remanufacture, and the use of deep learning and hyperheuristics for high accuracy. A design tool for mass personalization and customization utilizing existing circulating materials and deadstock, targeted at fashion stakeholders, will lower environmental costs, increase revenues through up-to-date upcycled apparel, produce less textile waste during the cut-sew-stitch process, and provide a real design solution for the end customer to be part of circular fashion. The broader impact of this technology will be a different mindset towards circular fashion, increasing the value of the product through multiple life cycles, finding alternatives towards zero waste, and reducing the textile waste that ends up in landfills. This technology platform will be of interest to brands and companies that have a responsibility to reduce their environmental impact and contribution to climate change as it pertains to the fashion and apparel industry. Today, over 70% of the output of the $3 trillion fashion and apparel industry ends up in landfills. To this extent, the industry needs such alternative techniques to address global textile waste while providing an opportunity to include all stakeholders and drive circular fashion with new personalized products. This type of modern systems thinking is currently being explored around the world by the private sector, organizations, research institutions, and governments. This technological innovation using digital tools has the potential to revolutionize the way we look at communication, capabilities, and collaborative opportunities amongst stakeholders in the development of new personalized and customized products, as well as its positive impacts on society, our environment, and global climate change.

Keywords: circular fashion, deep learning, digital technology platform, personalization

Procedia PDF Downloads 66
1201 A Robust Visual Simultaneous Localization and Mapping for Indoor Dynamic Environment

Authors: Xiang Zhang, Daohong Yang, Ziyuan Wu, Lei Li, Wanting Zhou

Abstract:

Visual Simultaneous Localization and Mapping (VSLAM) uses cameras to collect information in unknown environments in order to perform localization and environment map construction simultaneously, and it has a wide range of applications in autonomous driving, virtual reality, and other related fields. Current VSLAM systems can maintain high accuracy in static environments. In dynamic environments, however, moving objects in the scene reduce the stability of the VSLAM system, resulting in inaccurate localization and mapping, or even failure. In this paper, a robust VSLAM method is proposed to deal effectively with this problem. We propose a dynamic region removal scheme based on semantic segmentation neural networks and geometric constraints. First, a semantic segmentation neural network is used to extract prior active-motion regions, prior static regions, and prior passive-motion regions in the environment. Then, a lightweight frame-tracking module initializes the pose transformation between the previous frame and the current frame on the prior static regions. A motion-consistency detection module based on multi-view geometry and scene flow then divides the environment into static and dynamic regions, so that dynamic object regions are eliminated. Finally, only the static regions are used by the tracking thread. Our research is based on the ORBSLAM3 system, one of the most effective VSLAM systems available. We evaluated our method on the TUM RGB-D benchmark, and the results demonstrate that the proposed VSLAM method improves the accuracy of the original ORBSLAM3 by 70% to 98.5% in highly dynamic environments.
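
As a rough illustration of the dynamic-region filtering described above (not the authors' implementation), the Python sketch below discards keypoint matches that fall inside prior-dynamic semantic classes or that violate an epipolar motion-consistency check before they reach the tracking stage; the class IDs, thresholds, and mask source are assumptions for demonstration.

# Sketch of dynamic-region filtering: keypoints inside "prior dynamic" semantic
# classes (e.g. people) or inconsistent with the epipolar geometry are dropped.
import numpy as np

PRIOR_DYNAMIC_CLASSES = {15}  # e.g. "person" in a PASCAL-VOC-style label map (assumption)

def epipolar_error(pt_prev, pt_curr, F):
    """Distance of the current point from the epipolar line of its match."""
    p1 = np.array([pt_prev[0], pt_prev[1], 1.0])
    p2 = np.array([pt_curr[0], pt_curr[1], 1.0])
    line = F @ p1                                  # epipolar line in the current frame
    return abs(p2 @ line) / np.hypot(line[0], line[1])

def filter_static_keypoints(kps_prev, kps_curr, seg_mask, F, err_thresh=1.0):
    """Keep matches outside prior-dynamic regions that are motion-consistent."""
    static = []
    for p_prev, p_curr in zip(kps_prev, kps_curr):
        u, v = int(p_curr[0]), int(p_curr[1])
        if seg_mask[v, u] in PRIOR_DYNAMIC_CLASSES:
            continue                               # inside a prior dynamic object
        if epipolar_error(p_prev, p_curr, F) > err_thresh:
            continue                               # inconsistent with camera motion
        static.append((p_prev, p_curr))
    return static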

Keywords: dynamic scene, dynamic visual SLAM, semantic segmentation, scene flow, VSLAM

Procedia PDF Downloads 118