Search results for: third-level digital divide
135 The Rite of Jihadification in ISIS Modified Video Games: Mass Deception and Dialectic of Religious Regression in Technological Progression
Authors: Venus Torabi
Abstract:
ISIS, the terrorist organization, modified two videogames, ARMA III and Grand Theft Auto 5 (2013), as means of online recruitment and ideological propaganda. The urge to study the mechanism at work, whether it has been successful or not, drives (Digital) Humanities experts to explore how codes of terror, Islamic ideology and recruitment strategies are incorporated into the ludic mechanics of videogames. Another aspect of the significance lies in the fact that this is a latent problem that has not been fully addressed in an interdisciplinary framework prior to this study, to the best of the researcher’s knowledge. Therefore, due to the complexity of the subject, the present paper draws on game studies and on philosophical and religious perspectives to form the methodology of the research. As a contextualized epistemology of such exploitation of videogames, the core argument builds on the notion of “Culture Industry” proposed by Theodor W. Adorno and Max Horkheimer in Dialectic of Enlightenment (2002). This article posits that the ideological underpinnings of ISIS’s cause, corroborated by the action-bound mechanics of the videogames, are in line with Islamic Eschatology as a furnishing ground and an excuse for exercising terrorism. It is an account of ISIS’s modification of the videogames, a tool of technological progression, to practice online radicalization. Dialectically, this practice is packaged in a rhetoric that invokes a religious myth (the advent of a savior), a hallmark of regression. The study puts forth that ISIS’s wreaking havoc on the world, both in reality and within action videogames, negotiates a process of self-assertion in the players of such videogames (who assume the role of a terrorist) that leads to self-annihilation. It tries to unfold how ludic Mod videogames are misused as tools of mass deception towards ethnic cleansing in reality, in line with the distorted Eschatological myth. To conclude, this study posits videogames to be a new avenue of mass deception in the framework of the Culture Industry. Yet, this emerges as a two-edged sword of mass deception in ISIS’s modification of videogames. It shows that ISIS is not only trying to hijack minds through online/ludic recruitment; it potentially deceives Muslim communities, or those prone to radicalization, into believing that its terrorist practices are preparing the world for the advent of a religious savior based on Islamic Eschatology. This is to claim that the harsh actions of the videogames are potentially seeding minds with terrorist propaganda and numbing them to violence. The real world becomes an extension of that harsh virtual environment in a ludic/actual continuum, an extension that contributes to the mass deception mechanism of the terrorists in a clandestine trend.
Keywords: culture industry, dialectic, ISIS, islamic eschatology, mass deception, video games
134 Consumer Cognitive Models of Vaccine Attitudes: Behavioral Informed Strategies Promoting Vaccination Policy in Greece
Authors: Halkiopoulos Constantinos, Koutsopoulou Ioanna, Gkintoni Evgenia, Antonopoulou Hera
Abstract:
Immunization appears to be an essential part of health care service in times of pandemics such as covid-19 and aims not only to protect the health of the population but also the health and sustainability of the economies of the countries affected. It is reported that more than 3.44 billion doses have been administered so far, which accounts for 45 doses per 100 people. Vaccination programs in various countries have been promoted and accepted by people differently and therefore have proceeded in different ways and at different speeds, with most countries directing them first towards people with vulnerable, chronic or recent health statuses. Large-scale restriction measures or lockdowns, personal protection measures such as masks and gloves, and a decrease in leisure and sports activities were also implemented around the world as part of the health protection strategies against the covid-19 pandemic. This research aims to present an analysis of variations in people’s attitudes towards vaccination based on demographic, social and epidemiological characteristics and health status on the one hand, and perception of health, health satisfaction, pain, and quality of life on the other hand. 1,500 Greek e-consumers, recruited mainly through social media, voluntarily took part in an online survey. The questionnaires included demographic, social and medical characteristics of the participants, and questions about people’s willingness to be vaccinated and their opinion on whether there should be a vaccine against covid-19. Other stressor factors were also reported in the questionnaires, such as participants’ loss of someone close due to covid-19 or home quarantine due to covid-19 infection. WHOQOL-BREF and the GLOBAL PSYCHOTRAUMA SCREEN (GPS) were used with kind permission from the WHO and from the International Society for Traumatic Stress Studies in this study. Attitudes towards vaccination varied significantly in relation to age, level of education, health status and consumer behavior. Health professionals’ attitudes also varied in relation to age, level of education, profession, health status and consumer needs. Vaccines have been the most common technological aid of human civilization so far in the fight against viruses. The results of this study can be used by health managers and digital marketers of pharmaceutical companies, as well as other staff involved in vaccination programs, and for designing health policy immunization strategies during pandemics in order to achieve positive attitudes towards vaccination and larger populations being vaccinated in shorter periods of time after the outbreak of a pandemic. Health staff needs to be trained, aided and supervised to carry out vaccination programs and to be protected through vaccination programs themselves. Feedback on each country’s vaccination program, and its setbacks, deficiencies and delays, should be addressed and worked out.
Keywords: consumer behavior, cognitive models, vaccination policy, pandemic, Covid-19, Greece
133 The Influence of Microsilica on the Cluster Cracks' Geometry of Cement Paste
Authors: Maciej Szeląg
Abstract:
The changing nature of environmental impacts, in which cement composites are operating, are causing in the structure of the material a number of phenomena, which result in volume deformation of the composite. These strains can cause composite cracking. Cracks are merging by propagation or intersect to form a characteristic structure of cracks known as the cluster cracks. This characteristic mesh of cracks is crucial to almost all building materials, which are working in service loads conditions. Particularly dangerous for a cement matrix is a sudden load of elevated temperature – the thermal shock. Resulting in a relatively short period of time a large value of a temperature gradient between the outer surface and the material’s interior can result in cracks formation on the surface and in the volume of the material. In the paper, in order to analyze the geometry of the cluster cracks of the cement pastes, the image analysis tools were used. Tested were 4 series of specimens made of two different Portland cement. In addition, two series include microsilica as a substitute for the 10% of the cement. Within each series, specimens were performed in three w/b indicators (water/binder): 0.4; 0.5; 0.6. The cluster cracks were created by sudden loading the samples by elevated temperature of 250°C. Images of the cracked surfaces were obtained via scanning at 2400 DPI. Digital processing and measurements were performed using ImageJ v. 1.46r software. To describe the structure of the cluster cracks three stereological parameters were proposed: the average cluster area - A ̅, the average length of cluster perimeter - L ̅, and the average opening width of a crack between clusters - I ̅. The aim of the study was to identify and evaluate the relationships between measured stereological parameters, and the compressive strength and the bulk density of the modified cement pastes. The tests of the mechanical and physical feature have been carried out in accordance with EN standards. The curves describing the relationships have been developed using the least squares method, and the quality of the curve fitting to the empirical data was evaluated using three diagnostic statistics: the coefficient of determination – R2, the standard error of estimation - Se, and the coefficient of random variation – W. The use of image analysis allowed for a quantitative description of the cluster cracks’ geometry. Based on the obtained results, it was found a strong correlation between the A ̅ and L ̅ – reflecting the fractal nature of the cluster cracks formation process. It was noted that the compressive strength and the bulk density of cement pastes decrease with an increase in the values of the stereological parameters. It was also found that the main factors, which impact on the cluster cracks’ geometry are the cement particles’ size and the general content of the binder in a volume of the material. The microsilica caused the reduction in the A ̅, L ̅ and I ̅ values compared to the values obtained by the classical cement paste’s samples, which is caused by the pozzolanic properties of the microsilica.Keywords: cement paste, cluster cracks, elevated temperature, image analysis, microsilica, stereological parameters
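As an editorial illustration of the stereological measurements and the least-squares curve fitting described in the abstract above, the following minimal Python sketch computes the average cluster area and perimeter from a binarized crack image and fits a curve with the three diagnostic statistics (R², Se, W). It assumes NumPy, SciPy and scikit-image are available; the model form, variable names and synthetic data are illustrative, not taken from the study.

```python
import numpy as np
from scipy.optimize import curve_fit
from skimage.measure import label, regionprops

def cluster_statistics(binary_clusters):
    """Average cluster area (A) and perimeter (L) from a binary image
    in which cracks are 0 and cluster interiors are 1."""
    labeled = label(binary_clusters)
    props = regionprops(labeled)
    areas = np.array([p.area for p in props], dtype=float)
    perimeters = np.array([p.perimeter for p in props], dtype=float)
    return areas.mean(), perimeters.mean()

def fit_with_diagnostics(x, y, model):
    """Least-squares fit plus R^2, standard error of estimation Se,
    and coefficient of random variation W (Se relative to the mean of y)."""
    popt, _ = curve_fit(model, x, y)
    residuals = y - model(x, *popt)
    ss_res = np.sum(residuals ** 2)
    ss_tot = np.sum((y - y.mean()) ** 2)
    r2 = 1.0 - ss_res / ss_tot
    se = np.sqrt(ss_res / (len(y) - len(popt)))   # standard error of estimation
    w = se / y.mean() * 100.0                     # coefficient of random variation, %
    return popt, r2, se, w

if __name__ == "__main__":
    # Synthetic example: compressive strength decreasing with average cluster area.
    power_law = lambda x, a, b: a * np.power(x, b)
    A_mean = np.array([12.0, 18.0, 25.0, 33.0, 41.0])      # mm^2, illustrative
    strength = np.array([52.0, 47.0, 41.0, 36.0, 31.0])    # MPa, illustrative
    params, r2, se, w = fit_with_diagnostics(A_mean, strength, power_law)
    print(f"fit parameters={params}, R2={r2:.3f}, Se={se:.3f}, W={w:.1f}%")
```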
132 Analyzing Data Protection in the Era of Big Data under the Framework of Virtual Property Layer Theory
Authors: Xiaochen Mu
Abstract:
Data rights confirmation, as a key legal issue in the development of the digital economy, is undergoing a transition from a traditional rights paradigm to a more complex private-economic paradigm. In this process, data rights confirmation has evolved from a simple claim of rights to a complex structure encompassing multiple dimensions of personality rights and property rights. Current data rights confirmation practices are primarily reflected in two models: holistic rights confirmation and process rights confirmation. The holistic rights confirmation model continues the traditional "one object, one right" theory, while the process rights confirmation model, through contractual relationships in the data processing process, recognizes rights that are more adaptable to the needs of data circulation and value release. In the design of the data property rights system, there is a hierarchical characteristic aimed at decoupling from raw data to data applications through horizontal stratification and vertical staging. This design not only respects the ownership rights of data originators but also, based on the usufructuary rights of enterprises, constructs a corresponding rights system for different stages of data processing activities. The subjects of data property rights include both data originators, such as users, and data producers, such as enterprises, who enjoy different rights at different stages of data processing. The intellectual property rights system, with the mission of incentivizing innovation and promoting the advancement of science, culture, and the arts, provides a complete set of mechanisms for protecting innovative results. However, unlike traditional private property rights, the granting of intellectual property rights is not an end in itself; the purpose of the intellectual property system is to balance the exclusive rights of the rights holders with the prosperity and long-term development of society's public learning and the entire field of science, culture, and the arts. Therefore, the intellectual property granting mechanism provides both protection and limitations for the rights holder. This perfectly aligns with the dual attributes of data. In terms of achieving the protection of data property rights, the granting of intellectual property rights is an important institutional choice that can enhance the effectiveness of the data property exchange mechanism. Although this is not the only path, the granting of data property rights within the framework of the intellectual property rights system helps to establish fundamental legal relationships and rights confirmation mechanisms and is more compatible with the classification and grading system of data. The modernity of the intellectual property rights system allows it to adapt to the needs of big data technology development through special clauses or industry guidelines, thus promoting the comprehensive advancement of data intellectual property rights legislation. This paper analyzes data protection under the virtual property layer theory and two-fold virtual property rights system. Based on the “bundle of right” theory, this paper establishes specific three-level data rights. This paper analyzes the cases: Google v. Vidal-Hall, Halliday v Creation Consumer Finance, Douglas v Hello Limited, Campbell v MGN and Imerman v Tchenquiz. 
This paper concludes that recognizing property rights over personal data and protecting data under the framework of intellectual property will be beneficial for establishing the tort of misuse of personal information.
Keywords: data protection, property rights, intellectual property, big data
131 Collaborative Procurement in the Pursuit of Net-Zero: A Converging Journey
Authors: Bagireanu Astrid, Bros-Williamson Julio, Duncheva Mila, Currie John
Abstract:
The Architecture, Engineering, and Construction (AEC) sector plays a critical role in the global transition toward sustainable and net-zero built environments. However, the industry faces unique challenges in planning for net-zero while struggling with low productivity, cost overruns and overall resistance to change. Traditional practices fall short due to their inability to meet the requirements for systemic change, especially as governments increasingly demand transformative approaches. Working in silos and rigid hierarchies and a short-term, client-centric approach prioritising immediate gains over long-term benefit stands in stark contrast to the fundamental requirements for the realisation of net-zero objectives. These practices have limited capacity to effectively integrate AEC stakeholders and promote the essential knowledge sharing required to address the multifaceted challenges of achieving net-zero. In the context of built environment, procurement may be described as the method by which a project proceeds from inception to completion. Collaborative procurement methods under the Integrated Practices (IP) umbrella have the potential to align more closely with net-zero objectives. This paper explores the synergies between collaborative procurement principles and the pursuit of net zero in the AEC sector, drawing upon the shared values of cross-disciplinary collaboration, Early Supply Chain involvement (ESI), use of standards and frameworks, digital information management, strategic performance measurement, integrated decision-making principles and contractual alliancing. To investigate the role of collaborative procurement in advancing net-zero objectives, a structured research methodology was employed. First, the study focuses on a systematic review on the application of collaborative procurement principles in the AEC sphere. Next, a comprehensive analysis is conducted to identify common clusters of these principles across multiple procurement methods. An evaluative comparison between traditional procurement methods and collaborative procurement for achieving net-zero objectives is presented. Then, the study identifies the intersection between collaborative procurement principles and the net-zero requirements. Lastly, an exploration of key insights for AEC stakeholders focusing on the implications and practical applications of these findings is made. Directions for future development of this research are recommended. Adopting collaborative procurement principles can serve as a strategic framework for guiding the AEC sector towards realising net-zero. Synergising these approaches overcomes fragmentation, fosters knowledge sharing, and establishes a net-zero-centered ecosystem. In the context of the ongoing efforts to amplify project efficiency within the built environment, a critical realisation of their central role becomes imperative for AEC stakeholders. When effectively leveraged, collaborative procurement emerges as a powerful tool to surmount existing challenges in attaining net-zero objectives.Keywords: collaborative procurement, net-zero, knowledge sharing, architecture, built environment
130 The Quantum Theory of Music and Languages
Authors: Mballa Abanda Serge, Henda Gnakate Biba, Romaric Guemno Kuate, Akono Rufine Nicole, Petfiang Sidonie, Bella Sidonie
Abstract:
The main hypotheses proposed around the definition of the syllable and of music, and of the common origin of music and language, should lead the reader to reflect on the cross-cutting questions raised by the debate on the notion of universals in linguistics and musicology. These are objects of controversy, and therein lies their interest: the debate raises questions that are at the heart of theories on language. It is an inventive, original and innovative research thesis, and a contribution to the theoretical, musicological, ethnomusicological and linguistic conceptualization of languages, giving rise to the practice of interlocution between the social and cognitive sciences, the activities of artistic creation and the question of modeling in the human sciences: mathematics, computer science, translation automation and artificial intelligence. When you apply this theory to any text of a folksong of a world-tone language, you not only piece together the exact melody, rhythm, and harmonies of that song as if you knew it in advance but also the exact speaking of this language. The author believes that the issue of the disappearance of tonal languages and their preservation has been structurally resolved, as well as one of the greatest cultural equations related to the composition and creation of tonal, polytonal and random music. As experimentation confirming the theorization, a semi-digital, semi-analog application was designed which translates the tonal languages of Africa (about 2,100 languages) into blues, jazz, world music, polyphonic music, tonal and atonal music, and deterministic and random music. To test this application, I use music reading and writing software that allows me to collect the data extracted from my mother tongue, which is already modeled in the musical staves saved in the ethnographic (semiotic) dictionary for automatic translation (volume 2 of the book). Translation is done from writing to writing, from writing to speech and from writing to music. Mode of operation: you type a text on your computer, a structured song (chorus-verse), and you ask the machine for a melody of blues, jazz, world music or variety, etc. The software runs, giving you the option to choose harmonies, and then you select your melody.
Keywords: music, entanglement, language, science
129 The Use of Image Analysis Techniques to Describe a Cluster Cracks in the Cement Paste with the Addition of Metakaolinite
Authors: Maciej Szeląg, Stanisław Fic
Abstract:
The impact of elevated temperatures on construction materials manifests itself in changes in their physical and mechanical characteristics. Stresses and thermal deformations that occur inside the volume of the material cause its progressive degradation as temperature increases. Finally, the reactions and transformations of the multiphase structure of the cementitious composite cause its complete destruction. A particularly dangerous phenomenon is the impact of thermal shock – a sudden high temperature load. The thermal shock leads to a high value of the temperature gradient between the outer surface and the interior of the element in a relatively short time. The result of the above-mentioned process is the formation of cracks and scratches on the material’s surface and inside the material. The article describes the use of computer image analysis techniques to identify and assess the structure of the cluster cracks on the surfaces of modified cement pastes, caused by thermal shock. Four series of specimens were tested. Two Portland cements were used (CEM I 42.5R and CEM I 52.5R). In addition, two of the series contained metakaolinite as a replacement for 10% of the cement content. Samples in each series were made with three w/b (water/binder) ratios: 0.4, 0.5, and 0.6. Surface cracks in the samples were created by a sudden temperature load at 200°C for 4 hours. Images of the cracked surfaces were obtained via scanning at 1200 DPI; digital processing and measurements were performed using ImageJ v. 1.46r software. In order to examine the cracked surface of the cement paste as a system of closed clusters, the dispersal systems theory was used to describe the structure of the cement paste. Water is used as the dispersing phase, and the binder is used as the dispersed phase – which is the initial stage of cement paste structure creation. A cluster itself is considered to be the area on the specimen surface that is limited by cracks (created by sudden temperature loading) or by the edge of the sample. To describe the structure of the cracks, two stereological parameters were proposed: A ̅ – the cluster average area, L ̅ – the cluster average perimeter. The goal of this study was to compare the investigated stereological parameters with the mechanical properties of the tested specimens. Compressive and tensile strength tests were carried out according to EN standards. The method used in the study allowed the quantitative determination of defects occurring on the examined modified cement paste surfaces. Based on the results, it was found that the nature of the cracks depends mainly on the physical parameters of the cement and the intermolecular interactions in the dispersal environment. Additionally, it was noted that the A ̅/L ̅ relation of the created clusters can be described by one function for all tested samples. This fact testifies to the constant geometry of the thermal cracks regardless of the presence of metakaolinite, the type of cement and the w/b ratio.
Keywords: cement paste, cluster cracks, elevated temperature, image analysis, metakaolinite, stereological parameters
128 Survey of Prevalence of Noise Induced Hearing Loss in Hawkers and Shopkeepers in Noisy Areas of Mumbai City
Authors: Hitesh Kshayap, Shantanu Arya, Ajay Basod, Sachin Sakhuja
Abstract:
This study was undertaken to measure the overall noise levels in different locations/zones and to estimate the prevalence of Noise induced hearing loss in Hawkers & Shopkeepers in Mumbai, India. The Hearing Test developed by American Academy Of Otolaryngology, translated from English to Hindi, and validated is used as a screening tool for hearing sensitivity was employed. The tool is having 14 items. Each item is scored on a scale 0, 1, 2 and 3. The score 6 and above indicated some difficulty or definite difficulty in hearing in daily activities and low score indicated lesser difficulty or normal hearing. The subjects who scored 6 or above or having tinnitus were made to undergo hearing evaluation by Pure tone audiometer. Further, the environmental noise levels were measured from Morning to Evening at road side at different Location/Hawking zones in Mumbai city using SLM9 Agronic 8928B & K type Digital Sound Level Meter) in dB (A). The maximum noise level of 100.0 dB (A) was recorded during evening hours from Chattrapati Shivaji Terminal to Colaba with overall noise level of 79.0 dB (A). However, the minimum noise level in this area was 72.6 dB (A) at any given point of time. Further, 54.6 dB (A) was recorded as minimum noise level during 8-9 am at Sion Circle. Further, commencement of flyovers with 2-tier traffic, sky walks, increasing number of vehicular traffic at road, high rise buildings and other commercial & urbanization activities in the Mumbai city most probably have resulted in increasing the overall environmental noise levels. Trees which acted as noise absorbers have been cut owing to rapid construction. The study involved 100 participants in the age range of 18 to 40 years of age, with the mean age of 29 years (S.D. =6.49). 46 participants having tinnitus or have obtained the score of 6 were made to undergo Pure Tone Audiometry and it was found that the prevalence rate of hearing loss in hawkers & shopkeepers is 19% (10% Hawkers and 9 % Shopkeepers). The results found indicates that 29 (42.6%) out of 64 Hawkers and 17 (47.2%) out of 36 Shopkeepers who underwent PTA had no significant difference in percentage of Noise Induced Hearing loss. The study results also reveal that participants who exhibited tinnitus 19 (41.30%) out of 46 were having mild to moderate sensorineural hearing loss between 3000Hz to 6000Hz. The Pure tone Audiogram pattern revealed Hearing loss at 4000 Hz and 6000 Hz while hearing at adjacent frequencies were nearly normal. 7 hawkers and 8 shopkeepers had mild notch while 3 hawkers and 1 shopkeeper had a moderate degree of notch. It is thus inferred that tinnitus is a strong indicator for presence of hearing loss and 4/6 KHz notch is a strong marker for road/traffic/ environmental noise as an occupational hazard for hawkers and shopkeepers. Mass awareness about these occupational hazards, regular hearing check up, early intervention along with sustainable development juxtaposed with social and urban forestry can help in this regard.Keywords: NIHL, noise, sound level meter, tinnitus
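The abstract above treats a threshold notch at 4 or 6 kHz as a marker of noise-induced hearing loss. The short Python sketch below shows one possible notch-flagging rule on pure-tone audiometry thresholds; the 10 dB criterion and the synthetic audiogram are illustrative assumptions, not the criteria used in the study.

```python
def has_noise_notch(thresholds_db, notch_freqs=(4000, 6000), depth_db=10):
    """Flag a possible noise notch: the threshold at 4 or 6 kHz is worse
    (higher, in dB HL) than both neighbouring test frequencies by >= depth_db.
    `thresholds_db` maps frequency in Hz to threshold in dB HL."""
    freqs = sorted(thresholds_db)
    for f in notch_freqs:
        if f not in thresholds_db:
            continue
        i = freqs.index(f)
        if 0 < i < len(freqs) - 1:
            lower, upper = freqs[i - 1], freqs[i + 1]
            if (thresholds_db[f] - thresholds_db[lower] >= depth_db and
                    thresholds_db[f] - thresholds_db[upper] >= depth_db):
                return True
    return False

# Illustrative audiogram (dB HL): near-normal thresholds with a dip at 4 kHz.
audiogram = {500: 10, 1000: 10, 2000: 15, 3000: 20, 4000: 45, 6000: 25, 8000: 20}
print(has_noise_notch(audiogram))  # True for this synthetic example
```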
127 Ammonia Bunkering Spill Scenarios: Modelling Plume’s Behaviour and Potential to Trigger Harmful Algal Blooms in the Singapore Straits
Authors: Bryan Low
Abstract:
In the coming decades, the global maritime industry will face a most formidable environmental challenge -achieving net zero carbon emissions by 2050. To meet this target, the Maritime Port Authority of Singapore (MPA) has worked to establish green shipping and digital corridors with ports of several other countries around the world where ships will use low-carbon alternative fuels such as ammonia for power generation. While this paradigm shift to the bunkering of greener fuels is encouraging, fuels like ammonia will also introduce a new and unique type of environmental risk in the unlikely scenario of a spill. While numerous modelling studies have been conducted for oil spills and their associated environmental impact on coastal and marine ecosystems, ammonia spills are comparatively less well understood. For example, there is a knowledge gap regarding how the complex hydrodynamic conditions of the Singapore Straits may influence the dispersion of a hypothetical ammonia plume, which has different physical and chemical properties compared to an oil slick. Chemically, ammonia can be absorbed by phytoplankton, thus altering the balance of the marine nitrogen cycle. Biologically, ammonia generally serves the role of a nutrient in coastal ecosystems at lower concentrations. However, at higher concentrations, it has been found to be toxic to many local species. It may also have the potential to trigger eutrophication and harmful algal blooms (HABs) in coastal waters, depending on local hydrodynamic conditions. Thus, the key objective of this research paper is to support the development of a model-based forecasting system that can predict ammonia plume behaviour in coastal waters, given prevailing hydrodynamic conditions and their environmental impact. This will be essential as ammonia bunkering becomes more commonplace in Singapore’s ports and around the world. Specifically, this system must be able to assess the HAB-triggering potential of an ammonia plume, as well as its lethal and sub-lethal toxic effects on local species. This will allow the relevant authorities to better plan risk mitigation measures or choose a time window with the ideal hydrodynamic conditions to conduct ammonia bunkering operations with minimal risk. In this paper, we present the first part of such a forecasting system: a jointly coupled hydrodynamic-water quality model that can capture how advection-diffusion processes driven by ocean currents influence plume behaviour and how the plume interacts with the marine nitrogen cycle. The model is then applied to various ammonia spill scenarios where the results are discussed in the context of current ammonia toxicity guidelines, impact on local ecosystems, and mitigation measures for future bunkering operations conducted in the Singapore Straits.Keywords: ammonia bunkering, forecasting, harmful algal blooms, hydrodynamics, marine nitrogen cycle, oceanography, water quality modeling
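The coupled hydrodynamic-water quality model described above rests on advection-diffusion transport of the dissolved ammonia plume with an uptake term from the marine nitrogen cycle. As a purely illustrative sketch (not the authors' model), the following Python code integrates a 1D advection-diffusion-decay equation with an explicit upwind finite-difference scheme; the current speed, diffusivity, uptake rate and initial plume are assumed values.

```python
import numpy as np

def advect_diffuse_decay(c0, u=0.3, D=5.0, k=1e-5, dx=50.0, dt=10.0, steps=360):
    """Explicit upwind integration of dc/dt = -u dc/dx + D d2c/dx2 - k c.
    u : along-strait current speed [m/s] (assumed)
    D : horizontal eddy diffusivity [m^2/s] (assumed)
    k : first-order ammonia uptake rate [1/s] (assumed)
    """
    c = c0.copy()
    for _ in range(steps):
        adv = -u * (c - np.roll(c, 1)) / dx                       # upwind advection
        dif = D * (np.roll(c, -1) - 2 * c + np.roll(c, 1)) / dx**2
        c = c + dt * (adv + dif - k * c)
        c[0] = c[-1] = 0.0                                        # open boundaries
    return c

if __name__ == "__main__":
    x = np.arange(0, 10_000, 50.0)                       # 10 km transect, 50 m cells
    spill = 100.0 * np.exp(-((x - 2_000) / 200.0) ** 2)  # initial plume [mg/L], assumed
    plume = advect_diffuse_decay(spill)                  # 360 steps of 10 s = 1 hour
    print(f"peak concentration after 1 h: {plume.max():.2f} mg/L "
          f"at x = {x[plume.argmax()]:.0f} m")
```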
126 Risks beyond Cyber in IoT Infrastructure and Services
Authors: Mattias Bergstrom
Abstract:
Significance of the Study: This research will provide new insights into the risks with digital embedded infrastructure. Through this research, we will analyze each risk and its potential negation strategies, especially for AI and autonomous automation. Moreover, the analysis that is presented in this paper will convey valuable information for future research that can create more stable, secure, and efficient autonomous systems. To learn and understand the risks, a large IoT system was envisioned, and risks with hardware, tampering, and cyberattacks were collected, researched, and evaluated to create a comprehensive understanding of the potential risks. Potential solutions have then been evaluated on an open source IoT hardware setup. This list shows the identified passive and active risks evaluated in the research. Passive Risks: (1) Hardware failures- Critical Systems relying on high rate data and data quality are growing; SCADA systems for infrastructure are good examples of such systems. (2) Hardware delivers erroneous data- Sensors break, and when they do so, they don’t always go silent; they can keep going, just that the data they deliver is garbage, and if that data is not filtered out, it becomes disruptive noise in the system. (3) Bad Hardware injection- Erroneous generated sensor data can be pumped into a system by malicious actors with the intent to create disruptive noise in critical systems. (4) Data gravity- The weight of the data collected will affect Data-Mobility. (5) Cost inhibitors- Running services that need huge centralized computing is cost inhibiting. Large complex AI can be extremely expensive to run. Active Risks: Denial of Service- It is one of the most simple attacks, where an attacker just overloads the system with bogus requests so that valid requests disappear in the noise. Malware- Malware can be anything from simple viruses to complex botnets created with specific goals, where the creator is stealing computer power and bandwidth from you to attack someone else. Ransomware- It is a kind of malware, but it is so different in its implementation that it is worth its own mention. The goal with these pieces of software is to encrypt your system so that it can only be unlocked with a key that is held for ransom. DNS spoofing- By spoofing DNS calls, valid requests and data dumps can be sent to bad destinations, where the data can be extracted for extortion or to corrupt and re-inject into a running system creating a data echo noise loop. After testing multiple potential solutions. We found that the most prominent solution to these risks was to use a Peer 2 Peer consensus algorithm over a blockchain to validate the data and behavior of the devices (sensors, storage, and computing) in the system. By the devices autonomously policing themselves for deviant behavior, all risks listed above can be negated. In conclusion, an Internet middleware that provides these features would be an easy and secure solution to any future autonomous IoT deployments. As it provides separation from the open Internet, at the same time, it is accessible over the blockchain keys.Keywords: IoT, security, infrastructure, SCADA, blockchain, AI
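The mitigation proposed in the abstract above is peer-to-peer consensus among devices to filter deviant data and behavior. The toy Python sketch below shows one possible majority-style validation round; it is not the blockchain middleware evaluated in the study, and the deviation threshold and sensor values are invented for illustration.

```python
from statistics import median

def consensus_round(readings, tolerance=0.15):
    """Toy peer-validation round: each device's reading is compared with the
    median of all peers; readings deviating by more than `tolerance` (relative)
    are flagged so the device can be quarantined from the shared ledger."""
    ref = median(readings.values())
    flagged = {dev for dev, value in readings.items()
               if abs(value - ref) > tolerance * abs(ref)}
    accepted = {dev: value for dev, value in readings.items() if dev not in flagged}
    return accepted, flagged

# Illustrative sensor snapshot: one tampered device injecting bad data.
snapshot = {"sensor-A": 21.4, "sensor-B": 21.7, "sensor-C": 21.5, "sensor-D": 48.9}
good, bad = consensus_round(snapshot)
print("accepted:", good)
print("quarantined:", bad)   # {'sensor-D'}
```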
125 The Impact of the Covid-19 Crisis on the Information Behavior in the B2B Buying Process
Authors: Stehr Melanie
Abstract:
The availability of apposite information is essential for the decision-making process of organizational buyers. Due to the constraints of the Covid-19 crisis, information channels that emphasize face-to-face contact (e.g. sales visits, trade shows) have been unavailable, and usage of digitally-driven information channels (e.g. videoconferencing, platforms) has skyrocketed. This paper explores the question in which areas the pandemic induced shift in the use of information channels could be sustainable and in which areas it is a temporary phenomenon. While information and buying behavior in B2C purchases has been regularly studied in the last decade, the last fundamental model of organizational buying behavior in B2B was introduced by Johnston and Lewin (1996) in times before the advent of the internet. Subsequently, research efforts in B2B marketing shifted from organizational buyers and their decision and information behavior to the business relationships between sellers and buyers. This study builds on the extensive literature on situational factors influencing organizational buying and information behavior and uses the economics of information theory as a theoretical framework. The research focuses on the German woodworking industry, which before the Covid-19 crisis was characterized by a rather low level of digitization of information channels. By focusing on an industry with traditional communication structures, a shift in information behavior induced by an exogenous shock is considered a ripe research setting. The study is exploratory in nature. The primary data source is 40 in-depth interviews based on the repertory-grid method. Thus, 120 typical buying situations in the woodworking industry and the information and channels relevant to them are identified. The results are combined into clusters, each of which shows similar information behavior in the procurement process. In the next step, the clusters are analyzed in terms of the post and pre-Covid-19 crisis’ behavior identifying stable and dynamic information behavior aspects. Initial results show that, for example, clusters representing search goods with low risk and complexity suggest a sustainable rise in the use of digitally-driven information channels. However, in clusters containing trust goods with high significance and novelty, an increased return to face-to-face information channels can be expected after the Covid-19 crisis. The results are interesting from both a scientific and a practical point of view. This study is one of the first to apply the economics of information theory to organizational buyers and their decision and information behavior in the digital information age. Especially the focus on the dynamic aspects of information behavior after an exogenous shock might contribute new impulses to theoretical debates related to the economics of information theory. For practitioners - especially suppliers’ marketing managers and intermediaries such as publishers or trade show organizers from the woodworking industry - the study shows wide-ranging starting points for a future-oriented segmentation of their marketing program by highlighting the dynamic and stable preferences of elaborated clusters in the choice of their information channels.Keywords: B2B buying process, crisis, economics of information theory, information channel
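To make the clustering step described above concrete, here is a small illustrative Python sketch that groups buying situations by their reported channel-usage profiles with k-means; the channel labels, scores and number of clusters are assumptions for demonstration, not the study's repertory-grid data.

```python
import numpy as np
from sklearn.cluster import KMeans

# Illustrative channel-usage profiles for buying situations (rows), rated 0-5 for
# reliance on: [sales visit, trade show, videoconference, online platform].
situations = np.array([
    [5, 4, 1, 1],   # complex trust good, face-to-face heavy
    [4, 5, 2, 1],
    [1, 0, 4, 5],   # simple search good, digitally driven
    [0, 1, 5, 5],
    [3, 2, 3, 3],   # mixed profile
])

kmeans = KMeans(n_clusters=2, n_init=10, random_state=0).fit(situations)
for cluster_id, profile in zip(kmeans.labels_, situations):
    print(cluster_id, profile)
# Each cluster can then be profiled pre- vs post-crisis to check whether its
# preferred channels shifted durably toward digital or reverted to face-to-face.
```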
124 Time Travel Testing: A Mechanism for Improving Renewal Experience
Authors: Aritra Majumdar
Abstract:
While organizations strive to expand their new customer base, retaining existing relationships is a key aspect of improving overall profitability and also showcasing how successful an organization is in holding on to its customers. It is an experimentally proven fact that the lion’s share of profit always comes from existing customers. Hence seamless management of renewal journeys across different channels goes a long way in improving trust in the brand. From a quality assurance standpoint, time travel testing provides an approach to both business and technology teams to enhance the customer experience when they look to extend their partnership with the organization for a defined phase of time. This whitepaper will focus on key pillars of time travel testing: time travel planning, time travel data preparation, and enterprise automation. Along with that, it will call out some of the best practices and common accelerator implementation ideas which are generic across verticals like healthcare, insurance, etc. In this abstract document, a high-level snapshot of these pillars will be provided. Time Travel Planning: The first step of setting up a time travel testing roadmap is appropriate planning. Planning will include identifying the impacted systems that need to be time traveled backward or forward depending on the business requirement, aligning time travel with other releases, frequency of time travel testing, preparedness for handling renewal issues in production after time travel testing is done and most importantly planning for test automation testing during time travel testing. Time Travel Data Preparation: One of the most complex areas in time travel testing is test data coverage. Aligning test data to cover required customer segments and narrowing it down to multiple offer sequencing based on defined parameters are keys for successful time travel testing. Another aspect is the availability of sufficient data for similar combinations to support activities like defect retesting, regression testing, post-production testing (if required), etc. This section will talk about the necessary steps for suitable data coverage and sufficient data availability from a time travel testing perspective. Enterprise Automation: Time travel testing is never restricted to a single application. The workflow needs to be validated in the downstream applications to ensure consistency across the board. Along with that, the correctness of offers across different digital channels needs to be checked in order to ensure a smooth customer experience. This section will talk about the focus areas of enterprise automation and how automation testing can be leveraged to improve the overall quality without compromising on the project schedule. Along with the above-mentioned items, the white paper will elaborate on the best practices that need to be followed during time travel testing and some ideas pertaining to accelerator implementation. To sum it up, this paper will be written based on the real-time experience author had on time travel testing. While actual customer names and program-related details will not be disclosed, the paper will highlight the key learnings which will help other teams to implement time travel testing successfully.Keywords: time travel planning, time travel data preparation, enterprise automation, best practices, accelerator implementation ideas
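As a hedged illustration of the time-travel idea described above (advancing system time so a renewal journey can be exercised before its real-world date), the Python sketch below shows a minimal clock abstraction that tests can shift forward; it is a generic pattern, not the accelerator or tooling referred to in the paper.

```python
from datetime import datetime, timedelta

class TravelingClock:
    """Clock used by the application under test instead of datetime.now(),
    so tests can 'time travel' forward into a renewal window."""
    def __init__(self):
        self._offset = timedelta(0)

    def now(self) -> datetime:
        return datetime.now() + self._offset

    def travel(self, days: int = 0, hours: int = 0) -> None:
        self._offset += timedelta(days=days, hours=hours)

def renewal_offer_due(clock: TravelingClock, policy_end: datetime, window_days: int = 30) -> bool:
    """A renewal offer becomes due inside the configured window before policy end."""
    return policy_end - timedelta(days=window_days) <= clock.now() <= policy_end

clock = TravelingClock()
policy_end = datetime.now() + timedelta(days=90)
assert not renewal_offer_due(clock, policy_end)   # too early today
clock.travel(days=75)                             # time travel into the renewal window
assert renewal_offer_due(clock, policy_end)
print("renewal journey reachable after time travel")
```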
123 Comparative Study of Active Release Technique and Myofascial Release Technique in Patients with Upper Trapezius Spasm
Authors: Harihara Prakash Ramanathan, Daksha Mishra, Ankita Dhaduk
Abstract:
Relevance: This qualitative study will educate the clinician in putting into practice the advanced method of movement science in restoring the function. Purpose: The purpose of this study is to compare the effectiveness of Active Release Technique and myofascial release technique on range of motion, neck function and pain in patients with upper trapezius spasm. Methods/Analysis: The study was approved by the institutional Human Research and Ethics committee. This study included sixty patients of age group between 20 to 55 years with upper trapezius spasm. Patients were randomly divided into two groups receiving Active Release Technique (Group A) and Myofascial Release Technique (Group B). The patients were treated for 1 week and three outcome measures ROM, pain and functional level were measured using Goniometer, Visual analog scale(VAS), Neck disability Index Questionnaire(NDI) respectively. Paired Sample 't' test was used to compare the differences of pre and post intervention values of Cervical Range of motion, Neck disability Index, Visual analog scale of Group A and Group B. Independent't' test was used to compare the differences between two groups in terms of improvement in cervical range of motion, decrease in visual analogue scale(VAS), decrease in Neck disability index score. Results: Both the groups showed statistically significant improvements in cervical ROM, reduction in pain and in NDI scores. However, mean change in Cervical flexion, cervical extension, right side flexion, left side flexion, right side rotation, left side rotation, pain, neck disability level showed statistically significant improvement (P < 0. 05)) in the patients who received Active Release Technique as compared to Myofascial release technique. Discussion and conclusions: In present study, the average improvement immediately post intervention is significantly greater as compared to before treatment but there is even more improvement after seven sessions as compared to single session. Hence, this proves that several sessions of Manual techniques are necessary to produce clinically relevant results. Active release technique help to reduce the pain threshold by removing adhesion and promote normal tissue extensibility. The act of tensioning and compressing the affected tissue both with digital contact and through the active movement performed by the patient can be a plausible mechanism for tissue healing in this study. This study concluded that both Active Release Technique (ART) and Myofascial release technique (MFR) are equally effective in managing upper trapezius muscle spasm, but more improvement can be achieved by Active Release Technique (ART). Impact and Implications: Active Release Technique can be adopted as mainstay of treatment approach in treating trapezius spasm for faster relief and improving the functional status.Keywords: trapezius spasm, myofascial release, active release technique, pain
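The statistical comparisons reported above (paired-sample t-tests within groups, independent t-test between groups) can be reproduced with standard tools; the following Python sketch with SciPy uses invented VAS pain scores purely to illustrate the calculation.

```python
import numpy as np
from scipy import stats

# Illustrative VAS pain scores (0-10) before and after one week of treatment.
art_pre, art_post = np.array([7, 8, 6, 7, 9, 8]), np.array([2, 3, 2, 3, 4, 3])  # Group A (ART)
mfr_pre, mfr_post = np.array([7, 7, 8, 6, 8, 9]), np.array([4, 4, 5, 3, 5, 5])  # Group B (MFR)

# Paired-sample t-tests: pre vs post within each group.
t_art, p_art = stats.ttest_rel(art_pre, art_post)
t_mfr, p_mfr = stats.ttest_rel(mfr_pre, mfr_post)

# Independent-sample t-test on the improvement (pre - post) between groups.
t_between, p_between = stats.ttest_ind(art_pre - art_post, mfr_pre - mfr_post)

print(f"ART within-group:  t={t_art:.2f}, p={p_art:.4f}")
print(f"MFR within-group:  t={t_mfr:.2f}, p={p_mfr:.4f}")
print(f"ART vs MFR change: t={t_between:.2f}, p={p_between:.4f}")
```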
122 Network Impact of a Social Innovation Initiative in Rural Areas of Southern Italy
Authors: A. M. Andriano, M. Lombardi, A. Lopolito, M. Prosperi, A. Stasi, E. Iannuzzi
Abstract:
According to the scientific debate on the definition of Social Innovation (SI), the present paper identifies SI as new ideas (products, services, and models) that simultaneously meet social needs and create new social relationships or collaborations. This concept offers important tools to unravel the difficult condition of the agricultural sector in marginalized areas, characterized by the abandonment of activities, a low level of farmer education, and low generational renewal, hampering new territorial strategies aimed at an integrated and sustainable development. Models of SI in agriculture, starting from a bottom-up approach or from the community, are considered to represent the driving force of an ecological and digital revolution. A system based on SI may be able to grasp and satisfy individual and social needs and to promote new forms of entrepreneurship. In this context, Vazapp ('Go Hoeing') is an emerging SI model in southern Italy that promotes solutions for satisfying the needs of farmers and facilitates their relationships (creation of a network). The Vazapp initiative considered in this study is the Contadinners ('Farmers’ dinners'), dinners held at a farmer’s house where stakeholders living in the surrounding area get to know each other and are able to build a network for possible future professional collaborations. The aim of the paper is to identify the evolution of farmers’ relationships, both quantitatively and qualitatively, as a result of the Contadinners initiative organized by Vazapp. To this end, the study adopts the Social Network Analysis (SNA) methodology, using UCINET (Version 6.667) software to analyze the relational structure. Data collection was realized through a questionnaire distributed to 387 participants in the twenty 'Contadinners' held from February 2016 to June 2018. The response rate to the survey was about 50% of farmers. The data elaboration focused on different aspects, such as: a) the measurement of relational reciprocity among the farmers using the symmetrize method on the answers; b) the measurement of answer reliability using the dichotomize method; c) the description of the evolution of social capital using the cohesion method; d) the clustering of the Contadinners' participants into followers and non-followers of Vazapp to evaluate its impact on the local social capital. The results concern the effectiveness of this initiative in generating trustworthy relationships within a rural area of southern Italy typically affected by individualism and mistrust. The number of relationships represents the quantitative indicator defining the dimension of network development, while the typologies of relationships (from simple friendship to formal collaborations for brand-new cooperation initiatives) represent the qualitative indicator that offers a diversified perspective on the network impact. From the analysis carried out, Vazapp’s initiative surely represents a virtuous SI model to catalyze relationships within rural areas and to develop entrepreneurship based on the real needs of the community.
121 Edge Enhancement Visual Methodology for Fat Amount and Distribution Assessment in Dry-Cured Ham Slices
Authors: Silvia Grassi, Stefano Schiavon, Ernestina Casiraghi, Cristina Alamprese
Abstract:
Dry-cured ham is an uncooked meat product particularly appreciated for its peculiar sensory traits among which lipid component plays a key role in defining quality and, consequently, consumers’ acceptability. Usually, fat content and distribution are chemically determined by expensive, time-consuming, and destructive analyses. Moreover, different sensory techniques are applied to assess product conformity to desired standards. In this context, visual systems are getting a foothold in the meat market envisioning more reliable and time-saving assessment of food quality traits. The present work aims at developing a simple but systematic and objective visual methodology to assess the fat amount of dry-cured ham slices, in terms of total, intermuscular and intramuscular fractions. To the aim, 160 slices from 80 PDO dry-cured hams were evaluated by digital image analysis and Soxhlet extraction. RGB images were captured by a flatbed scanner, converted in grey-scale images, and segmented based on intensity histograms as well as on a multi-stage algorithm aimed at edge enhancement. The latter was performed applying the Canny algorithm, which consists of image noise reduction, calculation of the intensity gradient for each image, spurious response removal, actual thresholding on corrected images, and confirmation of strong edge boundaries. The approach allowed for the automatic calculation of total, intermuscular and intramuscular fat fractions as percentages of the total slice area. Linear regression models were run to estimate the relationships between the image analysis results and the chemical data, thus allowing for the prediction of the total, intermuscular and intramuscular fat content by the dry-cured ham images. The goodness of fit of the obtained models was confirmed in terms of coefficient of determination (R²), hypothesis testing and pattern of residuals. Good regression models have been found being 0.73, 0.82, and 0.73 the R2 values for the total fat, the sum of intermuscular and intramuscular fat and the intermuscular fraction, respectively. In conclusion, the edge enhancement visual procedure brought to a good fat segmentation making the simple visual approach for the quantification of the different fat fractions in dry-cured ham slices sufficiently simple, accurate and precise. The presented image analysis approach steers towards the development of instruments that can overcome destructive, tedious and time-consuming chemical determinations. As future perspectives, the results of the proposed image analysis methodology will be compared with those of sensory tests in order to develop a fast grading method of dry-cured hams based on fat distribution. Therefore, the system will be able not only to predict the actual fat content but it will also reflect the visual appearance of samples as perceived by consumers.Keywords: dry-cured ham, edge detection algorithm, fat content, image analysis
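A minimal Python sketch of the kind of pipeline outlined in the abstract above (grey-scale conversion, Canny edge enhancement, fat-fraction measurement and a linear fit against chemical values) is given below using scikit-image and NumPy; the threshold choices and the paired chemical data are illustrative assumptions, not the study's calibration.

```python
import numpy as np
from skimage.color import rgb2gray
from skimage.feature import canny
from skimage.morphology import binary_closing, disk

def fat_fraction(rgb_image, fat_threshold=0.75):
    """Estimate the fat area fraction (%) of a dry-cured ham slice image.
    Bright pixels are treated as fat; Canny edges help close region borders."""
    gray = rgb2gray(rgb_image)                      # 0-1 grey-scale image
    edges = canny(gray, sigma=2.0)                  # edge enhancement step
    fat_mask = (gray > fat_threshold) | edges       # bright areas plus their borders
    fat_mask = binary_closing(fat_mask, disk(3))    # clean small gaps
    slice_mask = gray > 0.05                        # exclude the scanner background
    return 100.0 * fat_mask[slice_mask].mean()

# Illustrative calibration: image-based fractions vs Soxhlet fat content (%).
image_pct = np.array([8.2, 11.5, 14.3, 17.8, 21.0])
soxhlet_pct = np.array([7.9, 11.1, 13.2, 16.5, 19.4])
slope, intercept = np.polyfit(image_pct, soxhlet_pct, 1)
pred = slope * image_pct + intercept
r2 = 1 - np.sum((soxhlet_pct - pred) ** 2) / np.sum((soxhlet_pct - soxhlet_pct.mean()) ** 2)
print(f"linear model: fat% = {slope:.2f}*image% + {intercept:.2f}  (R2={r2:.2f})")

demo_slice = np.random.rand(96, 96, 3)              # stand-in for a scanned slice
print(f"demo fat fraction: {fat_fraction(demo_slice):.1f}%")
```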
120 Electrophoretic Light Scattering Based on Total Internal Reflection as a Promising Diagnostic Method
Authors: Ekaterina A. Savchenko, Elena N. Velichko, Evgenii T. Aksenov
Abstract:
The development of pathological processes, such as cardiovascular and oncological diseases, are accompanied by changes in molecular parameters in cells, tissues, and serum. The study of the behavior of protein molecules in solutions is of primarily importance for diagnosis of such diseases. Various physical and chemical methods are used to study molecular systems. With the advent of the laser and advances in electronics, optical methods, such as scanning electron microscopy, sedimentation analysis, nephelometry, static and dynamic light scattering, have become the most universal, informative and accurate tools for estimating the parameters of nanoscale objects. The electrophoretic light scattering is the most effective technique. It has a high potential in the study of biological solutions and their properties. This technique allows one to investigate the processes of aggregation and dissociation of different macromolecules and obtain information on their shapes, sizes and molecular weights. Electrophoretic light scattering is an analytical method for registration of the motion of microscopic particles under the influence of an electric field by means of quasi-elastic light scattering in a homogeneous solution with a subsequent registration of the spectral or correlation characteristics of the light scattered from a moving object. We modified the technique by using the regime of total internal reflection with the aim of increasing its sensitivity and reducing the volume of the sample to be investigated, which opens the prospects of automating simultaneous multiparameter measurements. In addition, the method of total internal reflection allows one to study biological fluids on the level of single molecules, which also makes it possible to increase the sensitivity and the informativeness of the results because the data obtained from an individual molecule is not averaged over an ensemble, which is important in the study of bimolecular fluids. To our best knowledge the study of electrophoretic light scattering in the regime of total internal reflection is proposed for the first time, latex microspheres 1 μm in size were used as test objects. In this study, the total internal reflection regime was realized on a quartz prism where the free electrophoresis regime was set. A semiconductor laser with a wavelength of 655 nm was used as a radiation source, and the light scattering signal was registered by a pin-diode. Then the signal from a photodetector was transmitted to a digital oscilloscope and to a computer. The autocorrelation functions and the fast Fourier transform in the regime of Brownian motion and under the action of the field were calculated to obtain the parameters of the object investigated. The main result of the study was the dependence of the autocorrelation function on the concentration of microspheres and the applied field magnitude. The effect of heating became more pronounced with increasing sample concentrations and electric field. The results obtained in our study demonstrated the applicability of the method for the examination of liquid solutions, including biological fluids.Keywords: light scattering, electrophoretic light scattering, electrophoresis, total internal reflection
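The signal processing described above (autocorrelation and FFT of the photodetector signal to extract the Doppler shift of migrating particles) can be illustrated with a short Python sketch; the beat frequency, noise level and sampling rate below are invented for demonstration and do not reproduce the experimental setup.

```python
import numpy as np

fs = 10_000.0                       # sampling rate [Hz], assumed
t = np.arange(0, 1.0, 1.0 / fs)
f_doppler = 120.0                   # electrophoretic Doppler shift [Hz], assumed
rng = np.random.default_rng(0)

# Synthetic photodetector signal: scattered-light beat note plus noise.
signal = np.cos(2 * np.pi * f_doppler * t) + 0.5 * rng.standard_normal(t.size)

# Autocorrelation function (normalized, positive lags only).
sig = signal - signal.mean()
acf = np.correlate(sig, sig, mode="full")[sig.size - 1:]
acf /= acf[0]

# Power spectrum via FFT: the peak locates the Doppler shift,
# which is proportional to the electrophoretic mobility.
spectrum = np.abs(np.fft.rfft(sig)) ** 2
freqs = np.fft.rfftfreq(sig.size, d=1.0 / fs)
print(f"estimated Doppler peak: {freqs[spectrum.argmax()]:.1f} Hz")
```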
119 Managing Human-Wildlife Conflicts Compensation Claims Data Collection and Payments Using a Scheme Administrator
Authors: Eric Mwenda, Shadrack Ngene
Abstract:
Human-wildlife conflicts (HWCs) are the main threat to conservation in Africa. This is because wildlife needs overlap with those of humans. In Kenya, about 70% of wildlife occurs outside protected areas. As a result, wildlife and human ranges overlap, causing HWCs. The HWCs in Kenya occur in the drylands adjacent to protected areas. The top five counties with the highest incidences of HWC are Taita Taveta, Narok, Lamu, Kajiado, and Laikipia. The common wildlife species responsible for HWCs are elephants, buffaloes, hyenas, hippos, leopards, baboons, monkeys, snakes, and crocodiles. To ensure individuals affected by the conflicts are compensated, Kenya has developed a model of HWC compensation claims data collection and payment. We collected data on HWC from all eight Kenya Wildlife Service (KWS) Conservation Areas from 2009 to 2019. Additional data was collected from stakeholders' consultative workshops held in the Conservation Areas and from a literature review regarding payment for injuries and ongoing insurance schemes being practiced in the areas. This was followed by a description of the claims administration process and calculation of the pricing of the compensation claims. We further developed a digital platform for data capture and processing of all reported conflict cases and payments. Our product recognized four categories of HWC (i.e., human death and injury, property damage, crop destruction, and livestock predation). Compensation for personal bodily injury and human death was based on the Continental Scale of Benefits. We proposed a maximum of Kenya Shillings (KES) 3,000,000 for death. Medical, pharmaceutical, and hospital expenses were capped at a maximum of KES 150,000, and funeral costs at KES 50,000. Pain and suffering were proposed to be paid for 12 months at the rate of KES 13,500 per month. Crop damage was to be based on farm input costs at a maximum of KES 150,000 per claim. Livestock predation leading to death was based on the Tropical Livestock Unit (TLU), where 1 TLU is equivalent to KES 30,000: Cattle (1 TLU = KES 30,000), Camel (1.4 TLU = KES 42,000), Goat (0.15 TLU = KES 4,500), Sheep (0.15 TLU = KES 4,500), and Donkey (0.5 TLU = KES 15,000). Property destruction (buildings, outside structures and harvested crops) was capped at KES 150,000 for any one claim. We conclude that it is possible to use an administrator to collect data on HWC compensation claims and make payments using technology. The success of the new approach will depend on a piloting program. We recommend that a pilot scheme be initiated for eight months in Taita Taveta, Kajiado, Baringo, Laikipia, Narok, and Meru Counties. This will test the claims administration process as well as harmonize data collection methods. The results of this pilot will be crucial in adjusting the scheme before country-wide roll-out.
Keywords: human-wildlife conflicts, compensation, human death and injury, crop destruction, predation, property destruction
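To illustrate the pricing rules summarized above, a small Python sketch follows that computes a livestock-predation payout from the TLU table and applies the stated caps for other claim categories; it is an editorial illustration of the figures quoted in the abstract, not the scheme administrator's actual software.

```python
TLU_PER_ANIMAL = {"cattle": 1.0, "camel": 1.4, "goat": 0.15, "sheep": 0.15, "donkey": 0.5}
KES_PER_TLU = 30_000
CAPS = {"crop_damage": 150_000, "property_destruction": 150_000,
        "medical_expenses": 150_000, "funeral_costs": 50_000, "human_death": 3_000_000}

def livestock_payout(losses):
    """Payout for predated livestock, e.g. {'cattle': 2, 'goat': 5}."""
    tlu = sum(TLU_PER_ANIMAL[species] * count for species, count in losses.items())
    return round(tlu * KES_PER_TLU)

def capped_payout(category, claimed_amount):
    """Other categories are paid at the claimed amount up to the scheme cap."""
    return min(claimed_amount, CAPS[category])

print(livestock_payout({"cattle": 2, "goat": 5}))   # 2*30000 + 5*4500 = 82500 KES
print(capped_payout("crop_damage", 210_000))        # capped at 150000 KES
```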
118 [Keynote Talk]: New Generations and Employment: An Exploratory Study about Tensions between the Psycho-Social Characteristics of the Generation Z and Expectations and Actions of Organizational Structures Related with Employment (CABA, 2016)
Authors: Esteban Maioli
Abstract:
Generational studies have an important research tradition in social and human sciences. On the one hand, the speed of social change in the context of globalization imposes the need to research the transformations are identified both the subjectivity of the agents involved and its inclusion in the institutional matrix, specifically employment. Generation Z, (generally considered as the population group whose birth occurs after 1995) have unique psycho-social characteristics. Gen Z is characterized by a different set of values, beliefs, attitudes and ambitions that impact in their concrete action in organizational structures. On the other hand, managers often have to deal with generational differences in the workplace. Organizations have members who belong to different generations; they had never before faced the challenge of having such a diverse group of members. The members of each historical generation are characterized by a different set of values, beliefs, attitudes and ambitions that are manifest in their concrete action in organizational structures. Gen Z it’s the only one who can fully be considered "global," while its members were born in the consolidated context of globalization. Some salient features of the Generation Z can be summarized as follows. They’re the first fully born into a digital world. Social networks and technology are integrated into their lives. They are concerned about the challenges of the modern world (poverty, inequality, climate change, among others). They are self-expressive, more liberal and open to change. They often bore easily, with short attention spans. They do not like routine tasks. They want to achieve a good life-work balance, and they are interested in a flexible work environment, as opposed to traditional work schedule. They are critical thinkers, who come with innovative and creative ideas to help. Research design considered methodological triangulation. Data was collected with two techniques: a self-administered survey with multiple choice questions and attitudinal scales applied over a non-probabilistic sample by reasoned decision. According to the multi-method strategy, also it was conducted in-depth interviews. Organizations constantly face new challenges. One of the biggest ones is to learn to manage a multi-generational scope of work. While Gen Z has not yet been fully incorporated (expected to do so in five years or so), many organizations have already begun to implement a series of changes in its recruitment and development. The main obstacle to retaining young talent is the gap between the expectations of iGen applicants and what companies offer. Members of the iGen expect not only a good salary and job stability but also a clear career plan. Generation Z needs to have immediate feedback on their tasks. However, many organizations have yet to improve both motivation and monitoring practices. It is essential for companies to take a review of organizational practices anchored in the culture of the organization.Keywords: employment, expectations, generation Z, organizational culture, organizations, psycho-social characteristics
Procedia PDF Downloads 201117 Highly Automated Trucks In Intermodal Logistics: Findings From a Field Test in Railport and Container Depot Operations in Germany
Authors: Dustin Schöder
Abstract:
The potential benefits of utilizing highly automated and autonomous trucks in logistics operations are of interest to the entire logistics industry. The benefits of these new technologies have been scientifically investigated and incorporated into roadmaps. So far, reliable data and experience from real-life use cases are still limited. A German research consortium of academics and industry developed a highly automated (SAE level 4) vehicle for yard operations at railports and container depots. After development and testing, a several-month field test was conducted at the DUSS Terminal in Ulm-Dornstadt (Germany) and the nearby DB Intermodal Services Container Depot in Ulm-Dornstadt. The truck was piloted in a shuttle service between both sites. In a holistic automation approach, the vehicle was integrated into a digital communication platform so that the truck could move autonomously, without a driver or manual interaction, while coordinating with a wide variety of stakeholders. The main goal is to investigate the effects of highly automated trucks on the key processes of container loading, unloading and container relocation in holistic railport yard operation. The field test data were used to investigate changes in the process efficiency of key railport and container yard operations. Moreover, effects on capacity utilization and the potential for smoothing peak workloads were analyzed. The results show that process efficiency in the piloted use case was significantly higher. The reason for that could be found in the digitalized data exchange and automated dispatch. However, the field test has shown that the effect varies greatly depending on the ratio of highly automated to manual trucks in the yard as well as on the congestion level in the loading area. Furthermore, the data confirmed that, under the right conditions, the capacity utilization of highly automated trucks could be increased. In regard to the potential for smoothing peak workloads, no significant findings could be made, given the limited requirements and regulations of railway operation in Germany. In addition, an empirical survey among railport managers, operational supervisors, innovation managers and strategists (n=15) within the logistics industry in Germany was conducted. The goal was to identify key characteristics of future railports and terminals as well as requirements that railports will have to meet in the future. Furthermore, the railport processes where automation and autonomization make the greatest impact, as well as hurdles and challenges in the introduction of new technologies, were surveyed. Hence, further potential use cases of highly automated and autonomous applications could be identified, and expectations were mapped. As a result, a highly detailed and practice-based roadmap towards a ‘terminal 4.0’ was developed.Keywords: highly automated driving, autonomous driving, SAE level 4, railport operations, container depot, intermodal logistics, potentials of autonomization
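The kind of process-efficiency comparison described above can be sketched as follows; the cycle times are invented placeholders rather than field-test data, and the script only illustrates a before/after comparison of mean shuttle cycle times.

```python
# Hypothetical sketch of the kind of analysis applied to field-test logs:
# comparing mean gate-to-gate cycle times of automated vs. manual shuttle runs.
# The numbers below are invented for illustration; they are not field-test data.
from statistics import mean, stdev

manual_cycle_min = [42.0, 39.5, 45.2, 41.1, 44.0, 40.3]      # placeholder values
automated_cycle_min = [36.2, 35.8, 37.9, 36.5, 38.1, 35.4]   # placeholder values

m_manual, m_auto = mean(manual_cycle_min), mean(automated_cycle_min)
gain_pct = (m_manual - m_auto) / m_manual * 100

print(f"manual:    {m_manual:.1f} +/- {stdev(manual_cycle_min):.1f} min")
print(f"automated: {m_auto:.1f} +/- {stdev(automated_cycle_min):.1f} min")
print(f"process efficiency gain: {gain_pct:.1f} %")
```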
Procedia PDF Downloads 78116 Geovisualisation for Defense Based on a Deep Learning Monocular Depth Reconstruction Approach
Authors: Daniel R. dos Santos, Mateus S. Maldonado, Estevão J. R. Batista
Abstract:
Military commanders are increasingly dependent on spatial awareness: knowing where the enemy is, understanding how battle scenarios change over time, and visualizing these trends in ways that offer insights for decision-making. Thanks to advancements in geospatial technologies and artificial intelligence algorithms, commanders are now able to modernize military operations on a universal scale. Thus, geovisualisation has become an essential asset in the defense sector. It has become indispensable for better decision-making in dynamic/temporal scenarios, operation planning and management for the war field, situational awareness, effective planning, monitoring, and more. For example, a 3D visualization of war field data contributes to intelligence analysis, evaluation of post-mission outcomes, and the creation of predictive models to enhance decision-making and strategic planning capabilities. However, old-school visualization methods are slow, expensive, and unscalable. Despite modern technologies for generating 3D point clouds, such as LiDAR and stereo sensors, monocular depth estimation based on deep learning can offer a faster and more detailed view of the environment, transforming single images into visual information for valuable insights. We propose a dedicated monocular depth reconstruction approach via deep learning techniques for 3D geovisualisation of satellite images. It introduces scalability in terrain reconstruction and data visualization. First, a dataset with more than 7,000 satellite images and an associated digital elevation model (DEM) is created. It is based on high-resolution optical and radar imagery collected from Planet and Copernicus, on which we fuse high-resolution topographic data obtained using technologies such as LiDAR and the associated geographic coordinates. Second, we developed an imagery-DEM fusion strategy that combines feature maps from two encoder-decoder networks. One network is trained with radar and optical bands, while the other is trained with DEM features to compute dense 3D depth. Finally, we constructed a benchmark with sparse depth annotations to facilitate future research. To demonstrate the proposed method's versatility, we evaluated its performance on non-annotated satellite images and implemented an enclosed environment useful for geovisualisation applications. The algorithms were developed in Python 3.0, employing open-source computing libraries, i.e., Open3D, TensorFlow, and PyTorch3D. The proposed method provides fast and accurate decision-making with GIS for localization of troops, position of the enemy, and terrain and climate conditions. This analysis enhances situational awareness, enabling commanders to fine-tune strategies and distribute resources proficiently.Keywords: depth, deep learning, geovisualisation, satellite images
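One plausible way to realize the described imagery-DEM fusion is sketched below in PyTorch; the architecture, channel counts, and layer choices are assumptions for illustration and do not reproduce the authors' exact network.

```python
# Minimal PyTorch sketch (an assumed realization of the described imagery-DEM
# fusion, not the authors' architecture): two small encoder branches whose
# feature maps are concatenated before a shared decoder predicts dense depth.
import torch
import torch.nn as nn


def conv_block(in_ch, out_ch):
    return nn.Sequential(
        nn.Conv2d(in_ch, out_ch, 3, padding=1), nn.BatchNorm2d(out_ch), nn.ReLU(),
        nn.Conv2d(out_ch, out_ch, 3, padding=1), nn.BatchNorm2d(out_ch), nn.ReLU(),
    )


class FusionDepthNet(nn.Module):
    def __init__(self, img_channels=4, dem_channels=1, base=32):
        super().__init__()
        # Encoder for optical + radar bands
        self.img_enc = nn.Sequential(conv_block(img_channels, base), nn.MaxPool2d(2),
                                     conv_block(base, base * 2), nn.MaxPool2d(2))
        # Encoder for DEM-derived features
        self.dem_enc = nn.Sequential(conv_block(dem_channels, base), nn.MaxPool2d(2),
                                     conv_block(base, base * 2), nn.MaxPool2d(2))
        # Decoder operating on the fused (concatenated) feature maps
        self.decoder = nn.Sequential(
            conv_block(base * 4, base * 2), nn.Upsample(scale_factor=2, mode="bilinear"),
            conv_block(base * 2, base), nn.Upsample(scale_factor=2, mode="bilinear"),
            nn.Conv2d(base, 1, 1),  # dense depth map
        )

    def forward(self, img, dem):
        fused = torch.cat([self.img_enc(img), self.dem_enc(dem)], dim=1)
        return self.decoder(fused)


if __name__ == "__main__":
    net = FusionDepthNet()
    img = torch.randn(1, 4, 256, 256)   # optical + radar bands
    dem = torch.randn(1, 1, 256, 256)   # elevation raster
    print(net(img, dem).shape)          # torch.Size([1, 1, 256, 256])
```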
Procedia PDF Downloads 8115 Planning for Location and Distribution of Regional Facilities Using Central Place Theory and Location-Allocation Model
Authors: Danjuma Bawa
Abstract:
This paper aims to explore the capabilities of the Location-Allocation model in complementing the strides of existing physical planning models in the location and distribution of facilities for regional consumption. The paper was designed to provide a blueprint to the Nigerian government and other donor agencies, especially the federal government's Fertilizer Distribution Initiative (FDI), for the revitalization of terrorism-ravaged regions. Theoretical underpinnings of central place theory related to spatial distribution, interrelationships, and threshold prerequisites were reviewed. The study showcased how the Location-Allocation Model (L-AM) alongside Central Place Theory (CPT) was applied in a Geographic Information System (GIS) environment to map and analyze the spatial distribution of settlements, examine their physical and economic interrelationships, and explore their hierarchical and opportunistic influences. The study was purely spatial qualitative research which largely used secondary data such as the spatial location and distribution of settlements, population figures of settlements, the network of roads linking them, and other landform features. These were sourced from government ministries and an open-source consortium. GIS was used as a tool for processing and analyzing such spatial features within the framework of CPT and L-AM to produce a comprehensive spatial digital plan for equitable and judicious location and distribution of fertilizer depots in the study area in an optimal way. Population threshold was used as a yardstick for selecting suitable settlements that could stand as service centers to other hinterlands; this was accomplished using the query syntax in ArcMap™. The ArcGIS™ Network Analyst was used to conduct location-allocation analysis for apportioning groups of settlements around such service centers within a given threshold distance. Most of the techniques and models previously used by utility planners have been centered on straight-line (Euclidean) distance to settlements. Such models neglect impedance cutoffs and the routing capabilities of networks. CPT and L-AM take into consideration both the influential characteristics of settlements and their routing connectivity. The study was undertaken in two terrorism-ravaged Local Government Areas of Adamawa State. Four (4) existing depots in the study area were identified. 20 more depots in 20 villages were proposed using suitability analysis. Out of the 300 settlements mapped in the study area, about 280 were optimally grouped and allocated to the selected service centers, respectively, within a 2 km impedance cutoff. This study complements the giant strides by the federal government of Nigeria by providing a blueprint for ensuring proper distribution of these public goods in the spirit of bringing succor to the terrorism-ravaged populace. This will at the same time help boost agricultural activities, thereby lowering food shortages and raising per capita income, as espoused by the government.Keywords: central place theory, GIS, location-allocation, network analysis, urban and regional planning, welfare economics
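The two analysis steps described above, threshold-based selection of service centers and allocation of the remaining settlements within an impedance cutoff, can be illustrated with the following simplified Python sketch; the coordinates and populations are invented, and straight-line distance stands in for the road-network impedance used in the actual ArcGIS Network Analyst workflow.

```python
# Conceptual sketch (with invented data) of the two steps described above:
# (1) select settlements whose population exceeds a threshold as service centers,
# (2) allocate the remaining settlements to the nearest center within a 2 km cutoff.
# The study itself used ArcGIS Network Analyst with road-network impedance; this
# simplified stand-in uses straight-line distance purely to illustrate the logic.
from math import hypot

settlements = [  # (name, x_km, y_km, population) - hypothetical values
    ("A", 0.0, 0.0, 5200), ("B", 1.2, 0.4, 800), ("C", 0.5, 1.5, 650),
    ("D", 6.0, 6.0, 7400), ("E", 6.8, 5.1, 900), ("F", 3.2, 3.0, 400),
]
POP_THRESHOLD = 5000   # yardstick for selecting service centers
CUTOFF_KM = 2.0        # impedance cutoff

centers = [s for s in settlements if s[3] >= POP_THRESHOLD]
others = [s for s in settlements if s[3] < POP_THRESHOLD]

allocation = {}
for name, x, y, _ in others:
    dist, center = min((hypot(x - cx, y - cy), cname) for cname, cx, cy, _ in centers)
    allocation[name] = center if dist <= CUTOFF_KM else None  # None = unallocated

print(allocation)  # {'B': 'A', 'C': 'A', 'E': 'D', 'F': None}
```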
Procedia PDF Downloads 147114 Automated System: Managing the Production and Distribution of Radiopharmaceuticals
Authors: Shayma Mohammed, Adel Trabelsi
Abstract:
Radiopharmacy is the art of preparing high-quality, radioactive, medicinal products for use in diagnosis and therapy. Unlike normal medicines, radiopharmaceuticals have a dual aspect (radioactive and medicinal) that makes their management highly critical. One of the most convincing applications of modern technologies is the ability to delegate the execution of repetitive tasks to programming scripts. Automation has found its way into the most skilled jobs, improving a company's overall performance by allowing human workers to focus on tasks more important than document filling. This project aims to contribute to implementing a comprehensive system to ensure rigorous management of radiopharmaceuticals through a platform that links the Nuclear Medicine Service Management System to the Nuclear Radiopharmacy Management System in accordance with the recommendations of the World Health Organization (WHO) and the International Atomic Energy Agency (IAEA). In this project, we attempt to build a web application that targets radiopharmacies; the platform is built atop the inherently compatible web stack, which allows it to work in virtually any environment. Different technologies are used in this project (PHP, Symfony, MySQL Workbench, Bootstrap, Angular 7, Visual Studio Code and TypeScript). The operating principle of the platform is mainly based on two parts: a Radiopharmaceutical Backoffice for the radiopharmacist, who is responsible for the realization of radiopharmaceutical preparations and their delivery, and a Medical Backoffice for the doctor, who holds the authorization for the possession and use of radionuclides and is responsible for ordering radioactive products. The application consists of seven modules: Production, Quality Control/Quality Assurance, Release, General Management, References, Transport and Stock Management. It allows eight classes of users: the Production Manager (PM), Quality Control Manager (QCM), Stock Manager (SM), General Manager (GM), Client (Doctor), Parking and Transport Manager (PTM), Qualified Person (QP) and Technical and Production Staff. As a digital platform bringing together all players involved in the use of radiopharmaceuticals and integrating the stages of preparation, production and distribution, web technologies, in particular, promise to offer all the benefits of automation while requiring no more than a web browser to act as the user client, which is a strength because the web stack is by nature multi-platform. This platform will provide a traceability system for radiopharmaceutical products to ensure the safety and radioprotection of actors and of patients. The new integrated platform is an alternative to writing all the boilerplate paperwork manually, which is a tedious and error-prone task. It would minimize manual human manipulation, which has proven to be the main source of error in nuclear medicine. A codified electronic transfer of information from radiopharmaceutical preparation to delivery will further reduce the risk of maladministration.Keywords: automated system, management, radiopharmacy, technical papers
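A language-neutral sketch of the kind of traceability record that such a platform could pass between its modules is given below; the real system is implemented with PHP/Symfony and Angular, and the field names and statuses here are illustrative assumptions rather than the platform's actual schema.

```python
# Illustrative sketch of a traceability record moving between modules
# (the real system is PHP/Symfony + Angular; names here are assumptions,
# not the platform's actual data model).
from dataclasses import dataclass, field
from datetime import datetime
from enum import Enum


class Status(Enum):
    ORDERED = "ordered"          # placed by the Doctor (Medical Backoffice)
    PREPARED = "prepared"        # Production module
    QC_RELEASED = "qc_released"  # Quality Control / Release modules
    IN_TRANSIT = "in_transit"    # Transport module
    DELIVERED = "delivered"


@dataclass
class RadiopharmaceuticalOrder:
    order_id: str
    radionuclide: str            # e.g. "Tc-99m"
    activity_mbq: float          # prescribed activity
    prescribing_doctor: str
    status: Status = Status.ORDERED
    history: list = field(default_factory=list)  # audit trail for traceability

    def advance(self, new_status: Status, actor: str) -> None:
        """Record every state change with its actor and timestamp."""
        self.history.append((datetime.now(), actor, new_status))
        self.status = new_status


order = RadiopharmaceuticalOrder("ORD-001", "Tc-99m", 740.0, "Dr. X")
order.advance(Status.PREPARED, "Production Manager")
order.advance(Status.QC_RELEASED, "Quality Control Manager")
print(order.status, len(order.history))  # Status.QC_RELEASED 2
```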
Procedia PDF Downloads 156113 Platform Virtual for Joint Amplitude Measurement Based in MEMS
Authors: Mauro Callejas-Cuervo, Andrea C. Alarcon-Aldana, Andres F. Ruiz-Olaya, Juan C. Alvarez
Abstract:
Motion capture (MC) is the construction of a precise and accurate digital representation of a real motion. MC systems have been used in recent years in a wide range of applications, from film special effects and animation, interactive entertainment, and medicine to highly competitive sport, where maximum performance and low injury risk during training and competition are sought. This paper presents an inertial and magnetic sensor-based technological platform intended for joint amplitude monitoring and telerehabilitation processes, with an efficient compromise between cost and technical considerations. The particularities of our platform offer high social impact possibilities by making telerehabilitation accessible to large population sectors in marginal socio-economic conditions, especially in underdeveloped countries where, in contrast to developed countries, specialists are scarce and high technology is unavailable or nonexistent. This platform integrates high-resolution, low-cost inertial and magnetic sensors with adequate user interfaces and communication protocols to provide a diagnosis service over the web or other available communication networks. The amplitude information is generated by the sensors and then transferred to a computing device with adequate interfaces to make it accessible to inexperienced personnel, providing high social value. Amplitude measurements of the virtual platform system presented a good fit to the respective reference system. Analyzing the robotic arm results (estimation error RMSE 1 = 2.12° and estimation error RMSE 2 = 2.28°), it can be observed that during arm motion in either sense, the estimation error is negligible; in fact, error appears only during sense inversion, which can easily be explained by the nature of inertial sensors and their relation to acceleration. Inertial sensors present a time-constant delay which acts as a first-order filter, attenuating signals at large acceleration values, as is the case for a change of sense in motion. A damped response of the virtual platform can be seen in other images, where the error analysis shows that amplitude is underestimated at maximum amplitude, whereas it is overestimated at minimum amplitude. This work presents and describes the virtual platform as a motion capture system suitable for telerehabilitation, with the cost-quality and precision-accessibility relations optimized. These characteristics, achieved by efficiently using the state of the art of accessible generic technology in sensors and hardware, together with adequate software for capture, transmission, analysis and visualization, provide the capacity to offer good telerehabilitation services, reaching large, more or less marginal populations where technologies and specialists are not available but which are reachable through basic communication networks.Keywords: inertial sensors, joint amplitude measurement, MEMS, telerehabilitation
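The error metric quoted above, the RMSE between the platform's joint-angle estimates and the reference system, can be computed with a few lines of Python; the angle series below are invented for illustration.

```python
# Minimal sketch of the quoted error metric: RMSE between estimated and reference
# joint angles. The angle values below are invented, not the study's measurements.
import math

def rmse(estimated, reference):
    """Root-mean-square error between two equally long angle series (degrees)."""
    return math.sqrt(sum((e - r) ** 2 for e, r in zip(estimated, reference))
                     / len(reference))

reference_deg = [0, 15, 30, 45, 60, 75, 90, 75, 60, 45, 30, 15, 0]
estimated_deg = [0.5, 14.2, 31.1, 44.0, 61.3, 74.1, 88.6, 76.4, 58.9, 46.2, 29.0, 16.1, 1.2]

print(f"RMSE = {rmse(estimated_deg, reference_deg):.2f} deg")
```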
Procedia PDF Downloads 259112 Comparative Analysis of Mechanical Properties of Paddy Rice for Different Variety-Moisture Content Interactions
Authors: Johnson Opoku-Asante, Emmanuel Bobobee, Joseph Akowuah, Eric Amoah Asante
Abstract:
In recent years, the issue of postharvest losses has become a serious concern in Sub-Saharan Africa. Postharvest technology development and adaptation need urgent attention, particularly for small- and medium-scale rice farmers in Africa. However, to better develop any postharvest technology, knowledge of the mechanical properties of different varieties of paddy rice is vital. There is also the issue of the development of new rice cultivars. The objectives of this research are to (1) determine the mechanical properties of the selected paddy rice varieties at varying moisture content, (2) conduct a comparative analysis of the mechanical properties of selected paddy rice for different variety-moisture content interactions, and (3) determine the significant statistical differences between the mean values of the various variety-moisture content interactions. The mechanical properties of AGRA rice, CRI-Amankwatia, CRI-Enapa and CRI-Dartey, four local varieties developed by the Crop Research Institute (CRI) of Ghana, are compared at 11.5%, 13.0% and 16.5% dry-basis moisture content. The mechanical properties measured are sphericity, aspect ratio, grain mass, 1000-grain mass, bulk density, true density, porosity and angle of repose. Samples were collected from the Kwadaso Agric College of the CRI in Kumasi. The samples were threshed manually and winnowed before conducting the experiment. The moisture content was determined on a dry basis using the Moistex Screw-Type Digital Grain Moisture Meter. Other equipment used for data collection were vernier calipers and a Citizen electronic scale. A 4×3 factorial arrangement was used in a completely randomized design with three replications. Tukey's HSD comparison test was conducted during data analysis to compare all possible pairwise combinations of the various variety-moisture content interactions. From the results, it was concluded that sphericity ranged from 0.391 to 0.377 for CRI-Dartey at 16.5% and CRI-Enapa at 13.5%, respectively, whereas aspect ratio ranged from 0.298 to 0.269 for CRI-Dartey at 16.5% and CRI-Enapa at 13.5%, respectively. For grain mass, AGRA rice at 13.0% recorded the highest score of 0.0312 g and CRI-Enapa at 13.0% obtained the lowest score of 0.0237 g. For the 1000-grain mass, values ranged from 29.33 g for CRI-Amankwatia at 16.5% moisture content to 22.54 g for CRI-Enapa at 16.5%. Bulk density ranged from 654.0 kg/m³ for CRI-Amankwatia at 16.5% to 422.9 kg/m³ for CRI-Enapa at 11.5% as the highest and lowest recordings, respectively. It was also observed that true density ranged from 1685.8 kg/m³ for AGRA rice at 13.0% moisture content to 1352.5 kg/m³ for CRI-Enapa at 16.5%. In the case of porosity, CRI-Enapa at 11.5% received the highest score of 70.83% and CRI-Amankwatia at 16.5% received the lowest score of 55.88%. Finally, in the case of angle of repose, CRI-Amankwatia at 16.5% recorded the highest score of 47.3° and CRI-Enapa at 11.5% recorded the lowest score of 34.27°. In all cases, the difference in mean value was less than the LSD. This indicates that there were no significant statistical differences between the mean values, meaning that technologies developed and adapted for one variety can equally be used for all the other varieties.Keywords: angle of repose, aspect ratio, bulk density, porosity, sphericity, mechanical properties
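The abstract does not spell out its equations, so the following Python sketch uses the standard definitions commonly applied to grain physical properties (geometric-mean sphericity, width-to-length aspect ratio, and density-based porosity); the paired true density and grain dimensions in the example are assumptions, not the study's raw data.

```python
# Sketch of the standard formulas commonly used for these grain properties; the
# abstract does not state its exact equations, so treat these as assumptions.

def sphericity(length_mm: float, width_mm: float, thickness_mm: float) -> float:
    """Geometric-mean diameter divided by length (dimensionless)."""
    return (length_mm * width_mm * thickness_mm) ** (1 / 3) / length_mm

def aspect_ratio(length_mm: float, width_mm: float) -> float:
    """Width-to-length ratio (dimensionless)."""
    return width_mm / length_mm

def porosity_pct(bulk_density: float, true_density: float) -> float:
    """Porosity (%) from bulk and true density (same units, e.g. kg/m^3)."""
    return (1 - bulk_density / true_density) * 100

# 654.0 kg/m^3 is the reported bulk density for CRI-Amankwatia at 16.5%;
# the paired true density and the grain dimensions below are assumed values.
print(f"porosity   = {porosity_pct(654.0, 1482.0):.1f} %")
print(f"sphericity = {sphericity(9.1, 2.4, 1.9):.3f}")
print(f"aspect     = {aspect_ratio(9.1, 2.4):.3f}")
```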
Procedia PDF Downloads 98111 Impact of Blended Learning in Interior Architecture Programs in Academia: A Case Study of Arcora Garage Academy from Turkey
Authors: Arzu Firlarer, Duygu Gocmen, Gokhan Uysal
Abstract:
There is currently a growing trend among universities towards blended learning. Blended learning is becoming increasingly important in higher education, with the aims of better accomplishing course learning objectives, meeting students' changing needs and promoting effective learning in both the theoretical and practical dimensions of a discipline like interior architecture. However, the practical dimension of the discipline cannot be fully supported in the university environment. During the undergraduate program, the practical training, which is intended to be supported by two different internship programs, cannot fully meet the requirements of blended learning. The shortcoming of the education program frequently expressed by our graduates and employers lies in the practical knowledge and skills dimension of the profession. After a series of meetings for curriculum studies, interviews with the professional chambers, and meetings with interior architects, a gap between the theoretical and practical training modules is seen as a problem in all interior architecture departments. It is thought that this gap can be bridged by a new education model formed through university-industry cooperation within the concept of blended learning. In this context, it is considered that the accumulation of theoretical and applied knowledge can be provided by creating industry-supported educational environments at the university. In the application process of the interior architecture discipline, the use of materials and technical competence will only be possible with the cooperation of industry and the participation of students in production/manufacture processes as observers and practitioners. Wood manufacturing is an important part of interior architecture applications. Wood production is a sustainable structural process where production details, material knowledge, and process details can be observed in the most effective way. From this point of view, after theoretical training about wooden materials, wood applications and production processes is given to the students, practical training for production/manufacture planning is supported by active participation and observation in the processes. With this blended model, we aimed to develop a training model in which theoretical and practical knowledge related to the production of wood works is conveyed in a meaningful, lasting way by means of university-industry cooperation. The project is carried out in Ankara with Arcora Architecture and Furniture Company and the Başkent University Department of Interior Design, where university-industry cooperation is realized. Within the scope of the project, the video of each week's lecture is recorded and prepared for dissemination through digital media platforms such as Udemy. In this sense, the program is developed not only by the project participants but also reaches other institutions and people who are trained and practice in the field of design. Both academics from the university and craftsmen with at least 15 years of experience in the wood, metal and dye sectors are preparing new training reference documents for interior architecture undergraduate programs. These reference documents will be a model for the interior architecture departments of other universities and will be used to create an online education module.Keywords: blended learning, interior design, sustainable training, effective learning.
Procedia PDF Downloads 136110 The Use of Flipped Classroom as a Teaching Method in a Professional Master's Program in Network, in Brazil
Authors: Carla Teixeira, Diana Azevedo, Jonatas Bessa, Maria Guilam
Abstract:
The flipped classroom is a blended learning modality that combines face-to-face and virtual self-learning activities, mediated by digital information and communication technologies, which reverses traditional teaching approaches and presupposes the previous study of contents by students. In the subsequent face-to-face activities, the contents are discussed, producing active learning. This work aims to describe the systematization process of the use of flipped classrooms as a method to develop complementary national activities in PROFSAÚDE, a professional master's program in the area of public health, offered as a distance learning course by a network of institutions in Brazil. The complementary national activities were organized with the objective of strengthening and qualifying the students' learning process. The network gathers twenty-two public institutions of higher education in the country. Its national coordination conducted a survey to detect complementary educational needs, intended to improve the formative process and align important content for the program nationally. The activities were organized both asynchronously, making study materials available in Google Classroom, and synchronously, in a telepresential way, on virtual platforms to reach the largest number of students in the country. The asynchronous activities allowed each student to study at their own pace, and the synchronous activities were intended for deepening and reflecting on the themes. The national team identified professors with relevant areas of expertise, who were contacted to produce audiovisual content such as video classes and podcasts, to provide guidance on supporting bibliographic materials, and to conduct synchronous activities together with the technical team. The contents posted in the virtual classroom were organized in modules and made available before the synchronous meeting; these modules, in turn, contain "pills of experience" that correspond to reports of teachers' experiences in relation to the different themes. In addition, an activity was proposed, with questions aimed at exposing doubts about the contents and a learning challenge as a practical exercise. Synchronous activities are built with different invited teachers, based on the participants' discussions, and are the forum where teachers can answer students' questions, providing feedback on the learning process. At the end of each complementary activity, an evaluation questionnaire is made available. The analysis of the responses shows that this institutional network experience, as a pedagogical innovation, provides important tools to support teaching and research due to its potential in the participatory construction of learning, the optimization of resources, the democratization of knowledge, and the sharing and strengthening of practical experiences across the network. One of its relevant aspects was the thematic diversity addressed through this method.Keywords: active learning, flipped classroom, network education experience, pedagogic innovation
Procedia PDF Downloads 159109 Temporal Changes Analysis (1960-2019) of a Greek Rural Landscape
Authors: Stamatia Nasiakou, Dimitrios Chouvardas, Michael Vrahnakis, Vassiliki Kleftoyanni
Abstract:
Recent research in the mountainous and semi-mountainous rural landscapes of Greece shows that they have changed significantly over the last 80 years. These changes take the form of structural modification of land cover/use patterns, the main characteristic being the extensive expansion of dense forests and shrubs at the expense of grasslands and extensive agricultural areas. The aim of this research was to study the 60-year changes (1960-2019) of land cover/use units in the rural landscape of Mouzaki (Karditsa Prefecture, central Greece). Relevant cartographic material such as forest land use maps, digital maps (Corine Land Cover 2018), 1960 aerial photos from the Hellenic Military Geographical Service, and satellite imagery (Google Earth Pro 2014, 2016, 2017 and 2019) was collected and processed in order to study landscape evolution. ArcGIS v10.2.2 software was used to process the cartographic material and to produce several datasets. The main products of the analysis were a digitized photo-mosaic of the 1960 aerial photographs, a digitized photo-mosaic of recent satellite images (2014, 2016, 2017 and 2019), and diagrams and maps of the temporal transformation of the rural landscape (1960-2019). Maps and diagrams were produced by applying photointerpretation techniques and a suitable land cover/use classification system to the two photo-mosaics. Demographic and socioeconomic inventory data was also collected, mainly from diachronic census reports of the Hellenic Statistical Authority and local sources. Data analysis of the temporal transformation of land cover/use units showed that the changes are mainly located in the central and south-eastern part of the study area, which mainly includes the mountainous part of the landscape. The most significant change is the expansion of the dense forests that currently dominate the southern and eastern part of the landscape. In conclusion, the produced diagrams and maps of land cover/use evolution suggest that woody vegetation in the rural landscape of Mouzaki has significantly increased over the past 60 years at the expense of open areas, especially grasslands and agricultural areas. Demographic changes, land abandonment and the transformation of traditional farming practices (e.g., agroforestry) were recognized as the main causes of the landscape change. This study is part of a broader research project entitled “Perspective of Agroforestry in Thessaly region: A research on social, environmental and economic aspects to enhance farmer participation”. The project is funded by the General Secretariat for Research and Technology (GSRT) and the Hellenic Foundation for Research and Innovation (HFRI).Keywords: Agroforestry, Forest expansion, Land cover/use changes, Mountainous and semi-mountainous areas
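The change analysis described above is essentially a cross-tabulation of the 1960 and 2019 classifications; the following sketch builds such a transition matrix from two tiny invented class rasters, purely to illustrate the method.

```python
# Illustrative sketch of the change analysis: cross-tabulating the 1960 and 2019
# land cover/use classifications into a transition matrix. The class rasters
# below are tiny invented arrays, not the study's data.
import numpy as np

CLASSES = ["forest", "shrubland", "grassland", "agriculture"]

lc_1960 = np.array([[2, 2, 3, 3],
                    [1, 2, 3, 3],
                    [0, 1, 2, 3]])   # class indices into CLASSES
lc_2019 = np.array([[0, 1, 1, 3],
                    [0, 0, 1, 3],
                    [0, 0, 1, 2]])

n = len(CLASSES)
transition = np.zeros((n, n), dtype=int)
for a, b in zip(lc_1960.ravel(), lc_2019.ravel()):
    transition[a, b] += 1   # rows: 1960 class, columns: 2019 class

print(transition)
# e.g. transition[2, 1] counts grassland pixels that became shrubland by 2019
```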
Procedia PDF Downloads 107108 Nursing Education in the Pandemic Time: Case Study
Authors: Jaana Sepp, Ulvi Kõrgemaa, Kristi Puusepp, Õie Tähtla
Abstract:
COVID-19, which emerged in late 2019, was officially recognized as a pandemic by the WHO in March 2020, and this has led to changes in the education sector. Educational institutions were closed, and most schools adopted distance learning. Estonia is known as a digitally well-developed country. Based on that, nursing education continued during the pandemic, and new technological solutions were implemented. To provide nursing education, special focus was placed on quality and flexibility. The aim of this paper is to present the administrative, digital, and technological solutions which supported Estonian nursing educators in continuing the study process during the pandemic and to develop a sustainable solution for nursing education for the future. This paper includes the authors' analysis of the documents and decisions implemented in the institutions during the pandemic. It is a case study of Estonian nursing educators. Results of the analysis show that the implementation of distance learning principles challenged institutions to develop innovative strategies and techniques for the assessment of student performance and educational outcomes and to implement new strategies to encourage student engagement in the virtual classroom. Additionally, hospital internships were canceled, and the simulation approach was extensively implemented as a new opportunity to develop and assess students' practical skills. Many other technical and administrative changes were also carried out, such as student support and assessment systems and the design and delivery of hybrid and blended studies. All services were redesigned and made more available, individual, and flexible. The feedback system was also changed, with information collected in parallel with educational activities. Experiences of nursing education during the pandemic are widely presented in the scientific literature. However, to conclude our study, the authors found evidence that the solutions implemented in Estonian nursing education allowed the students to graduate within the nominal study period without any decline in education quality. An operative information system and flexibility kept the distance between students, support staff, and academic staff to a minimum, and likewise, the changes were implemented quickly and efficiently. Institution members were kept updated with the appropriate information, which positively affected their satisfaction, motivation, and commitment. We recommend that the feedback process and system be permanently changed in the future to place all members in the same information space, that the hospital internship process be redefined, that hybrid learning be implemented, and that the communication system between stakeholders inside and outside the organization be improved. The main limitation of this study relates to the size of Estonia. Nursing education is provided by only two institutions, and similarly, the number of students is low. The results could be generalized to institutions of a similar size and administrative system. In the future, the relationship between nurses' performance and organizational outcomes should be investigated in depth, and the influence of pandemic-time education should be analyzed at workplaces.Keywords: hybrid learning, nursing education, nursing, COVID-19
Procedia PDF Downloads 120107 Characterization of Alloyed Grey Cast Iron Quenched and Tempered for a Smooth Roll Application
Authors: Mohamed Habireche, Nacer E. Bacha, Mohamed Djeghdjough
Abstract:
In the brick industry, smooth double roll crushers are used for medium and fine crushing of soft to medium-hard material. Due to the opposite inward rotation of the rolls, the feed material is nipped between the rolls and crushed by compression. The rolls are subject to intense wear, known as three-body abrasion, due to the action of abrasive products. The production downtime affecting productivity stems from two sources: the bi-monthly rectification of the roll crushers and their replacement when they are completely worn out. Choosing the right material for the roll crushers should result in longer machine cycles and reduced repair and maintenance costs. All roll crushers are imported from outside Algeria. This sometimes results in very long delivery times, which handicap the brickyards, in particular in meeting delivery deadlines and honoring customers' orders. The aim of this work is to investigate the effect of alloying additions on the microstructure and wear behavior of grey lamellar cast iron for smooth roll crushers in the brick industry. The base grey iron was melted in a low-frequency induction furnace at a temperature of 1500 °C, in which return cast iron scrap, new cast iron ingot, and steel scrap were added to the melt to generate the desired composition. The chemical analysis of the bar samples was carried out using an Emission Spectrometer Systems PV 8050 Series (Philips), except for carbon, for which a carbon/sulphur analyser Elementrac CS-i was used. Unetched microstructures were used to evaluate the graphite flake morphology using the image comparison measurement method. At least five different fields were selected for quantitative estimation of phase constituents. The samples were observed under X100 magnification with a Zeiss Axiovert 40 MAT optical microscope equipped with a digital camera. An SEM equipped with EDS was used to characterize the phases present in the microstructure. The hardness (750 kg load, 5 mm diameter ball) was measured with a Brinell testing machine for both treated and as-solidified test pieces. The test bars were used for tensile strength and metallographic evaluations. Mechanical properties were evaluated using tensile specimens made as per ASTM E8 standards. Two specimens were tested for each alloy. From each rod, a test piece was made for the tensile test. The results showed that, among the quenched and tempered alloys, the alloyed grey cast iron (containing 0.62% Mn, 0.68% Cr, and 1.09% Cu) had the best wear resistance at 400 °C due to fine carbides in the tempered matrix. In the quenched and tempered condition, increasing the Cu content in the cast irons improved their wear resistance moderately. The combined addition of Cu and Cr increases the hardness and wear resistance of a quenched and tempered hypoeutectic grey cast iron.Keywords: casting, cast iron, microstructure, heat treating
Procedia PDF Downloads 105106 3D Design of Orthotic Braces and Casts in Medical Applications Using Microsoft Kinect Sensor
Authors: Sanjana S. Mallya, Roshan Arvind Sivakumar
Abstract:
Orthotics is the branch of medicine that deals with the provision and use of artificial casts or braces to alter the biomechanical structure of the limb and provide support for it. Custom-made orthoses provide more comfort and can correct issues better than those available over the counter. However, they are expensive and require intricate modelling of the limb. Traditional methods of modelling involve creating a plaster of Paris mould of the limb. Lately, CAD/CAM and 3D printing processes have improved the accuracy and reduced the production time. We propose a system to model the limb using the Microsoft Kinect2 sensor. The Kinect can capture RGB and depth frames simultaneously at up to 30 fps with sufficient accuracy. The region of interest is captured from three views, each shifted by 90 degrees. The RGB and depth data are fused into a single RGB-D frame. The resolution of the RGB frame is 1920 px x 1080 px, while the resolution of the depth frame is 512 px x 424 px. As the resolutions of the frames are not equal, RGB pixels are mapped onto the depth pixels to make sure data is not lost even though the resolution is lower. The resulting RGB-D frames are collected and, using the depth coordinates, a three-dimensional point cloud is generated for each view of the Kinect sensor. A common reference system was developed to merge the individual point clouds from the Kinect sensors. The reference system consisted of 8 coloured cubes, connected by rods to form a skeleton cube with the coloured cubes at the corners. For each Kinect, the region of interest is the square formed by the centres of the four cubes facing the Kinect. The point clouds are merged by considering one of the cubes as the origin of a reference system. Depending on the relative distance from each cube, the three-dimensional coordinate points from each point cloud are aligned to the reference frame to give a complete point cloud. The RGB data is used to correct any errors in the depth data for the point cloud. A triangular mesh is generated from the point cloud by applying Delaunay triangulation, which generates the rough surface of the limb. This technique forms an approximation of the surface of the limb. The mesh is smoothed to obtain a smooth outer layer and give an accurate model of the limb. The model of the limb is used as a base for designing the custom orthotic brace or cast. It is transferred to a CAD/CAM design file for the design of the brace above the surface of the limb. The proposed system would be more cost-effective than current systems that use MRI or CT scans for generating 3D models and would be quicker than traditional plaster of Paris cast modelling; the overall setup time is also low. Preliminary results indicate that the accuracy of the Kinect2 is satisfactory for performing modelling.Keywords: 3d scanning, mesh generation, Microsoft kinect, orthotics, registration
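The geometric core of the pipeline (back-projecting a depth frame to a point cloud, moving each view into the common reference frame, and triangulating the surface) can be sketched as follows; the intrinsics, transform, and toy depth patch are assumptions, and a 2.5D Delaunay triangulation stands in for the full surface reconstruction described above.

```python
# Simplified sketch of the pipeline's geometric steps (the intrinsics, transform
# and toy depth patch are made up): back-project a depth frame to 3D points, move
# each view into the common reference frame, then triangulate. A 2.5D Delaunay
# triangulation over (x, y) is used here as a stand-in for the full surface mesh.
import numpy as np
from scipy.spatial import Delaunay

# Assumed pinhole intrinsics for the 512x424 depth frame (not Kinect calibration data)
FX, FY, CX, CY = 365.0, 365.0, 256.0, 212.0

def depth_to_points(depth_m: np.ndarray) -> np.ndarray:
    """Back-project a depth image (metres) to an Nx3 point cloud in camera space."""
    v, u = np.nonzero(depth_m > 0)
    z = depth_m[v, u]
    x = (u - CX) * z / FX
    y = (v - CY) * z / FY
    return np.column_stack([x, y, z])

def to_reference_frame(points: np.ndarray, R: np.ndarray, t: np.ndarray) -> np.ndarray:
    """Apply the rigid transform estimated from the coloured reference cubes."""
    return points @ R.T + t

# Toy depth patch standing in for one Kinect view
depth = np.zeros((424, 512))
depth[200:220, 240:260] = 0.8 + 0.01 * np.random.rand(20, 20)

pts = depth_to_points(depth)
R, t = np.eye(3), np.zeros(3)          # identity: this view defines the reference frame
pts_ref = to_reference_frame(pts, R, t)

tri = Delaunay(pts_ref[:, :2])         # 2.5D triangulation of the surface patch
print(pts_ref.shape, tri.simplices.shape)  # (400, 3) (number_of_triangles, 3)
```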
Procedia PDF Downloads 190