Search results for: semantic web technologies
2815 Determinants of Carbon-Certified Small-Scale Agroforestry Adoption in Rural Mount Kenya
Authors: Emmanuel Benjamin, Matthias Blum
Abstract:
Purpose – We address smallholder farmers’ restricted possibilities to adopt sustainable technologies which have direct and indirect benefits. Smallholders often have little asset endowment due to small farm sizes and insecure property rights, and therefore experience constraints in adopting agricultural innovation. A program involving payments for ecosystem services (PES) benefits poor smallholder farmers in developing countries in many ways and has been suggested as a means of easing smallholder farmers’ financial constraints. PES may also provide an additional income stream, which can eventually result in more favorable credit contract terms due to the availability of a collateral substitute. Results of this study may help to understand the barriers, motives and incentives for smallholders’ participation in PES and help in designing a strategy to foster participation in beneficial programs. Design/methodology/approach – This paper uses a random utility model and a logistic regression approach to investigate factors that influence agroforestry adoption. We investigate non-monetary factors, such as information spillover, that influence the decision to adopt such conservation strategies. We collected original data from non-government-run agroforestry mitigation programs with PES that have been implemented in the Mount Kenya region. Preliminary Findings – We find that spread of information, existing networks and peer involvement in such programs drive participation. Conversely, participation by smallholders does not seem to be influenced by education, land or asset endowment. Contrary to some existing literature, we found weak evidence for a positive correlation between the adoption of agroforestry with PES and the age of the smallholder, i.e., adoption increases with age, in the Mount Kenya region.
Research implications – Poverty alleviation policies for developing countries should target social capital to increase the adoption rate of modern technologies amongst smallholders.
Keywords: agriculture innovation, agroforestry adoption, smallholders, payment for ecosystem services, Sub-Saharan Africa
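As an illustration of the logistic regression approach described in this abstract, the sketch below fits a binary adoption model by gradient descent on toy data. The predictors (peer involvement, a scaled education index) and all values are hypothetical, not the study's data; this is a minimal sketch of the method, not the authors' implementation.

```python
import math

def sigmoid(z):
    """Logistic link mapping a linear score to an adoption probability."""
    return 1.0 / (1.0 + math.exp(-z))

def fit_logistic(X, y, lr=0.1, epochs=2000):
    """Fit a logistic regression by batch gradient descent.
    X: list of feature rows, y: list of 0/1 adoption outcomes."""
    n_features = len(X[0])
    w = [0.0] * n_features
    b = 0.0
    m = len(X)
    for _ in range(epochs):
        grad_w = [0.0] * n_features
        grad_b = 0.0
        for xi, yi in zip(X, y):
            p = sigmoid(sum(wj * xj for wj, xj in zip(w, xi)) + b)
            err = p - yi  # gradient of the log-loss w.r.t. the linear score
            for j in range(n_features):
                grad_w[j] += err * xi[j]
            grad_b += err
        w = [wj - lr * gj / m for wj, gj in zip(w, grad_w)]
        b -= lr * grad_b / m
    return w, b

# Hypothetical rows: [peer_involvement (0/1), education index scaled to ~[0, 1.5]].
X = [[1, 0.8], [1, 0.4], [1, 1.2], [0, 1.0], [0, 0.6], [0, 1.4]]
y = [1, 1, 1, 0, 0, 0]  # 1 = adopted agroforestry with PES
w, b = fit_logistic(X, y)
```

In this toy data adoption tracks peer involvement, so the fitted coefficient on that predictor comes out positive, mirroring the abstract's finding that peer involvement drives participation while education does not.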
Procedia PDF Downloads 381
2814 Arts and Cultural Heritage Digitalization in Nigeria: Problems and Prospects
Authors: Okechukwu Uzoma Nkwocha, Edward Uche Omeire
Abstract:
Information and communication technologies (ICT) have undeniably expanded the sphere of arts and creativity. ICT proves to be an important tool for the production, preservation, sharing and utilization of arts and cultural heritage. While art and heritage institutions around the globe are increasingly utilizing ICT for the promotion and sharing of their collections, the story seems different in most parts of Africa. In this paper, we will examine the prospects and problems of utilizing ICT in the promotion, preservation and sharing of arts and cultural heritage.
Keywords: arts, cultural heritage, digitalization, ICT
Procedia PDF Downloads 192
2813 From Makers to Maker Communities: A Survey on Turkish Makerspaces
Authors: Dogan Can Hatunoglu, Cengiz Hakan Gurkanlı, Hatice Merve Demirci
Abstract:
Today, the maker movement is regarded as a socio-cultural movement that represents designing and building objects for innovation. In the creativity-based activities of the movement, individuals from different backgrounds, such as inventors, programmers, craftspeople, DIY’ers, tinkerers, engineers, designers, and hackers, form a community and work collaboratively on mutual, open-source innovations. Today, with the accessibility of recently emerged technologies and digital fabrication tools, the Maker Movement is continuously expanding its scope; it has evolved into a new experience and, for many, is now considered a new kind of industrial revolution. In this new experience, makers create new things within their community by using new digital tools and technologies in spots called makerspaces. In these makerspaces, activities of learning, experience sharing, and mentoring evolve into maker events. Makers who share common interests in making benefit from makerspaces as meeting and working spots. In the literature, there are many sources on the Maker Movement, maker communities, and their activities, especially in the field of business administration. However, there is a gap in the literature about the maker communities in Turkey. This research aims to be an information source on the dynamics and process design of “making” activities in Turkish maker communities and also aims to provide insights to sustain and enhance local maker communities in the future. Within this aim, semi-structured interviews were conducted with founders and facilitators from selected Turkish maker communities.
(1) The perception towards the Maker Movement, makers, the activity of making, and the current situation of maker communities, (2) the motivations of individuals who participate in the maker communities, and (3) the key drivers (collaboration and decision-making in design processes) of maker activities from the perspectives of the main actors (founders, facilitators) are all examined in depth with questions on personal experiences and perspectives. After a qualitative data analysis concerning the maker communities in Turkey, this research draws two main conclusions, regarding (1) the foundation of the Turkish maker mindset and (2) the emergence of self-sustaining communities.
Keywords: Maker Movement, maker community, makerspaces, open-source design, sustainability
Procedia PDF Downloads 144
2812 Factors Affecting Test Automation Stability and Their Solutions
Authors: Nagmani Lnu
Abstract:
Test automation is a vital requirement for any organization that wants to release products faster to its customers. In most cases, an organization has an approach to developing automation but struggles to maintain it. This results in an increased number of flaky tests, reducing return on investment and stakeholders’ confidence. The challenges multiply when automation targets UI behaviors. This paper describes the approaches taken to identify the root causes of automation instability in an extensive payments application and the best practices to address them using processes, tools, and technologies, resulting in a 75% reduction of effort.
Keywords: automation stability, test stability, Flaky Test, test quality, test automation quality
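As a small, hypothetical illustration of the flaky-test notion central to this abstract (not the paper's actual tooling), a test can be flagged as flaky when repeated CI runs of the same suite record both passes and failures for it:

```python
from collections import defaultdict

def flaky_tests(runs):
    """Flag tests that both passed and failed across repeated runs,
    the usual operational signature of flakiness.
    runs: iterable of (test_name, passed) records."""
    outcomes = defaultdict(set)
    for name, passed in runs:
        outcomes[name].add(passed)
    # A test seen with both outcomes (True and False) is flaky.
    return sorted(name for name, seen in outcomes.items() if len(seen) == 2)

# Hypothetical results from three CI reruns of the same suite.
runs = [
    ("test_login", True), ("test_login", False), ("test_login", True),
    ("test_refund", True), ("test_refund", True), ("test_refund", True),
    ("test_audit", False), ("test_audit", False), ("test_audit", False),
]
```

Here only `test_login` would be flagged: `test_refund` passes consistently and `test_audit` fails consistently, so neither is flaky in this sense.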
Procedia PDF Downloads 84
2811 The Impact of Virtual Learning Strategy on Youth Learning Motivation in Malaysian Higher Learning Institutions
Authors: Hafizah Harun, Habibah Harun, Azlina Kamaruddin
Abstract:
Virtual reality has become a powerful and promising tool in education because of its unique technological characteristics that differentiate it from other ICT applications. Despite the numerous interpretations of its definition, virtual reality can be concisely and precisely described as the integration of computer graphics and various input and display technologies to create the illusion of immersion in a computer-generated reality. Generally, there are two major types, based on the level of interaction and immersion: immersive and non-immersive virtual reality. In a study of the role of virtual reality in built environment education, Horne and Thompson reported that the benefits of using visualization technologies were seen as having the potential to improve and extend the learning process, increase student motivation and awareness, and add to the diversity of teaching methods. Youngblut reported that students enjoy working with virtual worlds and that this experience can be highly motivating. The impact of virtual reality on youth learning in Malaysia is currently not well explored because the technology is still not widely used there. Only a handful of universities, such as University Malaya, MMU, and Unimas, are applying virtual reality strategies in some of their undergraduate programs. From the literature, it has been identified that several virtual reality learning strategies are currently available. Therefore, this study aims to investigate the impact of virtual reality strategy on youth learning motivation in Malaysian higher learning institutions. We will explore the relationship between virtual reality (gaming, laboratory, simulation) and youth learning motivation. Another aspect that we will explore is the framework for virtual reality implementation at higher learning institutions in Malaysia. This study will be carried out quantitatively by distributing questionnaires to respondents from sample universities.
Data analysis will consist of descriptive statistics and multiple regression. The researchers will carry out a pilot test prior to distributing the questionnaires to 300 undergraduate students who are undergoing their courses in a virtual reality environment. The respondents come from two universities, MMU CyberJaya and University Malaya. The expected outcomes from this study are the identification of which virtual reality strategy has the most impact on students’ motivation in learning and a proposed framework for virtual reality implementation at higher learning institutions.
Keywords: virtual reality, learning strategy, youth learning, motivation
Procedia PDF Downloads 389
2810 Using Serious Games to Integrate the Potential of Mass Customization into the Fuzzy Front-End of New Product Development
Authors: Michael N. O'Sullivan, Con Sheahan
Abstract:
Mass customization is the idea of offering custom products or services to satisfy the needs of each individual customer while maintaining the efficiency of mass production. Technologies like 3D printing and artificial intelligence have many start-ups hoping to capitalize on this dream of creating personalized products at an affordable price, and well established companies scrambling to innovate and maintain their market share. However, the majority of them are failing as they struggle to understand one key question – where does customization make sense? Customization and personalization only make sense where the value of the perceived benefit outweighs the cost to implement it. In other words, will people pay for it? Looking at the Kano Model makes it clear that it depends on the product. In products where customization is an inherent need, like prosthetics, mass customization technologies can be highly beneficial. However, for products that already sell as a standard, like headphones, offering customization is likely only an added bonus, and so the product development team must figure out if the customers’ perception of the added value of this feature will outweigh its premium price tag. This can be done through the use of a ‘serious game,’ whereby potential customers are given a limited budget to collaboratively buy and bid on potential features of the product before it is developed. If the group choose to buy customization over other features, then the product development team should implement it into their design. If not, the team should prioritize the features on which the customers have spent their budget. The level of customization purchased can also be translated to an appropriate production method, for example, the most expensive type of customization would likely be free-form design and could be achieved through digital fabrication, while a lower level could be achieved through short batch production. 
Twenty-five teams of final year students from design, engineering, construction and technology tested this methodology when bringing a product from concept through to production specification, and found that it allowed them to confidently decide what level of customization, if any, would be worth offering for their product, and what would be the best method of producing it. They also found that the discussion and negotiations between players during the game led to invaluable insights, and they often decided to play a second game where they offered customers the option to buy the various customization ideas that had been discussed during the first game.
Keywords: Kano model, mass customization, new product development, serious game
Procedia PDF Downloads 134
2809 Impact of the Non-Energy Sectors Diversification on the Energy Dependency Mitigation: Visualization by the “IntelSymb” Software Application
Authors: Ilaha Rzayeva, Emin Alasgarov, Orkhan Karim-Zada
Abstract:
This study considers the linkage between management and computer sciences in order to develop the software named “IntelSymb” as a demo application, using data analysis to show that the diversification of non-energy fields positively influences the energy dependency mitigation of countries. We then analyzed 18 years of development across five economic sectors in 13 countries, identifying which patterns mostly prevailed and which can be dominant in the near future. To make our analysis solid and plausible, as future work, we suggest developing a gateway or interface connected to all available online databases (WB, UN, OECD, U.S. EIA) for the analysis of countries by field. Sample data consist of energy (TPES and energy import indicators) and non-energy industry (Main Science and Technology Indicator, Internet user index, and Sales and Production indicators) statistics from 13 OECD countries over 18 years (1995-2012). Our results show that the diversification of non-energy industries can have a positive effect on the deceleration of energy sector dependency (energy consumption and import dependence on crude oil). These results can provide empirical and practical support for energy and non-energy industry diversification policies, such as the promotion of Information and Communication Technologies (ICTs), services and innovative technologies efficiency and management, in other OECD and non-OECD member states with similar energy utilization patterns and policies. Industries, including the ICT sector, generate around 4 percent of total GHG emissions, but this figure is much higher (around 14 percent) if indirect energy use is included. The ICT sector itself (excluding the broadcasting sector) contributes approximately 2 percent of global GHG emissions, at just under 1 gigatonne of carbon dioxide equivalent (GtCO2eq).
Thus, this can be a good example and lesson both for energy-dependent and energy-independent countries, mainly emerging oil-based economies, and a motivation for non-energy industry diversification in order to be ready for an energy crisis and able to face any economic crisis as well.
Keywords: energy policy, energy diversification, “IntelSymb” software, renewable energy
Procedia PDF Downloads 224
2808 Network Conditioning and Transfer Learning for Peripheral Nerve Segmentation in Ultrasound Images
Authors: Harold Mauricio Díaz-Vargas, Cristian Alfonso Jimenez-Castaño, David Augusto Cárdenas-Peña, Guillermo Alberto Ortiz-Gómez, Alvaro Angel Orozco-Gutierrez
Abstract:
Precise identification of nerves is a crucial task performed by anesthesiologists for effective Peripheral Nerve Blocking (PNB). Anesthesiologists now use ultrasound imaging equipment to guide the PNB and detect nervous structures. However, visual identification of nerves in ultrasound images is difficult, even for trained specialists, due to artifacts and low contrast. Recent advances in deep learning make neural networks a potential tool for accurate nerve segmentation systems, thus addressing the above issues from raw data. The widely used U-Net network yields pixel-by-pixel segmentation by encoding the input image and decoding the attained feature vector into a semantic image. This work proposes a conditioning approach and encoder pre-training to enhance the nerve segmentation of traditional U-Nets. Conditioning is achieved by one-hot encoding the kind of target nerve at the network input, while the pre-training considers five well-known deep networks for image classification. The proposed approach is tested on a collection of 619 US images, where the best C-UNet architecture yields an 81% Dice coefficient, outperforming the 74% of the best traditional U-Net. Results prove that pre-trained models with the conditional approach outperform their equivalent baselines by supporting the learning of new features and enriching the discriminant capability of the tested networks.
Keywords: nerve segmentation, U-Net, deep learning, ultrasound imaging, peripheral nerve blocking
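As a hedged sketch of two mechanics named in this abstract, the snippet below shows the Dice coefficient used as the segmentation metric and the one-hot nerve-type code fed to the network as conditioning input. The masks and sizes are purely illustrative, not the paper's data or architecture.

```python
def dice_coefficient(pred, truth):
    """Dice overlap between two binary masks given as flat 0/1 lists:
    2 * |pred ∩ truth| / (|pred| + |truth|)."""
    intersection = sum(p * t for p, t in zip(pred, truth))
    total = sum(pred) + sum(truth)
    return 1.0 if total == 0 else 2.0 * intersection / total

def one_hot(nerve_index, num_nerves):
    """One-hot code for the target nerve, appended to the network input
    as extra channels in the conditioning scheme described above."""
    return [1.0 if i == nerve_index else 0.0 for i in range(num_nerves)]

# Toy 6-pixel masks: 2 overlapping foreground pixels, 3 in each mask.
pred  = [1, 1, 0, 0, 1, 0]
truth = [1, 0, 0, 0, 1, 1]
score = dice_coefficient(pred, truth)  # 2*2 / (3+3) = 0.666...
```

A Dice of 1.0 means perfect overlap and 0.0 means none, which is why the 81% vs. 74% comparison in the abstract is a direct measure of segmentation quality.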
Procedia PDF Downloads 106
2807 On Privacy-Preserving Search in the Encrypted Domain
Authors: Chun-Shien Lu
Abstract:
Privacy-preserving query has recently received considerable attention in the signal processing and multimedia community. It is also a critical step in wireless sensor networks for the retrieval of sensitive data. The purposes of privacy-preserving query in the areas of signal processing and sensor networks are the same, but the similarities and differences of the adopted technologies have not been fully explored. In this paper, we first review the recently developed methods of privacy-preserving query, and then describe in a comprehensive manner what the two areas can learn from each other.
Keywords: encryption, privacy-preserving, search, security
Procedia PDF Downloads 256
2806 Written Narrative Texts as the Indicators of Communication Competence of Pupils and Students with Hearing Impairment in the Czech Language
Authors: Marie Komorna, Katerina Hadkova
Abstract:
One reason why hearing disabilities are considered less serious than other disabilities is the belief that deaf and hard of hearing persons can read and write without problems and can therefore fairly easily compensate for problems related to their limited ability to hear sound. In reality, however, this is not the case: especially as regards written Czech, deaf persons are often not able to communicate their message clearly to its recipients. The inability to communicate fully in written language is one of the most severe problems facing a number of deaf persons, and one which makes it difficult for them to function in a sound-based environment. Despite this fact, this issue has been given only a minimum of attention in the Czech Republic. That is why we decided to focus our research on this issue, specifically targeting the written communication of deaf pupils in primary and secondary schools. The paper summarizes the background and objectives of this research. The written work of deaf respondents was obtained in response to a narrative task based on a series of images depicting a continuous storyline. Based on an analysis of the obtained written work, we tried to describe the specifics of the narrative abilities of the deaf authors of these texts. We also analyzed other aspects and specific traits of texts written by deaf authors at the phonetic-phonological, lexical-semantic, morphological and syntactic, and pragmatic levels. Based on the results of the project, it will be possible to increase knowledge of the communication abilities of deaf persons in written Czech. The obtained data may be used in future research and for teaching purposes and/or education concepts for teaching Czech to deaf pupils.
Keywords: communication competence, deaf, narrative, written texts
Procedia PDF Downloads 338
2805 Kinematic Analysis of the Calf Raise Test Using a Mobile iOS Application: Validation of the Calf Raise Application
Authors: Ma. Roxanne Fernandez, Josie Athens, Balsalobre-Fernandez, Masayoshi Kubo, Kim Hébert-Losier
Abstract:
Objectives: The calf raise test (CRT) is used in rehabilitation and sports medicine to evaluate calf muscle function. For testing, individuals stand on one leg and go up on their toes and back down to volitional fatigue. The newly developed Calf Raise application (CRapp) for iOS uses computer-vision algorithms enabling objective measurement of CRT outcomes. We aimed to validate the CRapp by examining its concurrent validity and agreement levels against laboratory-based equipment and establishing its intra- and inter-rater reliability. Methods: CRT outcomes (i.e., repetitions, positive work, total height, peak height, fatigue index, and peak power) were assessed in thirteen healthy individuals (6 males, 7 females) on three occasions and on both legs using the CRapp, 3D motion capture, and force plate technologies simultaneously. Data were extracted from two markers: one placed immediately below the lateral malleolus and another on the heel. Concurrent validity and agreement measures were determined using intraclass correlation coefficients (ICC₃,ₖ), typical errors expressed as coefficients of variation (CV), and Bland-Altman methods to assess biases and precision. Reliability was assessed using ICC₃,₁ and CV values. Results: Validity of CRapp outcomes was good to excellent across measures for both markers (mean ICC ≥0.878), with precision plots showing good agreement and precision. CV ranged from 0% (repetitions) to 33.3% (fatigue index) and was, on average, better for the lateral malleolus marker. Additionally, inter- and intra-rater reliability were excellent (mean ICC ≥0.949, CV ≤5.6%). Conclusion: These results confirm the CRapp is valid and reliable within and between users for measuring CRT outcomes in healthy adults. The CRapp provides a tool to objectivise CRT outcomes in research and practice, aligning with recent advances in mobile technologies and their increased use in healthcare.
Keywords: calf raise test, mobile application, validity, reliability
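As an illustrative sketch (not the authors' exact computation), the typical error between two repeated trials can be expressed as a coefficient of variation, one of the agreement measures named in this abstract. The session values below are made up for demonstration.

```python
import math
import statistics

def typical_error_cv(trial1, trial2):
    """Typical error between paired repeated trials, expressed as a CV (%):
    TE = SD of the pairwise differences / sqrt(2); CV = 100 * TE / grand mean."""
    diffs = [a - b for a, b in zip(trial1, trial2)]
    te = statistics.stdev(diffs) / math.sqrt(2)
    grand_mean = statistics.mean(trial1 + trial2)
    return 100.0 * te / grand_mean

# Hypothetical CRT repetition counts from two sessions for four participants.
session1 = [25, 30, 28, 22]
session2 = [26, 29, 27, 23]
cv = typical_error_cv(session1, session2)  # roughly 3%, i.e. good agreement
```

A CV of a few percent, as here, would fall well inside the ≤5.6% reliability band the abstract reports; a 33.3% CV (the fatigue index) indicates a much noisier outcome.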
Procedia PDF Downloads 165
2804 From Industry 4.0 to Agriculture 4.0: A Framework to Manage Product Data in Agri-Food Supply Chain for Voluntary Traceability
Authors: Angelo Corallo, Maria Elena Latino, Marta Menegoli
Abstract:
The agri-food value chain involves various stakeholders with different roles. All of them abide by national and international rules and leverage marketing strategies to advance their products. Food products and the related processing phases carry with them a great mass of data that is often not used to inform the final customer. Some of these data, if fittingly identified and used, can enhance the single company and/or the whole supply chain, creating a match between marketing techniques and voluntary traceability strategies. Moreover, of late, the world has seen a modification of buying models: the customer is attentive to wellbeing and food quality. Food citizenship and food democracy were born, leveraging on transparency, sustainability and food information needs. Internet of Things (IoT) and Analytics, some of the innovative technologies of Industry 4.0, have a significant impact on the market and will act as a main thrust towards a genuine ‘4.0 change’ for agriculture. But realizing a traceability system is not simple because of the complexity of the agri-food supply chain, the many actors involved, different business models, environmental variations impacting products and/or processes, and extraordinary climate changes. In order to support companies on a traceability path, starting from business model analysis and the related business processes, a Framework to Manage Product Data in the Agri-Food Supply Chain for Voluntary Traceability was conceived. Studying each process task and leveraging modeling techniques leads to identifying the information held by different actors along the agri-food supply chain. IoT technologies for data collection and Analytics techniques for data processing supply information useful to increase intra-company efficiency and competitiveness in the market.
All the information recovered can be shown through IT solutions and a mobile application, making it accessible to the company, the entire supply chain and the consumer, with a view to guaranteeing transparency and quality.
Keywords: agriculture 4.0, agri-food supply chain, industry 4.0, voluntary traceability
Procedia PDF Downloads 147
2803 Exploring Digital Media’s Impact on Sports Sponsorship: A Global Perspective
Authors: Sylvia Chan-Olmsted, Lisa-Charlotte Wolter
Abstract:
With the continuous proliferation of media platforms, there have been tremendous changes in media consumption behaviors. From the perspective of sports sponsorship, while there is now a multitude of platforms on which to create brand associations, the changing media landscape and the shift of message control also mean that sports sponsors will have to take into account the nature of, and consumer responses toward, these emerging digital media to devise effective marketing strategies. Utilizing the personal interview methodology, this study is qualitative and exploratory in nature. A total of 18 experts from European and American academia, the sports marketing industry, and sports leagues/teams were interviewed to address three main research questions: 1) What are the major changes in digital technologies that are relevant to sports sponsorship; 2) How have digital media influenced the channels and platforms of sports sponsorship; and 3) How have these technologies affected the goals, strategies, and measurement of sports sponsorship. The study found that sports sponsorship has moved from consumer engagement, engagement measurement, and consequences of engagement on brand behaviors to one-on-one micro-targeting, engagement by context, time, and space, and activation and leveraging based on tracking and databases. From the perspective of platforms and channels, the use of mobile devices is prominent during sports content consumption. Increasing multiscreen media consumption means that sports sponsors need to optimize their investment decisions in leagues, teams, or game-related content sources, as they need to go where the fans are most engaged. The study observed an imbalanced strategic leveraging of technology and digital infrastructure.
While sports leagues have placed less emphasis on brand value management via technology, sports sponsors have been much more active in utilizing technologies like mobile/LBS tools, big data/user info, real-time and programmatic marketing, and social media activation. Regardless of the new media/platforms, the study found that integration and contextualization are the two essential means of improving sports sponsorship effectiveness through technology; that is, how sponsors effectively integrate social media/mobile/second screen into their existing legacy media sponsorship plan so that technology works for the experience/message instead of distracting fans. Additionally, technological advancement and the attention economy amplify the importance of consumer data gathering, but sports consumer data does not mean loyalty or engagement. This study also affirms the benefit of digital media, as they offer viral and pre-event activations through storytelling well before the actual event, which is critical for leveraging brand association before and after. That is, sponsors now have multiple opportunities and platforms to tell stories about their brands over a longer time period. In summary, digital media facilitate fan experience, access to the brand message, multiplatform/channel presentations, storytelling, and content sharing. Nevertheless, rather than focusing on technology and media, today’s sponsors need to define what they want to focus on in terms of content themes that connect with their brands and then identify the channels/platforms. The big challenge for sponsors is to play to each venue's/medium's specificity and its fit with the target audience, and not uniformly deliver the same message in the same format on different platforms/channels.
Keywords: digital media, mobile media, social media, technology, sports sponsorship
Procedia PDF Downloads 294
2802 A Linguistic Product of K-Pop: A Corpus-Based Study on the Korean-Originated Chinese Neologism Simida
Authors: Hui Shi
Abstract:
This article examines the online popularity of the Chinese neologism simida, a loanword derived from the Korean declarative sentence-final suffix seumnida. Facilitated by corpus data obtained from Weibo, the Chinese counterpart of Twitter, this study analyzes the morphological and syntactic processes behind simida’s coinage, as well as the causes of its prevalence on Chinese social media. The findings show that simida is used by Weibo bloggers in two manners: (1) as an alternative word for 'Korea' and 'Korean'; (2) as a redundant sentence-final particle which adds a Korean-like speech style to a statement. Additionally, Weibo user profile analysis further reveals demographic distribution patterns concerning this neologism and highlights young Weibo users in third-tier cities as the leading adopters of simida. These results are accounted for under the theoretical framework of social indexicality, especially how variations generate style in the indexical field. This article argues that the creation of such an ethnically-targeted neologism is a linguistic demonstration of Chinese netizens’ two-sided attitudes toward the previously heated Korean wave. The exotic suffix seumnida was borrowed into Chinese as simida due to its high frequency in Korean cultural exports. It has therefore gradually become a replacement for Korea-related lexical items due to markedness, regardless of semantic prosody. Its innovative implantation into Chinese syntax, on the other hand, reflects Chinese netizens’ active manipulation of language for their online identity building. This study has implications for research on the linguistic construction of identity and style and lays the groundwork for studying linguistic creativity in Chinese new media.
Keywords: Chinese neologism, loanword, humor, new media
Procedia PDF Downloads 174
2801 Analyzing Emerging Scientific Domains in Biomedical Discourse: Case Study Comparing Microbiome, Metabolome, and Metagenome Research in Scientific Articles
Authors: Kenneth D. Aiello, M. Simeone, Manfred Laubichler
Abstract:
It is increasingly difficult to analyze emerging scientific fields, as contemporary scientific fields are more dynamic, their boundaries are more porous, and the relational possibilities have increased due to Big Data and new information sources. In biomedicine, where funding, medical categories, and medical jurisdiction are determined by distinct boundaries on biomedical research fields and definitions of concepts, ambiguity persists between the microbiome, metabolome, and metagenome research fields. This ambiguity continues despite efforts by institutions and organizations to establish parameters on the core concepts and research discourses. Further, the explosive growth of microbiome, metabolome, and metagenomic research has led to unknown variation and covariation, making the application of findings across subfields, or coming to a consensus, difficult. This study explores the evolution and variation of knowledge within the microbiome, metabolome, and metagenome research fields related to ambiguous scholarly language and commensurable theoretical frameworks via a semantic analysis of key concepts and narratives. A computational historical framework of cultural evolution and large-scale publication data highlight the boundaries and overlaps between the competing scientific discourses surrounding the three research areas. The results of this study highlight how discourse and language distribute power within scholarly and scientific networks, specifically the power to set and define norms, central questions, methods, and knowledge.
Keywords: biomedicine, conceptual change, history of science, philosophy of science, science of science, sociolinguistics, sociology of knowledge
Procedia PDF Downloads 130
2800 A Feminist/Queer Global Bioethics’ Perspective on Reproduction: Abortion, MAR and Surrogacy
Authors: Tamara Roma, Emma Capulli
Abstract:
Pregnancy and fertility, in other words, reproduction, have become, in the last half-century, increasingly and globally controlled, medicalized, and regulated. The reflection proposed here starts from the consequences of the inscription of reproduction into the neoliberal economic paradigm. New developments in biotechnology have raised a new patriarchal justification for State control of bodies with uteruses and a new construction of knowledge about reproductive health. Moral discussion and juridification remove reproduction and non-reproduction from their personal and intimate context, frame them under words like “duties”, “rights”, “family planning”, “demography”, and “population policy”, reinvent them as “State business”, and ultimately help to re/confirm a specific construct of fertility, motherhood, and family. Moreover, the interaction between the neoliberal economy and medical biotechnologies has brought about a new formulation of the connection between feminine generative potential and value production. The widespread contemporary debates on Medically Assisted Reproduction (MAR), surrogacy and abortion suggest the need for a “feminist/queer global bioethical discourse” capable of inserting itself into the official bioethical debate characterized by the traditional dichotomy of laic bioethics/Catholic bioethics. The contribution moves from a feminist bioethics perspective on reproductive technologies to introduce a feminist/queer global bioethics point of view on reproductive health. The comparison between reproduction and non-reproduction debates is useful to analyze and demonstrate how restrictive legislation, dichotomic bioethical discussion and medical control confirm and strengthen gender injustice in reproductive life.
In fact, MAR, surrogacy, and abortion restrictions stem from a shared social and legal paradigm that depends on traditional gender roles, revealing how the stratification of reproduction is based on multiple discriminations along the lines of gender, race, and class. In conclusion, the perspective of feminist/queer global bioethics seeks to rethink the concept of universal reproductive justice, introducing an original point of view on access to reproductive health.
Keywords: queer bioethics, reproductive health, reproductive justice, reproductive technologies
Procedia PDF Downloads 124
2799 Applications of Big Data in Education
Authors: Faisal Kalota
Abstract:
Big Data and analytics have gained huge momentum in recent years. Big Data feeds into the field of Learning Analytics (LA), which may allow academic institutions to better understand learners’ needs and proactively address them. Hence, it is important to have an understanding of Big Data and its applications. The purpose of this descriptive paper is to provide an overview of Big Data, the technologies used in Big Data, and some of the applications of Big Data in education. Additionally, it discusses some of the concerns related to Big Data and current research trends. While Big Data can provide big benefits, it is important that institutions understand their own needs, infrastructure, resources, and limitations before jumping on the Big Data bandwagon.
Keywords: big data, learning analytics, analytics, big data in education, Hadoop
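The abstract names Hadoop among the Big Data technologies it surveys. As a minimal illustration of the MapReduce pattern Hadoop popularized — applied to a hypothetical learning-analytics task that is not from the paper — the map, shuffle, and reduce phases can be sketched in plain Python (no cluster required):

```python
from collections import defaultdict

# Hypothetical interaction logs: "student,resource" records.
def map_phase(log_lines):
    """Map: emit a (student, 1) pair for every interaction record."""
    for line in log_lines:
        student, _resource = line.split(",")
        yield student, 1

def shuffle(pairs):
    """Shuffle: group values by key, as Hadoop does between phases."""
    grouped = defaultdict(list)
    for key, value in pairs:
        grouped[key].append(value)
    return grouped

def reduce_phase(grouped):
    """Reduce: sum the counts for each student."""
    return {student: sum(counts) for student, counts in grouped.items()}

logs = ["alice,video1", "bob,quiz2", "alice,quiz2", "alice,forum"]
counts = reduce_phase(shuffle(map_phase(logs)))
print(counts)  # {'alice': 3, 'bob': 1}
```

In a real Hadoop deployment the map and reduce functions run distributed across nodes; the single-process sketch above only conveys the shape of the computation.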
Procedia PDF Downloads 426
2798 On the Existence of Homotopic Mapping Between Knowledge Graphs and Graph Embeddings
Authors: Jude K. Safo
Abstract:
Knowledge Graphs (KG) and their relation to Graph Embeddings (GE) represent a unique data structure in the landscape of machine learning (relative to image, text and acoustic data). Unlike the latter, GEs are the only data structure sufficient for representing the hierarchically dense, semantic information needed for use cases like supply chain data and protein folding, where the search space exceeds the limits of traditional search methods (e.g., PageRank, Dijkstra, etc.). While GEs are effective for compressing low-rank tensor data, at scale they begin to introduce a new problem of ‘data retrieval’, which we observe in Large Language Models. Notable attempts by TransE, TransR and other prominent industry standards have shown peak performance just north of 57% on the WN18 and FB15K benchmarks, insufficient for practical industry applications. They are also limited, in scope, to next node/link predictions. Traditional linear methods like Tucker, CP, PARAFAC and CANDECOMP quickly hit memory limits on tensors exceeding 6.4 million nodes. This paper outlines a topological framework for linear mapping between concepts in KG space and GE space that preserves cardinality. Most importantly, we introduce a traceable framework for composing dense linguistic structures, and we demonstrate the performance this model achieves on the WN18 benchmark. The model does not rely on Large Language Models (LLMs), though the applications are certainly relevant there as well.
Keywords: representation theory, large language models, graph embeddings, applied algebraic topology, applied knot theory, combinatorics
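The abstract benchmarks against TransE, whose translation-based scoring treats a relation as a vector offset between entity embeddings: a triple (h, r, t) is plausible when h + r ≈ t. A minimal sketch of that scoring rule (not the paper's proposed topological model), with tiny hand-set embeddings that are purely illustrative:

```python
import numpy as np

# Toy 2-D embeddings; real TransE embeddings are learned by
# gradient descent with a margin-ranking loss over corrupted triples.
entity = {
    "paris":  np.array([1.0, 0.0]),
    "france": np.array([1.0, 1.0]),
    "tokyo":  np.array([0.0, 0.0]),
}
relation = {"capital_of": np.array([0.0, 1.0])}

def score(h, r, t):
    """TransE plausibility: negative L2 distance ||h + r - t||.
    Higher (closer to zero) means a more plausible triple."""
    return -np.linalg.norm(entity[h] + relation[r] - entity[t])

# The true triple should outscore a corrupted one.
true_score = score("paris", "capital_of", "france")
corrupt_score = score("tokyo", "capital_of", "france")
print(true_score > corrupt_score)  # True
```

Link prediction on WN18/FB15K then ranks all candidate tails t by this score for a given (h, r), which is the evaluation behind the ~57% figure the abstract cites.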
Procedia PDF Downloads 68
2797 Rapid Processing Techniques Applied to Sintered Nickel Battery Technologies for Utility Scale Applications
Authors: J. D. Marinaccio, I. Mabbett, C. Glover, D. Worsley
Abstract:
Through the use of novel modern/rapid processing techniques such as screen printing and Near-Infrared (NIR) radiative curing, the process time for the sintering of sintered nickel plaques, applicable to alkaline nickel battery chemistries, has been drastically reduced from in excess of 200 minutes with conventional convection methods to below 2 minutes using NIR curing methods. Steps have also been taken to remove the need for forming gas as a reducing agent by implementing carbon as an in-situ reducing agent within the ink formulation.
Keywords: batteries, energy, iron, nickel, storage
Procedia PDF Downloads 439
2796 Digital Twin for University Campus: Workflow, Applications and Benefits
Authors: Frederico Fialho Teixeira, Islam Mashaly, Maryam Shafiei, Jurij Karlovsek
Abstract:
The ubiquity of data gathering and smart technologies, advancements in virtual technologies, and the development of the Internet of Things (IoT) have created urgent demands for frameworks and efficient workflows for data collection, visualisation, and analysis. Digital twins, at scales ranging from the city down to the individual building, allow for bringing together data from different sources to generate fundamental and illuminating insights for the management of current facilities and the lifecycle of amenities, as well as for improving the performance of current and future designs. Over the past two decades, there has been growing interest in digital twins and their applications at city and building scales. Most such studies look at the urban environment through a homogeneous or generalist lens and lack specificity in the particular characteristics or identities that define an urban university campus. Bridging this knowledge gap, this paper offers a framework for developing a digital twin for a university campus that, with some modifications, could provide insights for any large-scale digital twin setting, such as towns and cities. It showcases how currently unused data could be purposefully combined, interpolated and visualised to produce analysis-ready data (such as flood or energy simulations or functional and occupancy maps), highlighting the potential applications of such a framework for campus planning and policymaking. The research integrates campus-level data layers into one spatial information repository and casts light on critical data clusters for the digital twin at the campus level. The paper also seeks to raise insightful and directive questions on how a campus digital twin can be extrapolated to a city-scale digital twin.
The outcomes of the paper thus inform future projects for the development of large-scale digital twins, as well as urban and architectural researchers, on potential applications of digital twins in future design, management, and sustainable planning: predicting problems, calculating risks, decreasing management costs, and improving performance.
Keywords: digital twin, smart campus, framework, data collection, point cloud
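The abstract describes integrating campus-level data layers into one spatial information repository and deriving analysis-ready products such as occupancy maps. A hedged sketch of that merging step follows; the building IDs, layer names, and values are all hypothetical stand-ins, not data from the study:

```python
# Heterogeneous campus data layers, each keyed by a shared building ID.
layers = {
    "floor_area_m2": {"B01": 4200, "B02": 1800},
    "occupancy":     {"B01": 390,  "B02": 240},
    "energy_kwh":    {"B01": 5100, "B02": 2600},
}

def build_repository(layers):
    """Combine per-layer records into one record per building."""
    repo = {}
    for layer_name, records in layers.items():
        for building_id, value in records.items():
            repo.setdefault(building_id, {})[layer_name] = value
    return repo

def occupancy_density(repo):
    """Derive an analysis-ready layer: occupants per square metre,
    skipping buildings that lack either input layer."""
    return {b: r["occupancy"] / r["floor_area_m2"]
            for b, r in repo.items()
            if "occupancy" in r and "floor_area_m2" in r}

repo = build_repository(layers)
density = occupancy_density(repo)
```

A production digital twin would key records by geometry (e.g., point clouds or GIS footprints) rather than flat IDs, but the principle — one repository, many derived views — is the same.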
Procedia PDF Downloads 68
2795 Efficacy of Clickers in L2 Interaction
Authors: Ryoo Hye Jin Agnes
Abstract:
This study aims to investigate the efficacy of clickers in fostering L2 class interaction. In an L2 classroom, active learner-to-learner and learner-to-teacher interactions play an important role in language acquisition. In light of this, introducing learning tools that promote such interactions would benefit the L2 classroom. This is because the anonymity of clickers allows learners to express their needs without the social risks associated with speaking up in class. Clickers therefore efficiently help learners express their level of understanding during the process of learning itself. This allows for an evaluative feedback loop in which both learners and teachers understand the learners’ level of progress, better enabling classrooms to adapt to learners’ needs. Eventually, this tool promotes participation from learners, which in turn is believed to be effective in fostering classroom interaction, allowing learning to take place in a more comfortable yet vibrant way. The study concludes by presenting the results of an experiment conducted to verify the effectiveness of this approach when teaching the pragmatic aspects of Korean expressions with similar semantic functions. The learning achievement of learners in the experimental group was found to be higher than that of learners in the control group. A survey was distributed to the learners, questioning them regarding the efficacy of clickers and how they contributed to their learning in areas such as motivation, self-assessment, increasing participation, and giving feedback to teachers. Analyzing the data collected from the questionnaire, the study presents evidence suggesting that this approach increased the scope of interactivity in the classroom, not only increasing participation but also enriching the types of classroom participation among learners.
This participation in turn led to a marked improvement in their communicative abilities.
Keywords: second language acquisition, interaction, clickers, learner response system, output from learners, learner’s cognitive process
Procedia PDF Downloads 521
2794 Unravelling the Knot: Towards a Definition of ‘Digital Labor’
Authors: Marta D'Onofrio
Abstract:
The debate on the digitalization of the economy has raised questions about how both labor and the regulation of work processes are changing due to the introduction of digital technologies into the productive system. Within the literature, the term ‘digital labor’ is commonly used to identify the impact of digitalization on labor. Despite the wide use of this term, an unambiguous definition of it is still not available, and this can create confusion in the use of terminology and in attempts at classification. Consequently, the purpose of this paper is to provide a definition and propose a classification of ‘digital labor’, resorting to the theoretical approach of organizational studies.
Keywords: digital labor, digitalization, data-driven algorithms, big data, organizational studies
Procedia PDF Downloads 153
2793 An Overview of Posterior Fossa Associated Pathologies and Segmentation
Authors: Samuel J. Ahmad, Michael Zhu, Andrew J. Kobets
Abstract:
Segmentation tools continue to advance, evolving from manual methods to automated contouring technologies utilizing convolutional neural networks. These techniques have been used to evaluate ventricular and hemorrhagic volumes in the past but may be applied in novel ways to assess posterior fossa-associated pathologies such as Chiari malformations. Herein, we summarize literature pertaining to segmentation in the context of this and other posterior fossa-based diseases such as trigeminal neuralgia, hemifacial spasm, and posterior fossa syndrome. A literature search for volumetric analysis of the posterior fossa identified 27 papers in which semi-automated segmentation, automated segmentation, manual segmentation, linear measurement-based formulas, and the Cavalieri estimator were utilized. These studies produced superior data to older methods that relied on formulas for rough volumetric estimation. The most commonly used technique was semi-automated segmentation (12 studies). Manual segmentation was the second most common technique (7 studies). Automated segmentation techniques (4 studies) and the Cavalieri estimator (3 studies), a point-counting method that uses a grid of points to estimate the volume of a region, were the next most commonly used. The least commonly utilized technique was linear measurement-based formulas (1 study). Semi-automated segmentation produced accurate, reproducible results. However, no single semi-automated software package, open source or otherwise, has been widely applied to the posterior fossa. Fully automated segmentation via open-source software such as FSL and FreeSurfer produced highly accurate posterior fossa segmentations. Various forms of segmentation have been used to assess posterior fossa pathologies, and each has its advantages and disadvantages. According to our results, semi-automated segmentation is the predominant method.
However, atlas-based automated segmentation is an extremely promising method that produces accurate results. Future evolution of segmentation technologies will undoubtedly yield superior results, which may be applied to posterior fossa-related pathologies. Medical professionals will save time and effort analyzing large sets of data thanks to these advances.
Keywords: Chiari, posterior fossa, segmentation, volumetric
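The Cavalieri estimator mentioned above estimates a volume by overlaying a regular grid of points on each slice, counting the points that fall inside the region, and multiplying by the area each point represents and the slice spacing. A minimal sketch on a synthetic binary slice stack — the masks, grid step, and spacing below are hypothetical, not values from any of the reviewed studies:

```python
def cavalieri_volume(slices, grid_step_mm, slice_spacing_mm):
    """Cavalieri point-counting estimate:
    V ≈ (grid hits inside region) * (area per grid point) * (slice spacing)."""
    hits = 0
    for mask in slices:  # each mask: 2D list of 0/1 at 1 mm resolution
        for y in range(0, len(mask), grid_step_mm):
            for x in range(0, len(mask[0]), grid_step_mm):
                hits += mask[y][x]
    # Each grid point represents a grid_step x grid_step patch of the slice.
    return hits * (grid_step_mm ** 2) * slice_spacing_mm

# Synthetic stack: three 10 mm x 10 mm slices fully occupied by the region,
# sampled 5 mm apart. True volume = 10 * 10 * (3 * 5) = 1500 mm^3.
slice_mask = [[1] * 10 for _ in range(10)]
volume = cavalieri_volume([slice_mask] * 3, grid_step_mm=2, slice_spacing_mm=5)
print(volume)  # 1500 (25 hits/slice * 3 slices * 4 mm^2 * 5 mm)
```

On a fully occupied region the estimate is exact; for irregular anatomy the grid density trades counting effort against estimator variance.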
Procedia PDF Downloads 106
2792 Commercial Management vs. Quantity Surveying: Hoax or Harmonization
Authors: Zelda Jansen Van Rensburg
Abstract:
Purpose: This study investigates the perceived disparities between Quantity Surveying (QS) and Commercial Management (CM) in the construction industry, questioning whether these differences are substantive or merely semantic. It aims to challenge the conventional notion of Commercial Managers’ superiority by critically evaluating QS and CM roles, exploring CM integration possibilities, examining qualifications for aspiring Commercial Managers, assessing regulatory frameworks, and considering terminology redefinition for global QS professional enhancement. Design: Utilizing mixed methods such as literature reviews, surveys, interviews, and document analyses, this research examines the QS-CM relationship. Insights from industry professionals, academics, and regulatory bodies inform the investigation into changing QS roles. Findings: Empirical data highlight evolving roles, showcasing areas of convergence and divergence between QS and CM. Potential CM integration into QS practice and qualifications for aspiring Commercial Managers are identified. Limitations/Implications: Limitations include potential bias in self-reported data and findings. Nevertheless, the research informs future practices and educational approaches in QS and CM, reflecting the changing roles and responsibilities of Quantity Surveyors. Practical Implications: Findings inform industry practitioners, educators, and regulators, stressing the need to adapt to changing QS roles and to integrate CM principles where applicable. Value to the Conference Theme: Aligned with ‘Evolving roles and responsibilities of Quantity Surveyors’, this research offers insights crucial for understanding the changing dynamics within the QS profession and informs strategies to navigate these shifts effectively.
Keywords: quantity surveying, commercial management, cost engineering, quantity survey
Procedia PDF Downloads 40
2791 Bridging the Digital Divide in India: Issues and Challenges
Authors: Parveen Kumar
Abstract:
To cope with the rapid change of technology and to manage the ephemeral rate of information generation, librarians, along with their professional colleagues, need to equip themselves as per the requirements of the electronic information society. E-learning is based purely on computer and communication technologies and encompasses terminologies such as computer-based learning. It is the delivery of content via electronic media: the Internet, intranets, extranets, television broadcast, CD-ROM documents, etc. E-learning poses many issues in the transformation of literature and knowledge from conventional media to ICT-based formats and web-based services.
Keywords: e-learning, digital libraries, online learning, electronic information society
Procedia PDF Downloads 510
2790 Grounding Chinese Language Vocabulary Teaching and Assessment in the Working Memory Research
Authors: Chan Kwong Tung
Abstract:
Since Baddeley and Hitch’s seminal research in 1974 on working memory (WM), this topic has been of great interest to language educators. Although there are some variations in the definitions of WM, recent findings have contributed vastly to our understanding of language learning, especially its effects on second language acquisition (SLA). For example, the phonological component of WM (PWM) and the executive component of WM (EWM) have been found to be positively correlated with language learning. This paper discusses two general, yet highly relevant, WM findings that could directly affect the effectiveness of Chinese Language (CL) vocabulary teaching and learning, as well as the quality of its assessment. First, PWM is found to be critical for the long-term learning of the phonological forms of new words. Second, EWM is heavily involved in interpreting the semantic characteristics of new words, which consequently affects the quality of learners’ reading comprehension. These two ideas are hardly discussed in the Chinese literature, either conceptually or empirically. While past vocabulary acquisition studies have mainly focused on the cognitive-processing approach, active processing, ‘elaborate processing’ (or lexical elaboration) and other effective learning tasks and strategies, it is high time to shift some of the spotlight to WM (particularly PWM and EWM) to ensure optimal control of the teaching and learning effectiveness of such approaches, as well as the validity of the associated language assessment. Given the unique phonological, orthographical and morphological properties of the CL, this discussion will shed some light on vocabulary acquisition in this Sino-Tibetan language family member. Together, these two WM concepts could have crucial implications for the design, development, and planning of vocabulary and, ultimately, reading comprehension teaching and assessment in language education.
Hopefully, this will raise awareness and trigger a dialogue about the meaning of these findings for future language teaching, learning, and assessment.
Keywords: Chinese Language, working memory, vocabulary assessment, vocabulary teaching
Procedia PDF Downloads 344
2789 Mesoporous Na2Ti3O7 Nanotube-Constructed Materials with Hierarchical Architecture: Synthesis and Properties
Authors: Neumoin Anton Ivanovich, Opra Denis Pavlovich
Abstract:
Materials based on titanium oxide compounds are widely used in such areas as solar energy, photocatalysis, the food industry and hygiene products, biomedical technologies, etc. Demand for them has also formed in the battery industry (an example of this is the commercialization of Li4Ti5O12), where much attention has recently been paid to the development of next-generation systems and technologies, such as sodium-ion batteries. This dictates the need to search for new materials with improved characteristics, as well as ways to obtain them that meet the requirements of scalability. One way to solve these problems is the creation of nanomaterials, which often have a complex of physicochemical properties that radically differ from the characteristics of their counterparts in the micro- or macroscopic state. At the same time, it is important to control the texture (specific surface area, porosity) of such materials. In view of the above, among other methods, the hydrothermal technique seems suitable, allowing broad control over the conditions of synthesis. In the present study, a method was developed for the preparation of mesoporous nanostructured sodium trititanate (Na2Ti3O7) with a hierarchical architecture. The materials were synthesized by hydrothermal processing and exhibit a complex, hierarchically organized two-level architecture. At the first level of the hierarchy, the materials are represented by particles with a rough surface, and at the second level, by one-dimensional nanotubes. The products were found to have a high specific surface area and porosity with a narrow pore size distribution (about 6 nm). As is known, specific surface area and porosity are important characteristics of functional materials, which largely determine the possibilities and directions of their practical application. Electrochemical impedance spectroscopy data show that the resulting sodium trititanate has a sufficiently high electrical conductivity.
As expected, the synthesized complexly organized nanoarchitecture based on porous sodium trititanate may find practical demand, for example, in the field of next-generation electrochemical energy storage and conversion devices.
Keywords: sodium trititanate, hierarchical materials, mesoporosity, nanotubes, hydrothermal synthesis
Procedia PDF Downloads 107
2788 Developing an SOA-Based E-Healthcare System
Authors: Hend Albassam, Nouf Alrumaih
Abstract:
Nowadays, we are in the age of technology and communication, and there is no doubt that technologies such as the Internet can offer many advantages to many business fields, and the health field is no exception. In fact, using the Internet provides us with a new path to improve the quality of health care throughout the world. E-healthcare offers many advantages, such as efficiency, by reducing costs and avoiding duplicate diagnostics; empowerment of patients, by enabling them to access their medical records; enhancement of the quality of healthcare; and the enabling of information exchange and communication between healthcare organizations. Many problems result from using paper as a means of communication, for example, paper-based prescriptions. Usually, the doctor writes a prescription and gives it to the patient, who in turn carries it to the pharmacy. After that, the pharmacist fills the prescription and gives it to the patient. Sometimes the pharmacist might find it difficult to read the doctor’s handwriting, and the patient could alter or counterfeit the prescription. These existing problems, and many others, heighten the need to improve the quality of healthcare. This project sets out to develop a distributed e-healthcare system that offers some features of e-health and addresses some of the above-mentioned problems. The developed system provides an electronic health record (EHR) and enables communication between separate healthcare organizations such as the clinic, the pharmacy and the laboratory. To develop this system, Service-Oriented Architecture (SOA) is adopted as the design approach, which helps in designing several independent modules that communicate using web services. The layering design pattern is used in designing each module, as it provides reusability, allowing the business logic layer to be reused by different higher layers such as the web service or the website in our system.
The experimental analysis has shown that the project has successfully achieved its aims of solving the problems related to paper-based healthcare systems, and it enables different health organizations to communicate effectively. It implements four independent modules: healthcare provider, pharmacy, laboratory and medication information provider. Each module provides different functionalities and is used by a different type of user. These modules interoperate with each other using a set of web services.
Keywords: e-health, service-oriented architecture (SOA), web services, interoperability
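The abstract describes a layering pattern in which a business logic layer is reused unchanged by different higher layers (a web service or a website). A hedged sketch of that pattern applied to the prescription scenario follows; the class and method names are hypothetical illustrations, not the paper's actual implementation:

```python
class PrescriptionService:
    """Business logic layer: issues and fills prescriptions.
    Knows nothing about how it is exposed to callers."""
    def __init__(self):
        self._store = {}   # prescription_id -> record
        self._next_id = 1

    def issue(self, patient, drug, dosage):
        pid = self._next_id
        self._next_id += 1
        self._store[pid] = {"patient": patient, "drug": drug,
                            "dosage": dosage, "filled": False}
        return pid

    def fill(self, pid):
        self._store[pid]["filled"] = True

class PharmacyWebService:
    """Higher layer: exposes the same logic as web-service operations.
    A website layer could wrap the identical PrescriptionService."""
    def __init__(self, logic):
        self.logic = logic

    def handle_fill_request(self, pid):
        # A real service would also authenticate and validate here.
        self.logic.fill(pid)
        return {"status": "filled", "prescription_id": pid}

service = PrescriptionService()
pid = service.issue("patient-42", "amoxicillin", "500 mg")
response = PharmacyWebService(service).handle_fill_request(pid)
```

Because the logic layer carries no transport concerns, the same `PrescriptionService` can back SOAP/REST endpoints or a web front end, which is the reusability benefit the abstract attributes to the layering pattern.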
Procedia PDF Downloads 304
2787 Clean Coal Using Coal Bed Methane: A Pollution Control Mechanism
Authors: Arish Iqbal, Santosh Kumar Singh
Abstract:
Energy from coal is one of the major sources of energy throughout the world, but in consideration of its effect on the environment, 'Clean Coal Technologies' (CCT) came into existence. In this paper, we study why CCTs are essential and what the different types of CCTs are. The coal and CCT scenario in India is also introduced. Coal bed methane, one of the major CCT areas, is studied in detail. Different types of coal bed methane and methods for its extraction are discussed. The different problem areas in the extraction of CBM are identified and discussed, as is how CBM can be used as a fuel for the future.
Keywords: CBM (coal bed methane), CCS (carbon capture and storage), CCT (clean coal technology), CMM (coal mining methane)
Procedia PDF Downloads 240
2786 The U.S. Missile Defense Shield and Global Security Destabilization: An Inconclusive Link
Authors: Michael A. Unbehauen, Gregory D. Sloan, Alberto J. Squatrito
Abstract:
Missile proliferation and global stability are intrinsically linked. Missile threats continually appear at the forefront of global security issues. North Korea’s recently demonstrated nuclear and intercontinental ballistic missile (ICBM) capabilities, for the first time since the Cold War, renewed public interest in strategic missile defense capabilities. To protect from limited ICBM attacks from so-called rogue actors, the United States developed the Ground-based Midcourse Defense (GMD) system. This study examines if the GMD missile defense shield has contributed to a safer world or triggered a new arms race. Based upon increased missile-related developments and the lack of adherence to international missile treaties, it is generally perceived that the GMD system is a destabilizing factor for global security. By examining the current state of arms control treaties as well as existing missile arsenals and ongoing efforts in technologies to overcome U.S. missile defenses, this study seeks to analyze the contribution of GMD to global stability. A thorough investigation cannot ignore that, through the establishment of this limited capability, the U.S. violated longstanding, successful weapons treaties and caused concern among states that possess ICBMs. GMD capability contributes to the perception that ICBM arsenals could become ineffective, creating an imbalance in favor of the United States, leading to increased global instability and tension. While blame for the deterioration of global stability and non-adherence to arms control treaties is often placed on U.S. missile defense, the facts do not necessarily support this view. The notion of a renewed arms race due to GMD is supported neither by current missile arsenals nor by the inevitable development of new and enhanced missile technology, to include multiple independently targeted reentry vehicles (MIRVs), maneuverable reentry vehicles (MaRVs), and hypersonic glide vehicles (HGVs). 
The methodology in this study encapsulates a period of time, pre- and post-GMD introduction, while analyzing international treaty adherence, missile counts and types, and research into new missile technologies. The decline in international treaty adherence, coupled with a measurable increase in the number and types of missiles and in research into new missile technologies during the period after the introduction of GMD, could be perceived as a clear indicator of GMD contributing to global instability. However, research into improved technology (MIRVs, MaRVs and HGVs) prior to GMD, as well as the decline of various global missile inventories and testing of systems during this same period, would seem to invalidate this theory. U.S. adversaries have exploited the perception of the U.S. missile defense shield as a destabilizing factor as a pretext to strengthen and modernize their militaries and justify their policies. As a result, it can be concluded that global stability has not significantly decreased due to GMD; rather, the natural progression of technological and missile development would inherently include innovative and dynamic approaches to target engagement, deterrence, and national defense.
Keywords: arms control, arms race, global security, GMD, ICBM, missile defense, proliferation
Procedia PDF Downloads 143