Search results for: privacy enhancing technologies
1293 The Effect of Alternative Organic Fertilizer and Chemical Fertilizer on Nitrogen and Yield of Peppermint (Mentha piperita)
Authors: Seyed Ali Mohammad Modarres Sanavy, Hamed Keshavarz, Ali Mokhtassi-Bidgoli
Abstract:
One of the biggest challenges for current and future generations is to produce sufficient food for the world population with the limited water resources available. Peppermint is a specialty crop used for food and medicinal purposes. Its main component is menthol, and it is used predominantly in oral hygiene products, pharmaceuticals, and foods. Although drought stress is considered a negative factor in agriculture, responsible for severe yield losses, medicinal plants grown under semi-arid conditions usually produce higher concentrations of active substances than the same species grown in moderate climates. Nitrogen (N) fertilizer management is central to the profitability and sustainability of forage crop production: sub-optimal N supply results in poor yields, while excess N application can lead to nitrate leaching and environmental pollution. To determine the response of peppermint to drought stress and different fertilizer treatments, a field experiment was conducted in a sandy loam soil at a site of the Agriculture Faculty, Tarbiat Modares University, Tehran, Iran. The experiment used a randomized complete block design with six fertilizer strategies (F1: control, F2: urea, F3: 75% urea + 25% vermicompost, F4: 50% urea + 50% vermicompost, F5: 25% urea + 75% vermicompost, and F6: vermicompost) and three irrigation regimes (S1: 45%, S2: 60%, and S3: 75% of field capacity), with three replications. Traits such as nitrogen, chlorophyll, carotenoid, anthocyanin, and flavonoid content and fresh biomass were studied. The results showed that the treatments had a significant effect on the studied traits: drought stress reduced photosynthetic pigment concentration and the fresh yield of peppermint. The non-stressed condition produced greater chlorophyll content and fresh yield than the other irrigation treatments. The highest chlorophyll concentration and fresh biomass were obtained with the F2 fertilizer treatment.
Severe water stress (S1) decreased both the photosynthetic pigment content and the fresh yield of peppermint. Supply of N could improve photosynthetic capacity by enhancing photosynthetic pigment content. Application of vermicompost significantly improved the organic carbon and available N, P, and K content of the soil over urea fertilization alone. For sustainable production of peppermint, application of vermicompost along with N supplied through synthetic fertilizer is recommended for light-textured sandy loam soils.
Keywords: fresh yield, peppermint, synthetic nitrogen, vermicompost, water stress
Procedia PDF Downloads 217
1292 IoT-Based Interactive Patient Identification and Safety Management System
Authors: Jonghoon Chun, Insung Kim, Jonghyun Lim, Gun Ro
Abstract:
We believe that it is possible to reduce patient safety incidents by displaying the correct medical records and prescription information through interactive patient identification. Our system is based on smart bands worn by patients, which communicate with hybrid gateways that understand both BLE and Wi-Fi communication protocols. Through the convergence of Bluetooth Low Energy (BLE), one of the short-range wireless communication technologies, and hybrid gateway technology, we implement an 'Intelligent Patient Identification and Location Tracking System' to prevent the medical errors that frequently occur in medical institutions. Based on big data and IoT technology using MongoDB, smart bands (with BLE and NFC functions), and hybrid gateways, we develop a system that enables two-way communication between medical staff and hospitalized patients and stores the patients' location information at minute-level granularity. Based on the precise information provided by the big data system, such as the location and movement of in-hospital patients wearing smart bands, our findings include the fact that a patient-specific location tracking algorithm can operate the HIS (Hospital Information System) and other related systems more efficiently. Through the system, patients can always be correctly identified using their identification tags. In addition, the system automatically determines whether the patient is scheduled for a medical service in the system used by the medical institution, and presents the appropriateness of the treatment and the medical information (medical record and prescription information) on screen and by voice. This work was supported in part by the Korea Technology and Information Promotion Agency for SMEs (TIPA) grant funded by the Korean Small and Medium Business Administration (No. S2410390).
Keywords: BLE, hybrid gateway, patient identification, IoT, safety management, smart band
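The identification-and-schedule check described above can be sketched in a few lines. This is only an illustration of the logic, not the authors' implementation: the collection contents, band IDs, and field names below are all invented, and real deployments would query MongoDB rather than in-memory dictionaries.

```python
from datetime import datetime

# Hypothetical in-memory stand-ins for the MongoDB collections the abstract
# mentions; all IDs and field names are invented for illustration.
PATIENTS = {
    "band-0017": {"name": "Hong, G.", "record": "MR-5521", "prescription": "Rx-88"},
}
SCHEDULE = {
    ("band-0017", "infusion"): datetime(2024, 5, 2, 10, 30),
}

def identify_and_check(band_id, service):
    """Resolve a smart-band ID to a patient record and confirm that the
    requested service is actually scheduled for that patient."""
    patient = PATIENTS.get(band_id)
    if patient is None:
        return {"ok": False, "reason": "unknown band"}
    slot = SCHEDULE.get((band_id, service))
    if slot is None:
        return {"ok": False, "reason": "service not scheduled", "patient": patient}
    # A display layer would now show the record and prescription on screen
    # and announce the result by voice.
    return {"ok": True, "patient": patient, "scheduled_for": slot}
```

The same lookup shape works whether the band is read over BLE or NFC; only the transport differs.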
Procedia PDF Downloads 311
1291 Advances in Genome Editing and Future Prospects for Sorghum Improvement: A Review
Authors: Micheale Yifter Weldemichael, Hailay Mehari Gebremedhn, Teklehaimanot Hailesslasie Teklu
Abstract:
Recent developments in targeted genome editing accelerated genetic research and opened new potentials to improve crops for better yields and quality. Given the significance of cereal crops as a primary source of food for the global population, the utilization of contemporary genome editing techniques like CRISPR/Cas9 is timely and crucial. CRISPR/Cas technology has enabled targeted genomic modifications, revolutionizing genetic research and exploration. Application of gene editing through CRISPR/Cas9 in enhancing sorghum is particularly vital given the current ecological, environmental, and agricultural challenges exacerbated by climate change. As sorghum is one of the main staple foods of our region and is known to be a resilient crop with a high potential to overcome the above challenges, the application of genome editing technology will enhance the investigation of gene functionality. CRISPR/Cas9 enables the improvement of desirable sorghum traits, including nutritional value, yield, resistance to pests and diseases, and tolerance to various abiotic stresses. Furthermore, CRISPR/Cas9 has the potential to perform intricate editing and reshape the existing elite sorghum varieties, and introduce new genetic variations. However, current research primarily focuses on improving the efficacy of the CRISPR/Cas9 system in successfully editing endogenous sorghum genes, making it a feasible and successful undertaking in sorghum improvement. Recent advancements and developments in CRISPR/Cas9 techniques have further empowered researchers to modify additional genes in sorghum with greater efficiency. Successful application and advancement of CRISPR techniques in sorghum will aid not only in gene discovery and the creation of novel traits that regulate gene expression and functional genomics but also in facilitating site-specific integration events. 
The purpose of this review is, therefore, to elucidate the current advances in sorghum genome editing and highlight its potential in addressing food security issues. It also assesses the efficiency of CRISPR-mediated improvement and its long-term effects on crop improvement and host resistance against parasites, including tissue-specific activity and the ability to induce resistance. This review ends by emphasizing the challenges and opportunities of CRISPR technology in combating parasitic plants and proposing directions for future research to safeguard global agricultural productivity.
Keywords: CRISPR/Cas9, genome editing, quality, sorghum, stress, yield
Procedia PDF Downloads 381290 Smart Contracts: Bridging the Divide Between Code and Law
Authors: Abeeb Abiodun Bakare
Abstract:
The advent of blockchain technology has birthed a revolutionary innovation: smart contracts. These self-executing contracts, encoded within the immutable ledger of a blockchain, hold the potential to transform the landscape of traditional contractual agreements. This research paper embarks on a comprehensive exploration of the legal implications surrounding smart contracts, delving into their enforceability and their profound impact on traditional contract law. The first section of this paper delves into the foundational principles of smart contracts, elucidating their underlying mechanisms and technological intricacies. By harnessing the power of blockchain technology, smart contracts automate the execution of contractual terms, eliminating the need for intermediaries and enhancing efficiency in commercial transactions. However, this technological marvel raises fundamental questions regarding legal enforceability and compliance with traditional legal frameworks. Moving beyond the realm of technology, the paper proceeds to analyze the legal validity of smart contracts within the context of traditional contract law. Drawing upon established legal principles, such as offer, acceptance, and consideration, we examine the extent to which smart contracts satisfy the requirements for forming a legally binding agreement. Furthermore, we explore the challenges posed by jurisdictional issues as smart contracts transcend physical boundaries and operate within a decentralized network. Central to this analysis is the examination of the role of arbitration and dispute resolution mechanisms in the context of smart contracts. While smart contracts offer unparalleled efficiency and transparency in executing contractual terms, disputes inevitably arise, necessitating mechanisms for resolution. 
We investigate the feasibility of integrating arbitration clauses within smart contracts, exploring the potential for decentralized arbitration platforms to streamline dispute resolution processes. Moreover, this paper explores the implications of smart contracts for traditional legal intermediaries, such as lawyers and judges. As smart contracts automate the execution of contractual terms, the role of legal professionals in contract drafting and interpretation may undergo significant transformation. We assess the implications of this paradigm shift for legal practice and the broader legal profession. In conclusion, this research paper provides a comprehensive analysis of the legal implications surrounding smart contracts, illuminating the intricate interplay between code and law. While smart contracts offer unprecedented efficiency and transparency in commercial transactions, their legal validity remains subject to scrutiny within traditional legal frameworks. By navigating the complex landscape of smart contract law, we aim to provide insights into the transformative potential of this groundbreaking technology.
Keywords: smart-contracts, law, blockchain, legal, technology
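The "self-executing" character discussed above can be made concrete with a toy escrow written as a state machine. Real smart contracts would be written in an on-chain language such as Solidity; this Python sketch only illustrates how offer, consideration, and automatic settlement can be encoded as code, with all parties and amounts invented for the example.

```python
# A minimal, language-agnostic sketch of self-executing contractual terms.
# Funds are "locked" in the contract and released by code, not by an
# intermediary; every name and amount here is illustrative only.
class EscrowContract:
    def __init__(self, buyer, seller, price):
        self.buyer, self.seller, self.price = buyer, seller, price
        self.state = "AWAITING_PAYMENT"
        self.balances = {buyer: 0, seller: 0}

    def deposit(self, party, amount):
        # 'Consideration': the buyer locks the agreed price in the contract.
        if party != self.buyer or amount != self.price or self.state != "AWAITING_PAYMENT":
            raise ValueError("deposit rejected")
        self.state = "AWAITING_DELIVERY"

    def confirm_delivery(self, party):
        # Once the condition is met, the terms execute automatically:
        # the contract itself credits the seller.
        if party != self.buyer or self.state != "AWAITING_DELIVERY":
            raise ValueError("confirmation rejected")
        self.balances[self.seller] += self.price
        self.state = "COMPLETE"
```

Note how the questions the paper raises map onto this sketch: the code enforces the terms, but whether `deposit` constitutes legally valid consideration is exactly what traditional contract law must decide.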
Procedia PDF Downloads 45
1289 Smart Technology Work Practices to Minimize Job Pressure
Authors: Babar Rasheed
Abstract:
Organizations are continuously striving to increase their yield and to retain their employees. Technology is considered an integral part of attaining appropriate work practices, a good work environment, and employee engagement. At the same time, advanced practices such as working from home and personalized intra-networks are quietly disturbing employees' work-life balance, which ultimately increases the psychological pressure on them. Smart work practice means developing business models and organizational practices with enhanced employee engagement and minimal waste of organizational resources, alongside persistent revenue and a positive contribution to global society. The need for smart work practices arises from increasing employee turnover rates, global economic recession, unnecessary job pressure, a growing contingent workforce, and advances in technology. Current practices are not elastic enough to cope with the globally changing work environment and organizational competition, and they mechanically cause many reciprocal problems between employees and organizations. There is a growing understanding across business sectors that smart work practices can address the challenges of the new century while attending to the relevant concerns. This paper aims to propose customized, smart work practice tools, along with a knowledge framework, to manage the growing concerns around employee engagement, the use of technology, and organizational challenges for the business. This includes a smart management information system that addresses the essential concerns of employees, combined with a framework to extract the best possible ways to allocate company resources and realign only the required efforts, in order to adopt the best possible strategy for controlling potential risks.
Keywords: employees engagement, management information system, psychological pressure, current and future HR practices
Procedia PDF Downloads 184
1288 Unraveling Language Contact through Syntactic Dynamics of ‘Also’ in Hong Kong and Britain English
Authors: Xu Zhang
Abstract:
This article unveils an indicator of language contact between English and Cantonese in one of the Outer Circle Englishes, Hong Kong (HK) English, through an empirical investigation of 1000 tokens from the Global Web-based English (GloWbE) corpus, employing frequency analysis and logistic regression analysis. Cantonese, and Chinese more generally, is often characterized as context-oriented: Chinese speakers rely on semantic context more than on syntactic rules and lexical forms. This linguistic trait carries over to their use of English, affording greater flexibility to formal elements in constructing English sentences. The study focuses on the syntactic positioning of the focusing subjunct ‘also’, a linguistic element used to add new or contrasting prominence to specific sentence constituents. English generally allows flexibility in the relative position of ‘also’, though with a preference for close marking relationships. This article shifts attention to Hong Kong, where Cantonese and English converge and ‘also’ finds counterparts in Cantonese ‘jaa’ and Mandarin ‘ye’. Employing a corpus-based, data-driven method, we investigate the syntactic position of ‘also’ in both HK and GB English. The study aims to ascertain whether HK English exhibits greater ‘syntactic freedom’, allowing a more distant marking relationship for ‘also’ than GB English. The analysis involves a random extraction of 500 samples each of HK and GB English from the GloWbE corpus, forming a dataset (N=1000). Exclusions are made for cases where ‘also’ functions as an additive conjunct or serves as a copulative adverb, as well as for sentences lacking sufficient indication that ‘also’ functions as a focusing particle. The final dataset comprises 820 tokens, 416 for GB and 404 for HK, annotated according to the focused constituent and the relative position of ‘also’.
Frequency analysis reveals significant differences in the relative position of ‘also’ and in marking relationships between HK and GB English. Regression analysis indicates a preference in HK English for a distant marking relationship between ‘also’ and its focused constituent. Notably, the subject and other constituents emerge as significant predictors of a distant position for ‘also’. Together, these findings underscore the nuanced linguistic dynamics of HK English and contribute to our understanding of language contact. They suggest that future pedagogical practice should take into account syntactic variation within English varieties, facilitating learners’ effective communication in diverse English-speaking environments and enhancing their intercultural communication competence.
Keywords: also, Cantonese, English, focus marker, frequency analysis, language contact, logistic regression analysis
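The regression step above can be sketched as a plain logistic regression predicting a distant vs. close position of ‘also’ from the variety (HK or GB) and the focused constituent. The tokens below are fabricated stand-ins for the annotated GloWbE data, and the tiny gradient-descent fitter is only illustrative (the study would use a standard statistics package):

```python
import math

# Toy stand-in for the annotated corpus tokens (labels invented):
# each row is (is_hk, subject_focused, distant_position).
tokens = [
    (1, 1, 1), (1, 1, 1), (1, 0, 1), (1, 0, 0),
    (0, 1, 0), (0, 1, 0), (0, 0, 0), (0, 0, 1),
] * 50  # replicate to give the optimizer a stable signal

def fit_logistic(data, lr=0.1, epochs=2000):
    """Gradient-descent logistic regression:
    P(distant) = sigmoid(b0 + b1*is_hk + b2*subject_focused)."""
    b = [0.0, 0.0, 0.0]
    for _ in range(epochs):
        grad = [0.0, 0.0, 0.0]
        for hk, subj, y in data:
            x = (1.0, float(hk), float(subj))
            p = 1.0 / (1.0 + math.exp(-sum(bi * xi for bi, xi in zip(b, x))))
            for i in range(3):
                grad[i] += (p - y) * x[i]
        for i in range(3):
            b[i] -= lr * grad[i] / len(data)
    return b

b0, b_hk, b_subj = fit_logistic(tokens)
```

With this toy data, the fitted coefficient on the HK indicator comes out positive, mirroring the paper's finding that HK English favors the distant position.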
Procedia PDF Downloads 55
1287 Non-Invasive Data Extraction from Machine Display Units Using Video Analytics
Authors: Ravneet Kaur, Joydeep Acharya, Sudhanshu Gaur
Abstract:
Artificial Intelligence (AI) has the potential to transform manufacturing by improving shop floor processes such as production, maintenance, and quality. However, industrial datasets are notoriously difficult to extract in a real-time, streaming fashion, which negates many potential AI benefits. A prominent example is specialized industrial controllers operated by custom software, which complicates the process of connecting them to an Information Technology (IT) based data acquisition network. Security concerns may also limit direct physical access to these controllers for data acquisition. To connect the Operational Technology (OT) data stored in these controllers to an AI application in a secure, reliable, and available way, we propose a novel Industrial IoT (IIoT) solution in this paper. In this solution, we demonstrate how video cameras can be installed on a factory shop floor to continuously capture images of the controller HMIs. We propose image pre-processing to segment the HMI into regions of streaming data and regions of fixed meta-data. We then evaluate the performance of multiple Optical Character Recognition (OCR) technologies, such as Tesseract and Google Vision, in recognizing the streaming data, and test them on typical factory HMIs under realistic lighting conditions. Finally, we use the meta-data to match the OCR output against the temporal, domain-dependent context of the data to improve the accuracy of the output. Our IIoT solution enables reliable and efficient data extraction, which will improve the performance of subsequent AI applications.
Keywords: human machine interface, industrial internet of things, internet of things, optical character recognition, video analytics
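The final correction step (matching OCR output against domain context) can be illustrated with a small post-processor: it maps common OCR character confusions to digits, then sanity-checks the value against the field's known range and falls back to the previous reading when the result is implausible. The confusion map, range, and fallback rule are illustrative assumptions, not the paper's actual algorithm.

```python
# Common OCR digit confusions; the mapping is an illustrative assumption.
CONFUSIONS = str.maketrans({"O": "0", "o": "0", "l": "1", "I": "1", "S": "5", "B": "8"})

def correct_numeric_field(raw, lo, hi, last_value=None):
    """Clean a raw OCR reading of a numeric HMI field using fixed meta-data
    (the field's plausible range) and temporal context (the last reading)."""
    cleaned = raw.translate(CONFUSIONS).replace(" ", "")
    try:
        value = float(cleaned)
    except ValueError:
        return last_value       # unreadable: keep the previous sample
    if not (lo <= value <= hi):
        return last_value       # implausible: likely an OCR glitch
    return value
```

In practice the allowed range per field would come from the meta-data regions segmented out of the HMI image.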
Procedia PDF Downloads 109
1286 Analysis of Organizational Hybrid Agile Methods Environments: Frameworks, Benefits, and Challenges
Authors: Majid Alsubaie, Hamed Sarbazhosseini
Abstract:
Many working environments have experienced increased uncertainty due to a fast-moving and unpredictable world. IT systems development projects, in particular, face several challenges because of their rapidly changing environments and emerging technologies. Information technology organizations in these contexts adapt their systems development methodology and adopt new software approaches to address this issue. One such methodology is the Agile method, which has gained huge attention in recent years. However, given the failure rates of IT projects, there is an increasing demand among organizations for hybrid Agile methods. The scarcity of research in the area means that organizations do not have solid, evidence-based knowledge about the use of hybrid Agile. This research was designed to provide further insight into the development of hybrid Agile methods within systems development projects, including how frameworks and processes are used and what benefits and challenges are gained and faced as a result. This paper presents how three organizations (two government and one private) use hybrid Agile methods in their Agile environments. The data was collected through interviews and a review of relevant documents. The results indicate that these organizations do not predominantly use pure Agile. Instead, they are waterfall organizations by virtue of the nature and complexity of their systems, with Agile used underneath as the delivery model. The PRINCE2 Agile framework, SAFe, Scrum, and Kanban were the models and frameworks identified. This study also found that customer satisfaction and the ability to build quickly are the most frequently perceived benefits of using hybrid Agile methods. In addition, team resistance and scope changes are the common challenges identified by research participants in their working environments.
The findings can help in understanding Agile environmental conditions and projects, which can contribute to better success rates and customer satisfaction.
Keywords: agile, hybrid, IT systems, management, success rate, technology
Procedia PDF Downloads 108
1285 Enhancing Large Language Models' Data Analysis Capability with Planning-and-Execution and Code Generation Agents: A Use Case for Southeast Asia Real Estate Market Analytics
Authors: Kien Vu, Jien Min Soh, Mohamed Jahangir Abubacker, Piyawut Pattamanon, Soojin Lee, Suvro Banerjee
Abstract:
Recent advances in Generative Artificial Intelligence (GenAI), and in particular Large Language Models (LLMs), have shown promise to disrupt multiple industries at scale. However, LLMs also present unique challenges, notably so-called "hallucination": the generation of outputs that are not grounded in the input data, which hinders adoption into production. A common practice to mitigate the hallucination problem is to use a Retrieval Augmented Generation (RAG) system to ground LLMs' responses in ground truth. RAG converts the grounding documents into embeddings, retrieves the relevant parts using vector similarity between the user's query and the documents, and then generates a response based not only on the model's pre-trained knowledge but also on the specific information in the retrieved documents. However, RAG is not well suited to tabular data and subsequent data analysis tasks, for reasons including information loss, data format, and the retrieval mechanism. In this study, we explore a novel methodology that combines planning-and-execution and code generation agents to enhance LLMs' data analysis capabilities. The approach enables LLMs to autonomously dissect a complex analytical task into simpler sub-tasks and requirements, then convert them into executable segments of code. In the final step, it generates the complete response from the output of the executed code. When a beta version was deployed on DataSense, the property insight tool of PropertyGuru, the approach yielded promising results: it was able to serve market insight and data visualization needs with high accuracy and extensive coverage, abstracting away the complexities for real estate agents and developers from non-programming backgrounds. In essence, the methodology not only refines the analytical process but also serves as a strategic tool for real estate professionals, aiding market understanding without the need for programming skills.
The implication extends beyond immediate analytics, paving the way for a new era in the real estate industry characterized by efficiency and advanced data utilization.
Keywords: large language model, reasoning, planning and execution, code generation, natural language processing, prompt engineering, data analysis, real estate, data sense, PropertyGuru
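The plan-then-execute loop described above can be sketched as follows. Here the "LLM" is a canned lookup table standing in for real planner and code-generation model calls, and the plan steps, code snippets, and listing prices are all invented for illustration; only the control flow (decompose, generate code per step, execute in a shared namespace, return the final result) reflects the described approach.

```python
def plan(task):
    # A real planner would prompt an LLM to decompose the analytical task.
    return ["load the listings", "compute the median price"]

def generate_code(step):
    # A real code-generation agent would prompt an LLM per sub-task.
    snippets = {
        "load the listings": "result = [350_000, 420_000, 515_000, 390_000]",
        "compute the median price": (
            "import statistics\nresult = statistics.median(result)"
        ),
    }
    return snippets[step]

def run(task):
    """Execute each generated segment in a shared namespace so later steps
    can consume earlier outputs, then return the final answer."""
    namespace = {}
    for step in plan(task):
        exec(generate_code(step), namespace)
    return namespace["result"]
```

Executing generated code in a shared namespace is what lets the pipeline handle tabular data directly, sidestepping the information loss a RAG retriever would introduce.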
Procedia PDF Downloads 87
1284 The Application of Participatory Social Media in Collaborative Planning: A Systematic Review
Authors: Yujie Chen, Zhen Li
Abstract:
In the context of planning transformation, how to promote public participation in the formulation and implementation of collaborative planning has been a focal issue of discussion. However, existing studies have often been case-specific or focused on a particular design field, leaving the role of participatory social media (PSM) in urban collaborative planning generally in question. A systematic database search was conducted in December 2019. Articles and projects were eligible if they reported a quantitative empirical study applying participatory social media in the collaborative planning process (a prospective, retrospective, experimental, or longitudinal study, or collective actions in planning practice). Twenty studies and seven projects were included in the review. The findings show that social media are generally applied in the fields of public spatial behavior, transportation behavior, and community planning, with new technologies and new datasets. PSM has provided a new platform for participatory design, decision analysis, and collaborative negotiation, and is most widely used in participatory design. The findings identify several existing forms of PSM, which mainly play three roles: a language of decision-making for communication, a study mode for spatial evaluation, and a decision agenda for interactive decision support. Three areas for optimizing PSM were recognized: improving the scale of participation, improving grass-roots organization, and the promotion of politics. Fundamentally, however, participants can only provide information and comments through PSM in the collaborative planning process, so the issues of low data response rates, poor spatial data quality, and participation sustainability deserve more attention and solutions.
Keywords: participatory social media, collaborative planning, planning workshop, application mode
Procedia PDF Downloads 133
1283 The Role of Heat Pumps in the Decarbonization of European Regions
Authors: Domenico M. Mongelli, Michele De Carli, Laura Carnieletto, Filippo Busato
Abstract:
Europe's dependence on imported fossil fuels has been thrown into sharp relief by the Russian invasion of Ukraine. Limiting this dependency through a large-scale replacement of fossil fuel boilers with heat pumps for building heating is the goal of this work. Therefore, with the aim of diversifying energy sources and evaluating the potential use of heat pump technologies for residential buildings with a view to decarbonization, the achievable reduction in fossil fuel consumption through the use of heat pumps was investigated for all regions of Europe. First, a general overview of building energy consumption in Europe was assessed, broken down by end use (heating, cooling, domestic hot water, etc.) and by energy source (natural gas, oil, biomass, etc.). The analysis provides a baseline at the European level for current and future consumption, with particular attention to the future increase in cooling. A database was therefore created of the distribution of residential energy consumption for air conditioning among the various energy carriers (electricity, waste heat, gas, solid fossil fuels, liquid fossil fuels, and renewable sources) for each region of Europe. Subsequently, the energy profiles of various European cities representative of the different climates were analyzed in order to evaluate, for each European climatic region, the energy coverage that heat pumps can provide in place of natural gas and solid and liquid fossil fuels for building air conditioning; the environmental and economic assessments of this energy transition were also carried out. This work aims to make an innovative contribution to evaluating the potential for introducing heat pump technology for decarbonization in the air conditioning of buildings in all climates of the different European regions.
Keywords: heat pumps, heating, decarbonization, energy policies
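The core substitution being evaluated can be sketched as a back-of-the-envelope calculation: replacing a gas boiler with a heat pump trades fuel input for electricity input, with the balance set by boiler efficiency, the heat pump's coefficient of performance (COP, which varies with climate), and the grid's emission factor. All the default numbers below are illustrative assumptions, not figures from the study.

```python
def boiler_to_heat_pump(heat_demand_kwh, boiler_eff=0.90, cop=3.0,
                        gas_ef=0.20, grid_ef=0.25):
    """Return (gas avoided in kWh, electricity added in kWh, net CO2 change
    in kg) when a building's annual heat demand moves from a gas boiler to a
    heat pump. Emission factors are kg CO2 per kWh of fuel or electricity;
    a negative net CO2 change means emissions fall."""
    gas_in = heat_demand_kwh / boiler_eff          # fuel the boiler burned
    elec_in = heat_demand_kwh / cop                # electricity the pump needs
    net_co2 = elec_in * grid_ef - gas_in * gas_ef
    return gas_in, elec_in, net_co2

# Example: 12,000 kWh/year of heat demand for a notional dwelling.
gas, elec, dco2 = boiler_to_heat_pump(12_000)
```

The climate dependence enters through the COP: colder regions see lower seasonal COPs, shrinking (but in most European climates not eliminating) the emission advantage.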
Procedia PDF Downloads 129
1282 Unlocking the Genetic Code: Exploring the Potential of DNA Barcoding for Biodiversity Assessment
Authors: Mohammed Ahmed Ahmed Odah
Abstract:
DNA barcoding is a crucial method for assessing and monitoring species diversity amidst escalating threats to global biodiversity. The author explores DNA barcoding's potential as a robust and reliable tool for biodiversity assessment. It begins with a comprehensive review of existing literature, delving into the theoretical foundations, methodologies and applications of DNA barcoding. The suitability of various DNA regions, like the COI gene, as universal barcodes is extensively investigated. Additionally, the advantages and limitations of different DNA sequencing technologies and bioinformatics tools are evaluated within the context of DNA barcoding. To evaluate the efficacy of DNA barcoding, diverse ecosystems, including terrestrial, freshwater and marine habitats, are sampled. Extracted DNA from collected specimens undergoes amplification and sequencing of the target barcode region. Comparison of the obtained DNA sequences with reference databases allows for the identification and classification of the sampled organisms. Findings demonstrate that DNA barcoding accurately identifies species, even in cases where morphological identification proves challenging. Moreover, it sheds light on cryptic and endangered species, aiding conservation efforts. The author also investigates patterns of genetic diversity and evolutionary relationships among different taxa through the analysis of genetic data. This research contributes to the growing knowledge of DNA barcoding and its applicability for biodiversity assessment. The advantages of this approach, such as speed, accuracy and cost-effectiveness, are highlighted, along with areas for improvement. 
By unlocking the genetic code, DNA barcoding enhances our understanding of biodiversity, supports conservation initiatives and informs evidence-based decision-making for the sustainable management of ecosystems.
Keywords: DNA barcoding, biodiversity assessment, genetic code, species identification, taxonomic resolution, next-generation sequencing
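The identification step ("comparison of the obtained DNA sequences with reference databases") can be illustrated with a toy matcher: score a query barcode against a small reference library by percent identity and report the best hit, or none if no reference is close enough. The sequences, the species assignments, and the 97% threshold are all illustrative; real pipelines use full-length COI barcodes and alignment tools such as BLAST.

```python
# Tiny made-up reference library of equal-length, pre-aligned COI fragments.
REFERENCE = {
    "Apis mellifera":    "ATGGCATTAGGATCATTC",
    "Bombus terrestris": "ATGGCGTTAGGTTCTTTC",
}

def percent_identity(a, b):
    """Fraction of matching positions between two pre-aligned sequences."""
    assert len(a) == len(b)
    return sum(x == y for x, y in zip(a, b)) / len(a)

def identify(query, threshold=0.97):
    """Return the best-matching reference species, or None when no reference
    clears the identity threshold (a possible cryptic or novel lineage)."""
    best = max(REFERENCE, key=lambda sp: percent_identity(query, REFERENCE[sp]))
    score = percent_identity(query, REFERENCE[best])
    return best if score >= threshold else None
```

Queries that fall below the threshold are exactly the interesting cases the abstract mentions: candidates for cryptic species that morphology alone would miss.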
Procedia PDF Downloads 24
1281 The Quest for Institutional Independence to Advance Police Pluralism in Ethiopia
Authors: Demelash Kassaye Debalkie
Abstract:
The primary objective of this study is to report the challenges that significantly impede the Ethiopian police's ability to provide quality services to the people. Policing in Ethiopia started in the medieval period, but modern policing was introduced in place of vigilantism in the early 1940s. The progress made since the police were modernized is, however, contested when viewed from the standpoint of officer development and twenty-first-century technology. The police in Ethiopia are struggling to free themselves from every form of political interference by the government and to remain loyal to impartiality, equity, and justice in enforcing the law. Moreover, the institutional competence of the police in Ethiopia is currently losing the power derived from the constitution as a legitimate enforcement agency because the country's political landscape encourages ethnicity-based politics. According to studies, the impact of ethnic politics has been a significant challenge for the police in controlling conflicts between ethnic groups. The study used qualitative techniques, and data was gathered from purposively selected key informants. The findings indicate that governments in past decades were skeptical about establishing a constitutional police force in the country. This has certainly been one of the challenges of pluralizing the police: building police-community relations based on trust. The study conducted to uncover these obstructions ultimately reports that the government's commitment to forming a non-partisan, functionally decentralized, and operationally demilitarized police force is minimal and appalling. Governments mainly intend to formulate the missions of the police in accordance with their own interests and their political will to remain in power.
It therefore reminds policymakers, law enforcement officials, and the government in power to revise the policies and working procedures already in operation so as to strengthen the police in Ethiopia on the basis of public participation and engagement.
Keywords: community, constitution, Ethiopia, law enforcement
Procedia PDF Downloads 86
1280 Meanings and Concepts of Standardization in Systems Medicine
Authors: Imme Petersen, Wiebke Sick, Regine Kollek
Abstract:
In systems medicine, high-throughput technologies produce large amounts of data on different biological and pathological processes, including (disturbed) gene expression, metabolic pathways, and signaling. The large volume of data of different types, stored in separate databases and often located at different geographical sites, has posed new challenges for data handling and processing. Tools based on bioinformatics have been developed to resolve the emerging problems of systematizing, standardizing, and integrating the various data. However, the heterogeneity of data gathered at different levels of biological complexity is still a major challenge in data analysis. To build multilayer disease modules, large and heterogeneous data of disease-related information (e.g., genotype, phenotype, environmental factors) are correlated. Therefore, a great deal of attention in systems medicine has been paid to data standardization, primarily to retrieve and combine large, heterogeneous datasets into standardized and integrated forms and structures. However, this data-centred concept of standardization in systems medicine runs contrary to the debate in science and technology studies (STS) on standardization, which rather emphasizes the dynamics, contexts, and negotiations of standard operating procedures. Based on empirical work on research consortia in Germany that explore the molecular profiles of diseases to establish systems medicine approaches in the clinic, we trace how standardized data are processed and shaped by bioinformatics tools, how scientists using such data in research perceive such standard operating procedures, and what consequences for knowledge production (e.g., modeling) arise from them. Hence, different concepts and meanings of standardization are explored to gain deeper insight into standard operating procedures, not only in systems medicine but also beyond.
Keywords: data, science and technology studies (STS), standardization, systems medicine
Procedia PDF Downloads 341
1279 Modelling the Antecedents of Supply Chain Enablers in Online Groceries Using Interpretive Structural Modelling and MICMAC Analysis
Authors: Rose Antony, Vivekanand B. Khanapuri, Karuna Jain
Abstract:
Online groceries have transformed the way supply chains are managed. These chains face numerous challenges, including product wastage, low margins, a long time to break even and low market penetration, to mention a few. The e-grocery chains need to overcome these challenges in order to survive the competition. The purpose of this paper is to carry out a structural analysis of the enablers in e-grocery chains by applying Interpretive Structural Modeling (ISM) and MICMAC analysis in the Indian context. The research design is descriptive-explanatory in nature. The enablers have been identified from the literature and through semi-structured interviews conducted among managers with relevant experience in e-grocery supply chains. The experts have been contacted through professional/social networks by adopting a purposive snowball sampling technique. The interviews have been transcribed, and manual coding has been carried out using open and axial coding methods. The key enablers are categorized into themes, and the contextual relationship between these and the performance measures is sought from industry veterans. Using ISM, a hierarchical model of the enablers is developed, and MICMAC analysis identifies their driver and dependence powers. Based on driver-dependence power, the enablers are categorized into four clusters, namely independent, autonomous, dependent and linkage. The analysis found that information technology (IT) and manpower training act as key enablers towards reducing the lead time and enhancing online service quality. Many of the enablers fall under the linkage cluster, viz., frequent software updating, branding, the number of delivery boys, order processing, benchmarking, product freshness and customized applications for different stakeholders, depicting these as critical in online food/grocery supply chains. Considering the perishable nature of the product being handled, the impact of the enablers on product quality is also identified.
Hence, the study serves as a tool to identify and prioritize the vital enablers in the e-grocery supply chain. The work is perhaps unique in that it identifies the complex relationships among the supply chain enablers in fresh food for e-groceries and links them to the performance measures. It contributes to the knowledge of supply chain management in general and e-retailing in particular. The approach focuses on fresh food supply chains in the Indian context and hence will be applicable in the context of developing economies, where supply chains are evolving.
Keywords: interpretive structural modelling (ISM), India, online grocery, retail operations, supply chain management
Procedia PDF Downloads 204
1278 Cumulative Pressure Hotspot Assessment in the Red Sea and Arabian Gulf
Authors: Schröde C., Rodriguez D., Sánchez A., Abdul Malak, Churchill J., Boksmati T., Alharbi, Alsulmi H., Maghrabi S., Mowalad, Mutwalli R., Abualnaja Y.
Abstract:
Formulating a strategy for sustainable development of the Kingdom of Saudi Arabia’s coastal and marine environment is at the core of the “Marine and Coastal Protection Assessment Study for the Kingdom of Saudi Arabia Coastline (MCEP)”, which was set up in the context of Vision 2030 by the Saudi Arabian government and aimed at providing a first comprehensive ‘Status Quo Assessment’ of the Kingdom’s marine environment to inform a sustainable development strategy and serve as a baseline assessment for future monitoring activities. This baseline assessment relied on scientific evidence of the drivers, pressures and their impact on the environments of the Red Sea and Arabian Gulf. A key element of the assessment was the cumulative pressure hotspot analysis developed for both national waters of the Kingdom, following the principles of the Driver-Pressure-State-Impact-Response (DPSIR) framework and using the cumulative pressure and impact assessment (CPIA) methodology. The ultimate goals of the analysis were to map and assess the main hotspots of environmental pressures and to identify priority areas for further field surveillance and for urgent management actions. The study identified maritime transport, fisheries, aquaculture, oil, gas, energy, coastal industry, coastal and maritime tourism, and urban development as the main drivers of pollution in the Saudi Arabian marine waters. For each of these drivers, pressure indicators were defined to spatially assess the potential influence of the drivers on the coastal and marine environment. Based on the assessment, a list of 90 hotspot locations was identified. Through spatial grouping, this list was reduced to 10 hotspot areas: two in the Arabian Gulf and eight in the Red Sea. The hotspot mapping revealed clear spatial patterns of drivers, pressures and hotspots within the marine environment of waters under KSA’s maritime jurisdiction in the Red Sea and Arabian Gulf.
The cascading assessment approach based on the DPSIR framework ensured that the root causes of the hotspot patterns, i.e. the human activities and other drivers, can be identified. The adapted CPIA methodology allowed for the combination of the available data to spatially assess the cumulative pressure in a consistent manner, and to identify the most critical hotspots by determining the overlap of cumulative pressure with areas of sensitive biodiversity. Further improvements are expected by enhancing the data sources of drivers and pressure indicators, fine-tuning the decay factors and distances of the pressure indicators, as well as including trans-boundary pressures across the regional seas.
Keywords: Arabian Gulf, DPSIR, hotspot, Red Sea
Procedia PDF Downloads 140
1277 Development of Agomelatine Loaded Proliposomal Powders for Improved Intestinal Permeation: Effect of Surface Charge
Authors: Rajasekhar Reddy Poonuru, Anusha Parnem
Abstract:
Purpose: To formulate a proliposome powder of agomelatine, an antipsychotic drug, and to evaluate its physicochemical and in vitro characteristics and the effect of surface charge on ex vivo intestinal permeation. Methods: A film deposition technique was employed to develop proliposomal powders of agomelatine with varying molar ratios of the lipid L-α-phosphatidylcholine (Hydro Soy PC, HSPC) and cholesterol with a fixed amount of drug. With the aim of deriving a free-flowing and stable proliposome powder, the fluid retention potential of various carriers was examined. Liposome formation, the number of vesicles formed per mm³ upon hydration, vesicle size, and entrapment efficiency were assessed to deduce an optimized formulation. Sodium cholate was added to the optimized formulation to induce surface charge on the formed vesicles. Solid-state characterization (FTIR, DSC, and XRD) was performed with the intention of assessing the native crystalline and chemical behavior of the drug. An in vitro dissolution test of the optimized formulation along with the pure drug was carried out to estimate dissolution efficiency (DE) and relative dissolution rate (RDR). The effective permeability coefficient in rat (Peff(rat)) and enhancement ratio (ER) of the drug from the formulation and pure drug dispersion were calculated from ex vivo permeation studies in rat ileum. Results: The proliposomal powder formulated with an equimolar ratio of HSPC and cholesterol resulted in the highest number of vesicles (3.95) with 90% drug entrapment upon hydration. Neusilin UFL2 was selected as the carrier because of its high fluid retention potential (4.5) and good flow properties. The proliposome powder exhibited augmentation in DE (60.3±3.34) and RDR (21.2±1.02) of agomelatine over the pure drug. Solid-state characterization studies demonstrated the transformation of the native crystalline form of the drug to an amorphous and/or molecular state, which was in correlation with the results obtained from the in vitro dissolution test.
The elevated Peff(rat) of 46.5×10⁻⁴ cm/sec and ER of 2.65 of the drug from the charge-induced proliposome formulation with respect to the pure drug dispersion were determined from ex vivo intestinal permeation studies executed in the ileum of Wistar rats. Conclusion: The improved physicochemical characteristics and ex vivo intestinal permeation of the drug from the charge-induced proliposome powder with Neusilin UFL2 demonstrate the potential of this system for enhancing the oral delivery of agomelatine.
Keywords: agomelatine, proliposome, sodium cholate, neusilin
Procedia PDF Downloads 136
1276 Review of Numerical Models for Granular Beds in Solar Rotary Kilns for Thermal Applications
Authors: Edgar Willy Rimarachin Valderrama, Eduardo Rojas Parra
Abstract:
Thermal energy from solar radiation is widely used in power plants, food drying, chemical reactors, heating and cooling systems, water treatment processes, hydrogen production, and other applications. In the case of power plants, one of the technologies available to transform solar energy into thermal energy is the solar rotary kiln, in which a bed of granular matter is heated through concentrated radiation obtained from an arrangement of heliostats. Numerical modeling is a useful approach to study the behavior of granular beds in solar rotary kilns. This technique, once validated with small-scale experiments, can be used to simulate large-scale processes for industrial applications. This study gives a comprehensive classification of numerical models used to simulate the movement and heat transfer of beds of granular media within solar rotary furnaces. In general, there exist three categories of models: 1) continuum, 2) discrete, and 3) multiphysics modeling. Continuum modeling comprises zero-dimensional, one-dimensional and fluid-like models. On the other hand, discrete element models compute the movement of each particle of the bed individually. In this kind of modeling, heat transfer acts during contacts, which can occur by solid-solid and solid-gas-solid conduction. Finally, the multiphysics approach uses discrete elements to simulate the grains and a continuum model to simulate the fluid around the particles. This classification allows the advantages and disadvantages of each kind of model to be compared in terms of accuracy, computational cost and implementation.
Keywords: granular beds, numerical models, rotary kilns, solar thermal applications
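The solid-solid conductive exchange computed by discrete element models can be sketched as follows; the contact-conductance formula, the material parameters, and the three-particle chain are illustrative assumptions, not taken from any of the reviewed models:

```python
# Minimal sketch of solid-solid conductive heat exchange between
# contacting particles in one discrete element model (DEM) time step.
# All parameters are illustrative, not calibrated to any real kiln.
import math

def contact_conductance(radius, k_solid, overlap):
    """Hertzian-style contact conductance H = 2*k*a, with
    contact radius a ~ sqrt(radius * overlap)."""
    a = math.sqrt(radius * overlap)
    return 2.0 * k_solid * a

def dem_heat_step(temps, contacts, radius, k_solid, heat_capacity, dt):
    """One explicit step: heat Q = H*(T_j - T_i) exchanged per contact."""
    dT = [0.0] * len(temps)
    for i, j, overlap in contacts:
        h = contact_conductance(radius, k_solid, overlap)
        q = h * (temps[j] - temps[i])          # flows j -> i if T_j > T_i
        dT[i] += q * dt / heat_capacity
        dT[j] -= q * dt / heat_capacity        # total energy is conserved
    return [t + d for t, d in zip(temps, dT)]

# Three particles in a chain: hot, warm, cold (temperatures in K).
temps = [900.0, 600.0, 300.0]
contacts = [(0, 1, 1e-5), (1, 2, 1e-5)]        # (i, j, overlap in m)
for _ in range(100):
    temps = dem_heat_step(temps, contacts, radius=5e-3, k_solid=2.0,
                          heat_capacity=0.5, dt=0.01)
print(temps)  # the particle temperatures relax toward one another
```

A full DEM code would couple this exchange to the particles' motion and add solid-gas-solid conduction and radiation terms; the point here is only the per-contact bookkeeping.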
Procedia PDF Downloads 34
1275 Investigation on Reducing the Bandgap in Nanocomposite Polymers by Doping
Authors: Sharvare Palwai, Padmaja Guggilla
Abstract:
Smart materials, also called responsive materials, undergo reversible physical or chemical changes in their properties as a consequence of small environmental variations. They can respond to single or multiple stimuli such as stress, temperature, moisture, electric or magnetic fields, light, or chemical compounds. Hence, smart materials are the basis of many applications, including biosensors and transducers, particularly electroactive polymers. As polymers exhibit good flexibility, high transparency, easy processing, and low cost, they are promising sensor materials. Polyvinylidene Fluoride (PVDF), being a ferroelectric polymer, exhibits piezoelectric and pyroelectric properties. Pyroelectric materials convert heat directly into electricity, while piezoelectric materials convert mechanical energy into electricity. These characteristics of PVDF make it useful in biosensor devices and batteries. In this work, the influence of nanoparticle fillers such as Lithium Tantalate (LiTaO₃/LT), Potassium Niobate (KNbO₃/PN), and Zinc Titanate (ZnTiO₃/ZT) in polymer films will be studied comprehensively. Developing advanced and cost-effective biosensors is pivotal to realizing the fullest potential of polymer-based wireless sensor networks, which will further enable new types of self-powered applications. Finally, using the nanocomposite films with the best set of properties, the sensory elements will be designed and tested for their performance as electric generators under laboratory conditions. By characterizing the materials for their optical properties and investigating the effects of doping on the bandgap energies, the science of next-generation biosensor technologies can be advanced.
Keywords: polyvinylidene fluoride, PVDF, lithium tantalate, potassium niobate, zinc titanate
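One common way to quantify a doping-induced bandgap change from optical characterization is a Tauc analysis; the sketch below extrapolates the linear region of (αhν)² versus photon energy to zero. The direct-allowed transition exponent and the synthetic absorption edge are assumptions, not measurements from this study:

```python
# Tauc-plot sketch: estimate an optical bandgap by fitting the linear
# region of (alpha*E)^2 vs. photon energy E and extrapolating to zero.
# The spectrum below is synthetic; real data would come from UV-Vis.

def linear_fit(xs, ys):
    """Ordinary least squares slope and intercept."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
            sum((x - mx) ** 2 for x in xs)
    return slope, my - slope * mx

def tauc_bandgap(energies_eV, alphas):
    """Bandgap = x-intercept of the (alpha*E)^2 vs E line (direct transition)."""
    ys = [(a * e) ** 2 for e, a in zip(energies_eV, alphas)]
    slope, intercept = linear_fit(energies_eV, ys)
    return -intercept / slope

# Synthetic absorption edge: alpha*E ~ C*sqrt(E - Eg) above Eg = 2.5 eV.
Eg_true = 2.5
energies = [2.6 + 0.1 * i for i in range(8)]          # 2.6 .. 3.3 eV
alphas = [1e4 * (e - Eg_true) ** 0.5 / e for e in energies]
print(round(tauc_bandgap(energies, alphas), 2))       # -> 2.5
```

Comparing the fitted intercepts of doped and undoped films would then give the bandgap reduction directly.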
Procedia PDF Downloads 134
1274 Social and Digital Transformation of the Saudi Education System: A Cyberconflict Analysis
Authors: Mai Alshareef
Abstract:
The Saudi government considers the modernisation of the education system a critical component of the national development plan, Saudi Vision 2030; however, this sudden reform creates tension amongst Saudis. This study examines, first, the reflection of the social and digital education reform on stakeholders and the general Saudi public and, second, the influence of information and communication technologies (ICTs) on the ethnoreligious conflict in Saudi Arabia. This study employs Cyberconflict theory to examine conflicts in the real world and cyberspace. The findings are based on a qualitative case study methodology that uses netnography, an analysis of 3,750 Twitter posts and semi-structured interviews with 30 individuals, including key actors in the Saudi education sector and Twitter activists, during 2019/2020. The methods utilised are guided by thematic analysis to map an understanding of factors that influence societal conflicts in Saudi Arabia, which in this case include religious, national, and gender identity. Elements of Cyberconflict theory are used to better understand how conflicting groups build their identities in connection to their ethnic/religious/cultural differences and competing national identities. The findings correspond to the ethnoreligious components of Cyberconflict theory. Twitter became a battleground for liberals, conservatives, the Saudi public and elites, and it is used in a novel way to influence public opinion and to challenge the media monopoly. Opposing groups relied heavily on a discourse of exclusion and inclusion and showed ethnic and religious affiliations, national identity, and chauvinism. The findings add to existing knowledge in the cyberconflict field of study, and they also reveal outcomes that are critical to the Saudi Arabian national context.
Keywords: education, cyberconflict, Twitter, national identity
Procedia PDF Downloads 174
1273 Systematic Identification of Noncoding Cancer Driver Somatic Mutations
Authors: Zohar Manber, Ran Elkon
Abstract:
Accumulation of somatic mutations (SMs) in the genome is a major driving force of cancer development. Most SMs in the tumor's genome are functionally neutral; however, some cause damage to critical processes and provide the tumor with a selective growth advantage (termed cancer driver mutations). Current research on functional significance of SMs is mainly focused on finding alterations in protein coding sequences. However, the exome comprises only 3% of the human genome, and thus, SMs in the noncoding genome significantly outnumber those that map to protein-coding regions. Although our understanding of noncoding driver SMs is very rudimentary, it is likely that disruption of regulatory elements in the genome is an important, yet largely underexplored mechanism by which somatic mutations contribute to cancer development. The expression of most human genes is controlled by multiple enhancers, and therefore, it is conceivable that regulatory SMs are distributed across different enhancers of the same target gene. Yet, to date, most statistical searches for regulatory SMs have considered each regulatory element individually, which may reduce statistical power. The first challenge in considering the cumulative activity of all the enhancers of a gene as a single unit is to map enhancers to their target promoters. Such mapping defines for each gene its set of regulating enhancers (termed "set of regulatory elements" (SRE)). Considering multiple enhancers of each gene as one unit holds great promise for enhancing the identification of driver regulatory SMs. However, the success of this approach is greatly dependent on the availability of comprehensive and accurate enhancer-promoter (E-P) maps. To date, the discovery of driver regulatory SMs has been hindered by insufficient sample sizes and statistical analyses that often considered each regulatory element separately. 
In this study, we analyzed more than 2,500 whole-genome sequence (WGS) samples provided by The Cancer Genome Atlas (TCGA) and The International Cancer Genome Consortium (ICGC) in order to identify such driver regulatory SMs. Our analyses took into account the combinatorial aspect of gene regulation by considering all the enhancers that control the same target gene as one unit, based on E-P maps from three genomics resources. The identification of candidate driver noncoding SMs is based on their recurrence. We searched for SREs of genes that are "hotspots" for SMs (that is, they accumulate SMs at a significantly elevated rate). To test the statistical significance of recurrence of SMs within a gene's SRE, we used both global and local background mutation rates. Using this approach, we detected - in seven different cancer types - numerous "hotspots" for SMs. To support the functional significance of these recurrent noncoding SMs, we further examined their association with the expression level of their target gene (using gene expression data provided by the ICGC and TCGA for samples that were also analyzed by WGS).
Keywords: cancer genomics, enhancers, noncoding genome, regulatory elements
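The recurrence test described above can be sketched with a simple Poisson background model; the SRE length, sample count, and background mutation rate below are hypothetical placeholders, and the study's actual statistics (with local background rates) may differ:

```python
# Sketch: is an SRE a somatic-mutation "hotspot"? Compute the expected
# mutation count under a background rate and a Poisson upper-tail
# p-value for the observed count. All numbers are illustrative.
import math

def poisson_sf(k_observed, expected):
    """P(X >= k_observed) for X ~ Poisson(expected)."""
    cdf = sum(math.exp(-expected) * expected ** i / math.factorial(i)
              for i in range(k_observed))   # CDF up to k_observed - 1
    return 1.0 - cdf

def sre_hotspot_pvalue(sre_length_bp, n_samples, background_rate_per_bp,
                       observed_mutations):
    """Treat all enhancers of a gene as one unit of total length sre_length_bp."""
    expected = sre_length_bp * n_samples * background_rate_per_bp
    return poisson_sf(observed_mutations, expected)

# Hypothetical SRE: enhancers totalling 4 kb, 2,500 WGS samples,
# background rate of 2 somatic mutations per Mb per sample (~20 expected).
p = sre_hotspot_pvalue(sre_length_bp=4000, n_samples=2500,
                       background_rate_per_bp=2e-6, observed_mutations=45)
print(p < 0.05)  # 45 observed vs ~20 expected: significant recurrence
```

In practice the expected count would use the local mutation rate of the surrounding region and corrections for multiple testing across all genes.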
Procedia PDF Downloads 104
1272 Assessing Significance of Correlation with Binomial Distribution
Authors: Vijay Kumar Singh, Pooja Kushwaha, Prabhat Ranjan, Krishna Kumar Ojha, Jitendra Kumar
Abstract:
Present-day high-throughput genomic technologies, NGS/microarrays, are producing large volumes of data that require improved analysis methods to make sense of the data. The correlation between genes and samples has been regularly used to gain insight into many biological phenomena including, but not limited to, co-expression/co-regulation, gene regulatory networks, clustering and pattern identification. However, the presence of outliers and violation of the assumptions underlying Pearson correlation are frequent and may distort the actual correlation between genes and lead to spurious conclusions. Here, we report a method to measure the strength of association between genes. The method assumes that the expression values of a gene are Bernoulli random variables whose outcome depends on the sample being probed. The method considers two genes as uncorrelated if the number of samples with the same outcome for both genes (Ns) is equal to the expected number (Es). The extent of correlation depends on how far Ns deviates from Es. The method does not assume normality for the parent population, is fairly unaffected by the presence of outliers, can be applied to qualitative data, and uses the binomial distribution to assess the significance of association. At this stage, we do not claim superiority of the method over other existing correlation methods, but our method could be another way of calculating correlation in addition to the existing ones. The method uses the binomial distribution, which has not been used for this purpose before, to assess the significance of association between two variables. We are evaluating the performance of our method on NGS/microarray data, which is noisy and pierced by outliers, to see if our method can differentiate between spurious and actual correlation.
While working with the method, it has not escaped our notice that the method could also be generalized to measure the association of more than two variables, which has proven difficult with existing methods.
Keywords: binomial distribution, correlation, microarray, outliers, transcriptome
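A minimal sketch of the described test: binarize each gene's expression, count the samples where the two genes agree (Ns), compare with the expected agreement count under independence (Es), and assess significance with a binomial tail probability. The median-split binarization is an assumption; the abstract does not specify the discretization:

```python
# Sketch of the binomial association test: two genes are treated as
# Bernoulli variables per sample; agreement counts far from the
# expectation under independence indicate association. Median-split
# binarization is an illustrative choice, not specified by the authors.
import math

def binarize(values):
    med = sorted(values)[len(values) // 2]
    return [1 if v > med else 0 for v in values]

def binomial_two_tail(n, k, p):
    """Two-tailed p-value for k successes in n Bernoulli(p) trials."""
    pmf = lambda i: math.comb(n, i) * p ** i * (1 - p) ** (n - i)
    observed = pmf(k)
    return min(1.0, sum(pmf(i) for i in range(n + 1)
                        if pmf(i) <= observed + 1e-12))

def association_pvalue(gene_a, gene_b):
    a, b = binarize(gene_a), binarize(gene_b)
    n = len(a)
    ns = sum(1 for x, y in zip(a, b) if x == y)   # samples with same outcome
    p1, q1 = sum(a) / n, 1 - sum(a) / n
    p2, q2 = sum(b) / n, 1 - sum(b) / n
    p_agree = p1 * p2 + q1 * q2                   # P(agree) if independent
    return ns, n * p_agree, binomial_two_tail(n, ns, p_agree)

# Correlated toy expression profiles: gene_b tracks gene_a with noise.
gene_a = [1.0, 5.0, 2.0, 8.0, 3.0, 9.0, 1.5, 7.0, 2.5, 8.5]
gene_b = [1.2, 4.8, 2.1, 7.9, 2.9, 9.3, 1.4, 7.2, 2.6, 8.4]
ns, es, pval = association_pvalue(gene_a, gene_b)
print(ns, round(es, 1), pval < 0.05)  # Ns far above Es: significant
```

Because only agreement counts enter the test, the same code works unchanged on qualitative (already binary or categorical) data, matching the method's stated advantage.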
Procedia PDF Downloads 415
1271 The Advancement of Smart Cushion Product and System Design Enhancing Public Health and Well-Being at Workplace
Authors: Dosun Shin, Assegid Kidane, Pavan Turaga
Abstract:
According to the National Institutes of Health, living a sedentary lifestyle leads to a number of health issues, including increased risk of cardiovascular disease, type 2 diabetes, obesity, and certain types of cancers. This project brings together experts in multiple disciplines to combine product design, sensor design, algorithms, and health intervention studies to develop a product and system that helps reduce the amount of time spent sitting at the workplace. This paper illustrates ongoing improvements to the prototypes the research team developed in initial research, including working prototypes with a software application, which were developed and demonstrated for users. Additional modifications were made to improve functionality, aesthetics, and ease of use, which will be discussed in this paper. Extending the foundations created in the initial phase, our approach sought to further improve the product by conducting additional human factors research, studying deficiencies in competitive products, testing various materials/forms, developing working prototypes, and obtaining feedback from additional potential users. The solution consisted of an aesthetically pleasing seat cover cushion that easily attaches to common office chairs found in most workplaces, ensuring that a wide variety of people can use the product. The product discreetly contains sensors that track when the user sits on their chair, sending information to a phone app that triggers reminders for users to stand up and move around after sitting for a set amount of time. This paper also presents an analysis of typical office aesthetics and the selected materials, colors, and forms that complemented the working environment. Comfort and ease of use remained a high priority as the design team sought to provide a product and system that integrated into the workplace.
As the research team continues to test, improve, and implement this solution for the sedentary workplace, the team seeks to create a viable product that acts as an impetus for a more active workday and lifestyle, further decreasing the proliferation of chronic disease and health issues for sedentary working people. This paper illustrates in detail the processes of engineering, product design, methodology, and testing results.
Keywords: anti-sedentary work behavior, new product development, sensor design, health intervention studies
Procedia PDF Downloads 158
1270 MRI Quality Control Using Texture Analysis and Spatial Metrics
Authors: Kumar Kanudkuri, A. Sandhya
Abstract:
Typically, in an MRI clinical setting, several protocols are run, each indicated for a specific anatomy and disease condition. However, these protocols, or parameters within them, can change over time due to changes in the recommendations of physician groups, updates in the software, or the availability of new technologies. Most of the time, the changes are performed by the MRI technologist to account for time, coverage, physiological, or Specific Absorption Rate (SAR) reasons. It is therefore important to give proper guidelines to MRI technologists so that they do not change parameters that negatively impact image quality. Typically, a standard American College of Radiology (ACR) MRI phantom is used for Quality Control (QC) in order to guarantee that the primary objectives of MRI are met. The visual evaluation of quality depends on the operator/reviewer and might change amongst operators, as well as for the same operator at various times. Overcoming these constraints is essential for a more impartial evaluation of quality, which makes the quantitative estimation of image quality (IQ) metrics for MRI quality control very important. To solve this problem, we propose a robust, open-source, and automated MRI image quality control tool. The automatic analysis tool we designed and developed measures MRI image quality (IQ) metrics such as Signal-to-Noise Ratio (SNR), Signal-to-Noise Ratio Uniformity (SNRU), Visual Information Fidelity (VIF), Feature Similarity (FSIM), Gray-Level Co-occurrence Matrix (GLCM) features, slice thickness accuracy, slice position accuracy, and high-contrast spatial resolution, and provided good accuracy assessment. A standardized quality report is generated that incorporates the metrics that impact diagnostic quality.
Keywords: ACR MRI phantom, MRI image quality metrics, SNRU, VIF, FSIM, GLCM, slice thickness accuracy, slice position accuracy
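As an illustration of one of the listed metrics, the sketch below computes a single-image phantom SNR from a signal ROI and a background (air) ROI. The ROI placement, the Rayleigh noise correction factor, and the toy image are assumptions rather than the described tool's actual implementation:

```python
# Sketch: single-image phantom SNR = corrected mean signal in a central
# ROI divided by the standard deviation of a background (air) ROI.
# ROI coordinates and the toy image are illustrative only.
import math

def roi_values(image, r0, r1, c0, c1):
    """Flatten the rectangular region [r0:r1, c0:c1] into a list."""
    return [image[r][c] for r in range(r0, r1) for c in range(c0, c1)]

def snr(image, signal_roi, background_roi):
    sig = roi_values(image, *signal_roi)
    bg = roi_values(image, *background_roi)
    mean_sig = sum(sig) / len(sig)
    mean_bg = sum(bg) / len(bg)
    sd_bg = math.sqrt(sum((v - mean_bg) ** 2 for v in bg) / (len(bg) - 1))
    # 0.66 corrects for the Rayleigh distribution of background noise
    # in magnitude images (single-image method).
    return 0.66 * mean_sig / sd_bg

# Toy 8x8 "image": bright phantom centre, low-level varying background.
image = [[100.0 if 2 <= r <= 5 and 2 <= c <= 5 else 0.5 * ((r * 7 + c * 3) % 3)
          for c in range(8)] for r in range(8)]
print(round(snr(image, (2, 6, 2, 6), (0, 2, 0, 8)), 1))  # large SNR: clear signal
```

A real QC pipeline would read DICOM pixel data and place the ROIs per the ACR phantom protocol; the arithmetic, however, is exactly this.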
Procedia PDF Downloads 170
1269 Supply Chain Analysis with Product Returns: Pricing and Quality Decisions
Authors: Mingming Leng
Abstract:
Wal-Mart has allocated considerable human resources for its quality assurance program, in which the largest retailer serves its supply chains as a quality gatekeeper. Asda Stores Ltd., the second largest supermarket chain in Britain, is now investing £27m in significantly increasing the frequency of quality control checks in its supply chains and thus enhancing quality across its fresh food business. Moreover, Tesco, the largest British supermarket chain, already constructed a quality assessment center to carry out its gatekeeping responsibility. Motivated by the above practices, we consider a supply chain in which a retailer plays the gatekeeping role in quality assurance by identifying defects among a manufacturer's products prior to selling them to consumers. The impact of a retailer's gatekeeping activity on pricing and quality assurance in a supply chain has not been investigated in the operations management area. We draw a number of managerial insights that are expected to help practitioners judiciously consider the quality gatekeeping effort at the retail level. As in practice, when the retailer identifies a defective product, she immediately returns it to the manufacturer, who then replaces the defect with a good quality product and pays a penalty to the retailer. If the retailer does not recognize a defect but sells it to a consumer, then the consumer will identify the defect and return it to the retailer, who then passes the returned 'unidentified' defect to the manufacturer. The manufacturer also incurs a penalty cost. Accordingly, we analyze a two-stage pricing and quality decision problem, in which the manufacturer and the retailer bargain over the manufacturer's average defective rate and wholesale price at the first stage, and the retailer decides on her optimal retail price and gatekeeping intensity at the second stage. We also compare the results when the retailer performs quality gatekeeping with those when the retailer does not. 
Our supply chain analysis exposes some important managerial insights. For example, the retailer's quality gatekeeping can effectively reduce the channel-wide defective rate if her penalty charge for each identified defect is larger than or equal to the market penalty for each unidentified defect. When the retailer implements quality gatekeeping, the change in the negotiated wholesale price depends only on the manufacturer's 'individual' benefit, and the change in the retailer's optimal retail price is related only to the channel-wide benefit. The retailer is willing to take on the quality gatekeeping responsibility when the impact of quality relative to retail price on demand is high and/or the retailer has strong bargaining power. We conclude that the retailer's quality gatekeeping can help reduce the defective rate for consumers, which becomes more significant when the retailer's bargaining position in her supply chain is stronger. Retailers with stronger bargaining powers can benefit more from their quality gatekeeping in supply chains.
Keywords: bargaining, game theory, pricing, quality, supply chain
Procedia PDF Downloads 277
1268 Iranian English as Foreign Language Teachers' Psychological Well-Being across Gender: During the Pandemic
Authors: Fatemeh Asadi Farsad, Sima Modirkhameneh
Abstract:
The purpose of this study was to explore the pattern of Psychological Well-Being (PWB) of Iranian male and female EFL teachers during the pandemic. It was intended to see whether such a drastic change in the context and mode of teaching affects teachers' PWB. Furthermore, the possible difference between the six elements of PWB of Iranian male vs. female EFL teachers during the pandemic was investigated. The other purpose was to find out the EFL teachers’ perceptions of any modifications, and the factors leading to such modifications, in their PWB during the pandemic. For the purpose of this investigation, a total of 81 EFL teachers (59 female, 22 male) with an age range of 25 to 35 were conveniently sampled from different cities in Iran. Ryff’s PWB questionnaire was sent to the participant teachers through online platforms to elicit data on their PWB. As for their perceptions of the possible modifications and the factors involved in PWB during the pandemic, a set of semi-structured interviews was run among both sample groups. The findings revealed that male EFL teachers had the highest mean on personal growth, followed by purpose in life and self-acceptance, and the lowest mean on environmental mastery. With a slightly similar pattern, female EFL teachers had the highest mean on personal growth, followed by purpose in life and positive relationships with others, with the lowest mean on environmental mastery. However, no significant difference was observed between the male and female groups’ overall means on the elements of PWB. Additionally, participants perceived that their anxiety level in online classes altered due to factors such as (1) computer literacy skills, (2) lack of social communication and interaction with colleagues and students, (3) online class management, (4) overwhelming workloads, and (5) time management.
The study ends with further suggestions regarding effective online teaching preparation that takes teachers' PWB into account, especially in severe situations such as the COVID-19 pandemic. The findings can help inform reforms of educational policies aimed at enhancing EFL teachers’ PWB through computer literacy courses and stress management courses. It is also suggested that, to proactively support teachers’ mental health, it is necessary to provide them with advisors and psychologists, free of charge if possible. Limitations: One limitation is the small number of participants (81), suggesting that future replications should include more participants for reliable findings. Another limitation is the gender imbalance, which future studies should address to yield better outcomes. Furthermore, the limited set of data-gathering tools suggests using observations, diaries, and narratives for more insights in future studies. The study focused on one model of PWB, calling for further research on other models in the literature. Considering the wide effect of the COVID-19 pandemic, future studies should consider additional variables (e.g., teaching experience, age, income) to better understand Iranian EFL teachers’ vulnerabilities and strengths.
Keywords: online teaching, psychological well-being, female and male EFL teachers, pandemic
Procedia PDF Downloads 47
1267 Tokyo Skyscrapers: Technologically Advanced Structures in Seismic Areas
Authors: J. Szolomicki, H. Golasz-Szolomicka
Abstract:
The architectural and structural analysis of selected high-rise buildings in Tokyo is presented in this paper. The capital of Japan is the most densely populated city in the world and, moreover, is located in one of the most active seismic zones. The combination of these factors has resulted in the creation of sophisticated designs and innovative engineering solutions, especially in the field of design and construction of high-rise buildings. Foreign architectural studios (such as Jean Nouvel, Kohn Pedersen Fox Associates, and Skidmore, Owings & Merrill), which specialize in the design of skyscrapers, played a major role in the development of technological ideas and architectural forms for such extraordinary engineering structures. Among the projects completed by them, there are examples of high-rise buildings that set precedents for future development. An essential aspect which influences the design of high-rise buildings is the necessity to take into consideration their dynamic reaction to earthquakes and to counteract wind vortices. The need to control the motions of these buildings, induced by the forces coming from earthquakes and wind, led to the development of various methods and devices for dissipating the energy which occurs during such phenomena. Currently, Japan is a global leader in seismic technologies which protect high-rise structures against seismic influence. Due to these achievements, the most modern skyscrapers in Tokyo are able to withstand earthquakes with a magnitude of over seven on the Richter scale. The damping devices applied are either passive, which do not require an additional power supply, or active, which suppress the reaction with the input of extra energy. In recent years, hybrid dampers have also been used, with an additional active element to improve the efficiency of passive damping.
Keywords: core structures, damping system, high-rise building, seismic zone
1266 Machine Learning-Based Techniques for Detecting and Mitigating Cyber-Attacks on Automatic Generation Control in Smart Grids
Authors: Sami M. Alshareef
Abstract:
The rapid growth of smart grid technology has brought significant advancements to the power industry. However, with the increasing interconnectivity and reliance on information and communication technologies, smart grids have become vulnerable to cyber-attacks, posing significant threats to the reliable operation of power systems. Among the critical components of smart grids, the Automatic Generation Control (AGC) system plays a vital role in maintaining the balance between generation and load demand. Therefore, protecting the AGC system from cyber threats is of paramount importance to maintain grid stability and prevent disruptions. Traditional security measures often fall short in addressing sophisticated and evolving cyber threats, necessitating the exploration of innovative approaches. Machine learning, with its ability to analyze vast amounts of data and learn patterns, has emerged as a promising solution to enhance AGC system security. Therefore, this research proposal aims to address the challenges associated with detecting and mitigating cyber-attacks on AGC in smart grids by leveraging machine learning techniques on automatic generation control of two-area power systems. By utilizing historical data, the proposed system will learn the normal behavior patterns of AGC and identify deviations caused by cyber-attacks. Once an attack is detected, appropriate mitigation strategies will be employed to safeguard the AGC system. The outcomes of this research will provide power system operators and administrators with valuable insights into the vulnerabilities of AGC systems in smart grids and offer practical solutions to enhance their cyber resilience.
Keywords: machine learning, cyber-attacks, automatic generation control, smart grid
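The abstract's core idea, learning normal AGC behavior from historical data and flagging deviations, is a form of anomaly detection. The sketch below is only a minimal illustration of that idea, not the authors' (unspecified) model: it uses synthetic data standing in for a two-area system's area control error (ACE), an assumed false-data-injection bias, and a simple z-score detector trained on attack-free samples.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical ACE signal: normal operation is zero-mean noise around balance;
# an injected attack adds a constant bias after sample 300 (assumed magnitude).
normal = rng.normal(0.0, 0.05, 500)
attacked = normal.copy()
attacked[300:] += 0.4

def detect(signal: np.ndarray, train: np.ndarray, k: float = 4.0) -> np.ndarray:
    """Flag samples deviating more than k standard deviations from the
    baseline statistics learned on attack-free training data."""
    mu, sigma = train.mean(), train.std()
    return np.abs(signal - mu) > k * sigma

flags = detect(attacked, train=normal[:200])
print("first flagged sample:", int(np.argmax(flags)))
```

A deployed detector would replace the z-score rule with a model learned from richer features (frequency deviations, tie-line flows, control signals), but the train-on-normal, flag-deviations structure is the same.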
1265 Open Innovation Laboratory for Rapid Realization of Sensing, Smart and Sustainable Products (S3 Products) for Higher Education
Authors: J. Miranda, D. Chavarría-Barrientos, M. Ramírez-Cadena, M. E. Macías, P. Ponce, J. Noguez, R. Pérez-Rodríguez, P. K. Wright, A. Molina
Abstract:
Higher education methods need to evolve because the new generations of students are learning in different ways. One way is by adopting emergent technologies and new learning methods and promoting the maker movement. As a result, Tecnologico de Monterrey is developing Open Innovation Laboratories as an immediate response to the educational challenges of the world. This paper presents an Open Innovation Laboratory for Rapid Realization of Sensing, Smart and Sustainable Products (S3 Products). The Open Innovation Laboratory is composed of a set of specific resources that students and teachers use to provide solutions to current problems of priority sectors through the development of a new generation of products. This new generation of products considers the concepts Sensing, Smart, and Sustainable. The Open Innovation Laboratory has been implemented in different courses in the context of New Product Development (NPD) and Integrated Manufacturing Systems (IMS) at Tecnologico de Monterrey. The implementation consists of adapting the Open Innovation Laboratory to each course's syllabus in combination with specific methodologies for product development, learning methods (Active Learning and Blended Learning using Massive Open Online Courses, MOOCs), and rapid product realization platforms. Using the concepts proposed, it is possible to show that students can propose innovative and sustainable products and that the learning process can be improved using technological resources applied in the higher education sector. Finally, examples of innovative S3 products developed at Tecnologico de Monterrey are presented.
Keywords: active learning, blended learning, maker movement, new product development, open innovation laboratory
1264 Investigating the Dimensions of Perceived Attributions in Making Sense of Failure: An Exploratory Study of Lebanese Entrepreneurs
Authors: Ghiwa Dandach
Abstract:
By challenging the anti-failure bias and contributing to the theoretical territory of attribution theory, this thesis develops a comprehensive process for entrepreneurial learning from failure. The practical implication of the findings suggests assisting entrepreneurs (current, failing, and nascent) in effectively anticipating and reflecting upon failure. Additionally, the process is suggested to enhance the level of institutional and private (accelerator and financer) support provided to entrepreneurs, the implications of which may improve future opportunities for entrepreneurial success. Hence, exploring learning from failure is argued to affect the potential survival of future ventures, subsequently revitalizing the economic contribution of entrepreneurship. This learning process can be enhanced through the cognitive development of causal ascriptions for failure, which eventually affects learning outcomes. However, the mechanism by which entrepreneurs make sense of failure, reflect on the journey, and transform experience into knowledge is still under-researched. More specifically, the cognitive process of failure attribution is under-explored, particularly in the context of developing economies, calling for a more insightful understanding of how entrepreneurs ascribe failure. Responding to the call for more thorough research in such cultural contexts, this study expands the understanding of the dimensions of failure attributions as perceived by entrepreneurs and the impact of these dimensions on learning outcomes in the Lebanese context. The research adopted the exploratory interpretivist paradigm and collected data using a qualitative multimethod approach, first from interviews with industry experts and then from entrepreneurs' narratives.
The holistic and categorical content analysis of narratives, preceded by the thematic analysis of interviews, unveiled how entrepreneurs ascribe failure by developing minor and major dimensions of each failure attribution. The findings have also revealed how each dimension affects learning from failure when accompanied by emotional resilience. The thesis concludes that exploring the dimensions of failure attributions in depth significantly determines the level of learning generated. Moving beyond the simple categorisation of ascriptions as primarily internal or external unveiled how learning may occur with each attribution at the individual, venture, and ecosystem levels. This further accentuated that a major internal attribution of failure combined with a minor external attribution generated the highest levels of transformative and double-loop learning, emphasizing the role of personal blame and responsibility in enhancing learning outcomes.
Keywords: attribution, entrepreneurship, reflection, sense-making, emotions, learning outcomes, failure, exit