Search results for: cloud computing framework
4577 Integrating Explicit Instruction and Problem-Solving Approaches for Efficient Learning
Authors: Slava Kalyuga
Abstract:
There are two opposing major points of view on the optimal degree of initial instructional guidance that is usually discussed in the literature by the advocates of the corresponding learning approaches. Using unguided or minimally guided problem-solving tasks prior to explicit instruction has been suggested by productive failure and several other instructional theories, whereas an alternative approach - using fully guided worked examples followed by problem solving - has been demonstrated as the most effective strategy within the framework of cognitive load theory. An integrated approach discussed in this paper could combine the above frameworks within a broader theoretical perspective which would allow bringing together their best features and advantages in the design of learning tasks for STEM education. This paper presents a systematic review of the available empirical studies comparing the above alternative sequences of instructional methods to explore the effects of several possible moderating factors. The paper concludes that different approaches and instructional sequences should coexist within complex learning environments. Selecting optimal sequences depends on such factors as the specific goals of learner activities, the types of knowledge to learn, levels of element interactivity (task complexity), and levels of learner prior knowledge. This paper offers an outline of a theoretical framework for the design of complex learning tasks in STEM education that would integrate explicit instruction and inquiry (exploratory, discovery) learning approaches in ways that depend on a set of defined specific factors.
Keywords: cognitive load, explicit instruction, exploratory learning, worked examples
Procedia PDF Downloads 126
4576 Improving Psychological Safety in Teaching and Social Organizations in Finland
Authors: Eija Raatikainen
Abstract:
The aim of the study is to examine psychological safety in the context of changing working life and continuous learning in social and educational organizations. The participants in the study are social workers and vocational teachers working as employees and supervisors in the capital region of Finland (public and private sectors). Research data were collected during 2022-2023 using the qualitative method of empathy-based stories (MEBS). Research participants were asked to write short stories about situations related to their work and work community. As researchers, we created and varied the framework narratives (MEBS) in line with the aim of the study and the theoretical background. The data were analyzed with content analysis. According to the results, the barriers to and prerequisites for psychological safety at work could be located in four different working culture dimensions. The work culture dimensions were named as follows: 1) a work culture focusing on interaction and emotional culture between colleagues, 2) a communal work culture, 3) a work culture that enables learning, and 4) a work culture focused on structures and operating models. Each of these contains detailed elements of the barriers to and prerequisites for psychological safety at work. The results derived from these enlivening methods can be utilized when working with work communities to discuss psychological safety at work. The method itself (MEBS) can also support open discussion and reflection on psychological safety at work despite the sensitivity of the topic, because it allows participants to imagine situations rather than only talk about and share their experiences directly. Additionally, the results of the study can offer a tool or framework for developing psychological safety at work.
Keywords: psychological safety, empathy, empathy-based stories, working life
Procedia PDF Downloads 72
4575 Searching k-Nearest Neighbors to be Appropriate under Gaming Environments
Authors: Jae Moon Lee
Abstract:
In general, algorithms to find continuous k-nearest neighbors have been researched for location-based services, which periodically monitor moving objects such as vehicles and mobile phones. Those studies assume an environment in which the number of query points is much smaller than the number of moving objects and the query points are fixed rather than moving. In gaming environments, this problem arises when computing an object's next movement while considering its neighbors, as in flocking, crowd, and robot simulations. In this case, every moving object becomes a query point, so the number of query points equals the number of moving objects, and the query points themselves are moving. In this paper, we analyze how the existing algorithms, designed for location-based services, perform under gaming environments.
Keywords: flocking behavior, heterogeneous agents, similarity, simulation
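The all-pairs setting described above (every agent is simultaneously a query point and a moving object) can be sketched with a brute-force recomputation per frame; this is an illustrative baseline, not one of the optimized algorithms the paper analyzes:

```python
import math

def k_nearest_neighbors(positions, k):
    """Brute-force kNN for the gaming setting: every moving agent is
    also a query point, so neighbor lists are recomputed each frame.

    positions: list of (x, y) tuples, one per agent.
    Returns neighbors[i]: indices of the k agents closest to agent i."""
    neighbors = []
    for i, p in enumerate(positions):
        dists = sorted(
            (math.dist(p, q), j)
            for j, q in enumerate(positions) if j != i
        )
        neighbors.append([j for _, j in dists[:k]])
    return neighbors

# Four agents on a line; agent 0's two nearest neighbors are 1 and 2.
agents = [(0.0, 0.0), (1.0, 0.0), (2.0, 0.0), (10.0, 0.0)]
print(k_nearest_neighbors(agents, 2)[0])  # [1, 2]
```

This baseline is O(n²) per frame, which is exactly why specialized continuous-kNN structures matter once n grows.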
Procedia PDF Downloads 302
4574 Reversible Information Hiding in Encrypted JPEG Bitstream by LSB Based on Inherent Algorithm
Authors: Vaibhav Barve
Abstract:
Reversible information hiding has drawn a lot of interest recently. Being reversible, the original digital data can be restored completely. It is a scheme in which secret data is stored in digital media such as images, video, or audio to prevent unauthorized access and for security purposes. Generally, a JPEG bitstream is used to store this key data: the JPEG bitstream is first encrypted into a well-organized structure, and then the secret information is embedded into this encrypted region by slightly modifying the JPEG bitstream. Pixels suitable for data embedding are computed, and the key details are embedded accordingly. In our proposed framework, we use the RC4 algorithm to encrypt the JPEG bitstream. The encryption key is supplied by the framework user and is also used at the time of decryption. We implement enhanced least-significant-bit (LSB) substitution steganography using a genetic algorithm. The number of bits to be embedded in a given coefficient is adaptive; by choosing proper parameters, we can obtain high capacity while ensuring high security. We use a logistic map for shuffling the bits and a genetic algorithm (GA) to find the right parameters for the logistic map. A data-embedding key is used at the time of embedding. Using the correct image-encryption and data-embedding keys, the receiver can easily extract the embedded secret data and completely recover both the original image and the original secret information. When the embedding key is absent, the original image can still be recovered approximately, with sufficient quality, without the embedding key.
Keywords: data embedding, decryption, encryption, reversible data hiding, steganography
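The bit-shuffling step described above can be illustrated with a minimal sketch: the logistic map x_{k+1} = r·x_k·(1 − x_k) generates a chaotic sequence whose ranking defines a reversible permutation of the secret bits. The initial value and control parameter below are placeholders; in the paper they are tuned by a genetic algorithm:

```python
def logistic_permutation(n, x0=0.37, r=3.99):
    """Derive a permutation of n bit positions from the logistic map
    x_{k+1} = r * x_k * (1 - x_k); ranking the chaotic values gives
    the shuffle order.  x0 and r are placeholder values here (the
    paper tunes the map parameters with a genetic algorithm)."""
    xs, x = [], x0
    for _ in range(n):
        x = r * x * (1.0 - x)
        xs.append(x)
    return sorted(range(n), key=lambda i: xs[i])

def shuffle_bits(bits, perm):
    return [bits[i] for i in perm]

def unshuffle_bits(shuffled, perm):
    out = [0] * len(perm)
    for pos, i in enumerate(perm):
        out[i] = shuffled[pos]
    return out

# The shuffle is exactly reversible once the map parameters are known.
perm = logistic_permutation(8)
secret = [1, 0, 1, 1, 0, 0, 1, 0]
assert unshuffle_bits(shuffle_bits(secret, perm), perm) == secret
```

Without x0 and r (the embedding-key material in this sketch), an attacker cannot reconstruct the permutation, which is the security argument the abstract relies on.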
Procedia PDF Downloads 288
4573 Soliton Interaction in Multi-Core Optical Fiber: Application to WDM System
Authors: S. Arun Prakash, V. Malathi, M. S. Mani Rajan
Abstract:
The analytical bright two-soliton solution of the 3-coupled nonlinear Schrödinger equations with variable coefficients in birefringent optical fiber is obtained by the Darboux transformation method. To support the design of ultra-fast optical devices, soliton interaction and control in birefringent fiber are investigated. A Lax pair is constructed for the N-coupled NLS system through the AKNS method. Using the two-soliton solution, we demonstrate different interaction behaviors of solitons in birefringent fiber depending on the choice of control parameters. Our results show that interactions of optical solitons have specific applications such as the construction of logic gates, optical computing, soliton switching, and soliton amplification in wavelength division multiplexing (WDM) systems.
Keywords: optical soliton, soliton interaction, soliton switching, WDM
Procedia PDF Downloads 505
4572 Obesity and Cancer: Current Scientific Evidence and Policy Implications
Authors: Martin Wiseman, Rachel Thompson, Panagiota Mitrou, Kate Allen
Abstract:
Since 1997, World Cancer Research Fund (WCRF) International and the American Institute for Cancer Research (AICR) have been at the forefront of synthesising and interpreting the accumulated scientific literature on the link between diet, nutrition, physical activity and cancer, and deriving evidence-based Cancer Prevention Recommendations. The 2007 WCRF/AICR 2nd Expert Report was a landmark in the analysis of evidence linking diet, body weight and physical activity to cancer and led to the establishment of the Continuous Update Project (CUP). In 2018, as part of the CUP, WCRF/AICR will publish a new synthesis of the current evidence and update the Cancer Prevention Recommendations. This will ensure that everyone - from policymakers and health professionals to members of the public - has access to the most up-to-date information on how to reduce the risk of developing cancer. Overweight and obesity play a significant role in cancer risk, and rates of both are increasing in many parts of the world. This session will give an overview of new evidence relating obesity to cancer since the 2007 report. For example, since the 2007 Report, the number of cancers for which obesity is judged to be a contributory cause has increased from seven to eleven. The session will also shed light on the well-established mechanisms underpinning the links between obesity and cancer. Additionally, the session will provide an overview of diet and physical activity related factors that promote a positive energy balance, leading to overweight and obesity. Finally, the session will highlight how policy can be used to address overweight and obesity at a population level, using WCRF International’s NOURISHING Framework. NOURISHING formalises a comprehensive package of policies to promote healthy diets and reduce obesity and non-communicable diseases; it is a tool for policymakers to identify where action is needed and assess if an approach is sufficiently comprehensive.
The framework brings together ten policy areas across three domains: food environment, food system, and behaviour change communication. The framework is accompanied by a regularly updated database providing an extensive overview of implemented government policy actions from around the world. In conclusion, the session will provide an overview of obesity and cancer, highlighting the links seen in the epidemiology and exploring the mechanisms underpinning these, as well as the influences that help determine overweight and obesity. Finally, the session will illustrate policy approaches that can be taken to reduce overweight and obesity worldwide.
Keywords: overweight, obesity, nutrition, cancer, mechanisms, policy
Procedia PDF Downloads 157
4571 Application of Harris Hawks Optimization Metaheuristic Algorithm and Random Forest Machine Learning Method for Long-Term Production Scheduling Problem under Uncertainty in Open-Pit Mines
Authors: Kamyar Tolouei, Ehsan Moosavi
Abstract:
In open-pit mines, the long-term production scheduling optimization problem (LTPSOP) is a complicated problem that involves constraints, large datasets, and uncertainties. Uncertainty in the output is caused by several geological, economic, or technical factors. Due to its dimensions and NP-hard nature, it is usually difficult to find an ideal solution to the LTPSOP. The optimal schedule generally constrains the ore, metal, and waste tonnages, average grades, and cash flows of each period. Past decades have witnessed important advances in long-term production scheduling and optimization algorithms as researchers have become highly cognizant of the issue. Even so, the LTPSOP cannot be considered a well-solved problem. Traditional production scheduling methods in open-pit mines apply an estimated orebody model to produce optimal schedules. The smoothing effect of some geostatistical estimation procedures causes most mine schedules and production predictions to be unrealistic and imperfect. With the expansion of simulation procedures, the risks from grade uncertainty in ore reserves can be evaluated and organized through a set of equally probable orebody realizations. In this paper, to synthesize grade uncertainty into the strategic mine schedule, a stochastic integer programming framework is presented for the LTPSOP. The objective function of the model is to maximize the net present value and simultaneously minimize the risk of deviation from the production targets under grade uncertainty, while satisfying all technical constraints and operational requirements. Instead of applying one estimated orebody model as input to optimize the production schedule, a set of equally probable orebody realizations is applied to synthesize grade uncertainty into the strategic mine schedule and to produce a more profitable and risk-based production schedule.
A mixture of metaheuristic procedures and mathematical methods paves the way to an appropriate solution. This paper introduces a hybrid model combining the augmented Lagrangian relaxation (ALR) method with a metaheuristic algorithm, Harris Hawks optimization (HHO), to solve the LTPSOP under grade uncertainty. In this study, the HHO is employed to update the Lagrange coefficients. In addition, a machine learning method, Random Forest, is applied to estimate the gold grade in a mineral deposit. The Monte Carlo method is used as the simulation method, with 20 realizations. The results indicate that the proposed versions are considerably improved in comparison with the traditional methods. The outcomes were also compared with the ALR-genetic algorithm and ALR-subgradient methods. To demonstrate the applicability of the model, a case study of an open-pit gold mining operation is presented. The framework displays the capability to minimize risk and to improve the expected net present value and financial profitability for the LTPSOP. By considering grade uncertainty in the hybrid model, the framework could control geological risk more effectively than the traditional procedure.
Keywords: grade uncertainty, metaheuristic algorithms, open-pit mine, production scheduling optimization
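The stochastic objective described above (maximize expected discounted value while penalizing deviation from production targets across equally probable realizations) can be sketched as follows; the penalty form and all numbers are illustrative, not the paper's model:

```python
def schedule_value(realizations, schedule, target, discount=0.1, penalty=2.0):
    """Score a candidate schedule against equally probable orebody
    realizations: reward discounted metal value, penalize deviation
    from each period's production target.  Toy values throughout.

    realizations: list of per-period grade lists (one list per realization).
    schedule: tonnes mined per period.  target: metal target per period."""
    total = 0.0
    for grades in realizations:
        for t, (g, tonnes) in enumerate(zip(grades, schedule)):
            metal = g * tonnes
            total += metal / (1.0 + discount) ** t      # discounted value
            total -= penalty * abs(metal - target[t])   # deviation risk
    return total / len(realizations)  # expectation over realizations
```

A stochastic optimizer (the paper uses ALR with HHO) would then search over `schedule` to maximize this score.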
Procedia PDF Downloads 105
4570 Vision-Based Collision Avoidance for Unmanned Aerial Vehicles by Recurrent Neural Networks
Authors: Yao-Hong Tsai
Abstract:
Owing to advances in sensor technology, video surveillance has become the main means of security control in every big city in the world. Surveillance is usually used by governments for intelligence gathering, the prevention of crime, the protection of a process, person, group, or object, or the investigation of crime. Many surveillance systems based on computer vision technology have been developed in recent years. Moving target tracking is the most common task for an Unmanned Aerial Vehicle (UAV): finding and tracking objects of interest in mobile aerial surveillance for civilian applications. This paper focuses on vision-based collision avoidance for UAVs using recurrent neural networks. First, images from cameras on the UAV were fused based on a deep convolutional neural network. Then, a recurrent neural network was constructed to obtain high-level image features for object tracking and to extract low-level image features for noise reduction. The system distributes the computation across local and cloud platforms to efficiently perform object detection, tracking, and collision avoidance based on multiple UAVs. Experiments on several challenging datasets showed that the proposed algorithm outperforms state-of-the-art methods.
Keywords: unmanned aerial vehicle, object tracking, deep learning, collision avoidance
Procedia PDF Downloads 160
4569 Human Activities Recognition Based on Expert System
Authors: Malika Yaici, Soraya Aloui, Sara Semchaoui
Abstract:
Recognition of human activities from sensor data is an active research area, and the main objective is to obtain a high recognition rate. In this work, we propose a recognition system based on expert systems. The proposed system performs recognition based on objects, object states, and gestures, taking into account the context (the location of the objects and of the person performing the activity, the duration of the elementary actions, and the activity itself). This work focuses on complex activities, which are decomposed into simple, easy-to-recognize activities. The proposed method can be applied to any type of activity. The simulation results show the robustness of our system and its decision speed.
Keywords: human activity recognition, ubiquitous computing, context-awareness, expert system
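The recognition idea above (an activity matches when its required objects, object states, gestures, and context facts are all observed) can be sketched as a minimal rule base; the rules and fact vocabulary are invented for illustration, not taken from the paper:

```python
def recognize(facts, rules):
    """Tiny forward-matching sketch of the expert-system idea: an
    activity is recognized when all of its required facts (objects,
    object states, gestures, context) are present in the observed
    fact set."""
    return [name for name, required in rules.items()
            if required <= facts]  # set inclusion: all conditions met

# Illustrative rule base: activity -> set of required facts.
rules = {
    "making coffee": {"object:kettle", "state:kettle_on", "location:kitchen"},
    "reading": {"object:book", "gesture:page_turn"},
}
observed = {"object:kettle", "state:kettle_on", "location:kitchen", "object:cup"}
print(recognize(observed, rules))  # ['making coffee']
```

Complex activities would then be expressed as rules over already-recognized simple activities, matching the decomposition the abstract describes.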
Procedia PDF Downloads 140
4568 An Analysis of Legal and Ethical Implications of Sports Doping in India
Authors: Prathyusha Samvedam, Hiranmaya Nanda
Abstract:
Doping refers to the practice of using drugs or methods that enhance an athlete's performance. It is a problem that occurs on a worldwide scale and compromises the fairness of athletic tournaments. Rules have been created at both the national and international levels to prevent doping. However, these rules sometimes contradict one another, and they may not be effective at prohibiting the use of performance-enhancing drugs (PEDs). This study contends that India's inability to comply with specific Code criteria, as well as its failure to satisfy "best practice" standards established by other countries, demonstrates a lack of uniformity in the implementation of anti-doping regulations and processes among nations. Such challenges have the potential to undermine the validity of the anti-doping system, particularly in developing nations like India. An analysis of the legislative framework governing doping in sports in India is therefore important. To begin, doping in sports is a significant problem that affects the spirit of fair play and sportsmanship, and it has the potential to jeopardize the integrity of the sport itself. In addition, the research has the potential to inform policymakers, sports organizations, and other stakeholders about the current legal framework and how well it discourages doping in athletic competitions. This article is divided into four distinct sections. The first section offers an explanation of what doping is and provides some context about its development over time. The second section investigates the role of anti-doping authorities and the responsibilities they perform. The third section presents case studies and the research methodology employed for the study; finally, the results are presented in the last section.
In conclusion, doping is a severe problem that endangers honest competition within sports.
Keywords: sports law, doping, NADA, WADA, performance enhancing drugs, anti-doping bill 2022
Procedia PDF Downloads 72
4567 Corporate Social Responsibility: An Ethical or a Legal Framework?
Authors: Pouira Askary
Abstract:
Indeed, in our globalized world, which is facing various international crises, transnational corporations and other business enterprises have the capacity to foster economic well-being, development, technological improvement, and wealth, as well as to cause adverse impacts on human rights. The UN Human Rights Council has declared that although the primary responsibility to protect human rights lies with the State, transnational corporations and other business enterprises also have a responsibility to respect and protect human rights within the framework of corporate social responsibility. In 2011, the Human Rights Council endorsed the Guiding Principles on Business and Human Rights, a set of guidelines that define the key duties and responsibilities of States and business enterprises with regard to business-related human rights abuses. In the UN's view, the Guiding Principles do not create new legal obligations but constitute a clarification of the implications of existing standards, including under international human rights law. In 2014, the UN Human Rights Council decided to establish a working group on transnational corporations and other business enterprises, whose mandate is to elaborate an international legally binding instrument to regulate, in international human rights law, the activities of transnational corporations and other business enterprises. It is an extremely difficult task for the working group to codify a legally binding document to regulate the behavior of corporations on the basis of the norms of international law. This paper concentrates on the origins of the human rights applicable to business enterprises. The research argues that the social and ethical roots of CSR are much more institutionalized and elaborated than the legal roots.
Therefore, the first step is to determine whether, and to what extent, corporations have an ethical responsibility to respect human rights and, if so, by which means this ethical and social responsibility is convertible into legal commitments.
Keywords: CSR, ethics, international law, human rights, development, sustainable business
Procedia PDF Downloads 386
4566 An Interactive Institutional Framework for Evolution of Enterprise Technological Innovation Capabilities System: A Complex Adaptive Systems Approach
Authors: Sohail Ahmed, Ke Xing
Abstract:
This research theoretically explores the evolution mechanism of the enterprise technological innovation capability system (ETICS) from the perspective of complex adaptive systems (CAS). It proposes an analytical framework for ETICS, its concepts, and its theory by integrating the CAS methodology into the management of enterprises' technological innovation capability, and discusses how to use the principles of complexity to analyze the composition, evolution, and realization of technological innovation capabilities in complex dynamic environments. This paper introduces the concept and interaction of multi-agents, outlines the theoretical background of CAS, and summarizes the sources of technological innovation, the elements of each subject, and the main clusters of adaptive interactions and innovation activities. The concept of multi-agents is applied through the linkages of enterprises, research institutions, and government agencies with leading enterprises in industrial settings. The study is exploratory and based on CAS theory. The theoretical model is built on technological and innovation literature, from foundational work to state-of-the-art projects of technological enterprises. On this basis, the theoretical model is developed to measure the evolution mechanism of the enterprise's technological innovation capability system. This paper concludes that the main drivers of evolution in technological systems are the enterprise's research and development personnel, investments in technological processes, and innovation resources, which together shape the evolution of enterprise technological innovation performance. The research specifically enriches the application of technological innovation processes in institutional networks related to enterprises.
Keywords: complex adaptive system, echo model, enterprise technological innovation capability system, research institutions, multi-agents
Procedia PDF Downloads 137
4565 Gender Responsiveness of Water, Sanitation Policies and Legal Frameworks at Makerere University
Authors: Harriet Kebirungi, Majaliwa Jackson-Gilbert Mwanjalolo, S. Livingstone Luboobi, Richard Joseph Kimwaga, Consolata Kabonesa
Abstract:
This paper assessed the gender responsiveness of water and sanitation policies and legal frameworks at Makerere University, Uganda. The objectives of the study were to i) examine the gender responsiveness of water and sanitation related policies and frameworks implemented at Makerere University; and ii) assess the challenges faced by the University in customizing national water and sanitation policies and legal frameworks into University policies. A cross-sectional, gender-focused study design was adopted. A checklist was developed to analyze national water and sanitation policies and legal frameworks as well as University-based policies. In addition, primary data were obtained from key informants at the Ministry of Water and Environment and Makerere University. A gender-responsive five-step analytical framework was used to analyze the collected data. Key findings indicated that the policies did not adequately address issues of gender, water, and sanitation, and were consistently gender neutral. The national policy formulation process was found to be gender blind and not backed by a situation analysis of different stakeholders, including higher education institutions such as universities. At Makerere University, owing to the lack of a customized, gender-responsive water and sanitation policy and implementation framework, there were gender differences and deficiencies in access to and utilization of water and sanitation facilities. The University should take advantage of its existing expertise to customize the national water policies and the gender, water and sanitation sub-sector strategy. This will help the University design gender-responsive, culturally acceptable, and environmentally friendly water and sanitation systems that provide adequate facilities addressing the needs and interests of male and female students.
Keywords: gender, Makerere University, policies, water, sanitation
Procedia PDF Downloads 403
4564 A Nonlocal Means Algorithm for Poisson Denoising Based on Information Geometry
Authors: Dongxu Chen, Yipeng Li
Abstract:
This paper presents an information-geometry Nonlocal Means (NLM) algorithm for Poisson denoising. NLM estimates a noise-free pixel as a weighted average of image pixels, where each pixel is weighted according to the similarity between image patches in Euclidean space. In this work, every pixel is modeled as a Poisson distribution locally estimated by Maximum Likelihood (ML), and all the distributions together constitute a statistical manifold. The NLM denoising algorithm is conducted on this statistical manifold, where the Fisher information matrix can be used to compute geodesic distances between distributions, which serve as the similarity between patches. This approach is demonstrated to be competitive with related state-of-the-art methods.
Keywords: image denoising, Poisson noise, information geometry, nonlocal-means
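A minimal sketch of the manifold idea above: for Poisson distributions the Fisher information is 1/λ, which yields the closed-form geodesic distance d(λ1, λ2) = 2|√λ1 − √λ2|, and patch similarity can use this geodesic in place of the Euclidean distance. The filtering parameter and patch layout below are illustrative, not the paper's:

```python
import math

def poisson_geodesic(l1, l2):
    """Fisher-metric geodesic distance between Poisson(l1) and
    Poisson(l2).  The Fisher information of a Poisson distribution is
    1/lambda, which gives the closed form d = 2 * |sqrt(l1) - sqrt(l2)|."""
    return 2.0 * abs(math.sqrt(l1) - math.sqrt(l2))

def nlm_pixel(patches, center_idx, h=1.0):
    """Estimate one noise-free pixel as a weighted average of patch
    centers, with weights from the geodesic (not Euclidean) patch
    distance.  A 1D sketch; h is an illustrative filtering parameter."""
    ref = patches[center_idx]
    num = den = 0.0
    for patch in patches:
        d2 = sum(poisson_geodesic(a, b) ** 2 for a, b in zip(ref, patch))
        w = math.exp(-d2 / (h * h))
        num += w * patch[len(patch) // 2]  # contribute the patch center
        den += w
    return num / den

print(poisson_geodesic(4.0, 9.0))  # 2.0
```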
Procedia PDF Downloads 285
4563 Incorporating Lexical-Semantic Knowledge into Convolutional Neural Network Framework for Pediatric Disease Diagnosis
Authors: Xiaocong Liu, Huazhen Wang, Ting He, Xiaozheng Li, Weihan Zhang, Jian Chen
Abstract:
The utilization of electronic medical record (EMR) data to establish disease diagnosis models has become an important research topic in biomedical informatics. Deep learning can automatically extract features from massive data, which has brought about breakthroughs in the study of EMR data. The challenge is that deep learning lacks semantic knowledge, which limits its practicality in medical science. This research proposes a method of incorporating lexical-semantic knowledge from abundant entities into a convolutional neural network (CNN) framework for pediatric disease diagnosis. Firstly, medical terms are vectorized into Lexical Semantic Vectors (LSV), which are concatenated with the embedded word vectors of word2vec to enrich the feature representation. Secondly, the semantic distribution of medical terms serves as a Semantic Decision Guide (SDG) for the optimization of deep learning models. The study evaluates the performance of the LSV-SDG-CNN model on four Chinese EMR datasets. Additionally, CNN, LSV-CNN, and SDG-CNN are designed as baseline models for comparison. The experimental results show that the LSV-SDG-CNN model outperforms the baseline models on all four Chinese EMR datasets. The best configuration of the model yielded an F1 score of 86.20%. The results clearly demonstrate that the CNN has been effectively guided and optimized by lexical-semantic knowledge, and the LSV-SDG-CNN model improves disease classification accuracy by a clear margin.
Keywords: convolutional neural network, electronic medical record, feature representation, lexical semantics, semantic decision
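The feature-enrichment step described above can be sketched as a simple concatenation of a term's word2vec embedding with its LSV; the dimensions and vocabulary below are toy values, not the paper's:

```python
def concat_features(word_vec, lsv):
    """Concatenate a word2vec embedding with a term's Lexical Semantic
    Vector (LSV) so the CNN input carries both distributional and
    lexical-semantic features.  Dimensions and vocabulary are toy values."""
    return word_vec + lsv

w2v = {"fever": [0.2, -0.1, 0.5]}   # 3-dim word2vec embedding (toy)
lsv = {"fever": [1.0, 0.0]}         # 2-dim lexical semantic vector (toy)
x = concat_features(w2v["fever"], lsv["fever"])
print(len(x))  # 5: the enriched representation fed to the CNN
```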
Procedia PDF Downloads 126
4562 RV-YOLOX: Object Detection on Inland Waterways Based on Optimized YOLOX Through Fusion of Vision and 3+1D Millimeter Wave Radar
Authors: Zixian Zhang, Shanliang Yao, Zile Huang, Zhaodong Wu, Xiaohui Zhu, Yong Yue, Jieming Ma
Abstract:
Unmanned Surface Vehicles (USVs) are valuable due to their ability to perform dangerous and time-consuming tasks on the water. Object detection is significant in these applications. However, inherent challenges, such as the complex distribution of obstacles, reflections from shore structures, and water-surface fog, hinder the object detection performance of USVs. To address these problems, this paper provides a fusion method for USVs to effectively detect objects in the inland-surface environment, utilizing vision sensors and 3+1D millimeter-wave (MMW) radar. MMW radar is complementary to vision sensors, providing robust environmental information. The radar 3D point cloud is transformed into a 2D radar pseudo-image by a point transformer, unifying the radar and vision information formats. We propose a multi-source object detection network (RV-YOLOX) based on radar-vision fusion for the inland waterways environment. The performance is evaluated on our self-recorded waterways dataset. Compared with the YOLOX network, our fusion network significantly improves detection accuracy, especially for objects under poor lighting conditions.
Keywords: inland waterways, YOLO, sensor fusion, self-attention
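The radar-to-image step above can be sketched as a simple bird's-eye projection of 3+1D points (x, y, z, doppler) onto a 2D grid; the real system uses a point transformer, so this is only a format-unification illustration with invented grid parameters:

```python
def radar_pseudo_image(points, grid=(8, 8), cell=1.0):
    """Project 3+1D radar points (x, y, z, doppler) onto a 2D grid so
    the radar branch matches the camera's image format; each cell
    accumulates doppler magnitude.  A simplified stand-in for the
    point-transformer step described in the abstract."""
    img = [[0.0] * grid[1] for _ in range(grid[0])]
    for x, y, z, v in points:
        i, j = int(y // cell), int(x // cell)  # bird's-eye cell indices
        if 0 <= i < grid[0] and 0 <= j < grid[1]:
            img[i][j] += abs(v)
    return img

# Two returns from the same obstacle land in the same cell.
pts = [(1.5, 2.5, 0.0, -3.0), (1.2, 2.7, 0.1, 1.0)]
img = radar_pseudo_image(pts)
print(img[2][1])  # 4.0 (|-3.0| + |1.0|)
```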
Procedia PDF Downloads 124
4561 Fuzzy Optimization for Identifying Anticancer Targets in Genome-Scale Metabolic Models of Colon Cancer
Authors: Feng-Sheng Wang, Chao-Ting Cheng
Abstract:
Developing a drug from conception to launch is costly and time-consuming. Computer-aided methods can reduce research costs and accelerate the development process during the early drug discovery and development stages. This study developed a fuzzy multi-objective hierarchical optimization framework for identifying potential anticancer targets in a metabolic model. First, RNA-seq expression data of colorectal cancer samples and their healthy counterparts were used to reconstruct tissue-specific genome-scale metabolic models. The aim of the optimization framework was to identify anticancer targets that lead to cancer cell death and to evaluate the metabolic flux perturbations in normal cells caused by cancer treatment. Four objectives were established in the optimization framework: to evaluate the mortality of cancer cells under treatment, and to minimize side effects causing toxicity-induced tumorigenesis in normal cells as well as metabolic perturbations. Through fuzzy set theory, the multi-objective optimization problem was converted into a trilevel maximizing decision-making (MDM) problem. Nested hybrid differential evolution was applied to solve the trilevel MDM problem using two nutrient media, in order to identify anticancer targets in the genome-scale metabolic model of colorectal cancer. Using Dulbecco's Modified Eagle Medium (DMEM), the computational results reveal that the identified anticancer targets were mostly involved in cholesterol biosynthesis, pyrimidine and purine metabolism, the glycerophospholipid biosynthetic pathway, and the sphingolipid pathway. However, using Ham's medium, the genes involved in cholesterol biosynthesis were unidentifiable. A comparison of the uptake reactions for DMEM and Ham's medium revealed that no cholesterol uptake reaction was included in DMEM.
Two additional media, i.e., DMEM with a cholesterol uptake reaction included and Ham's medium with it excluded, were used to investigate the relationship of tumor cell growth with nutrient components and anticancer target genes. The genes involved in cholesterol biosynthesis were revealed to be identifiable when no cholesterol uptake reaction was induced in the culture medium; however, these genes became unidentifiable when such a reaction was induced.
Keywords: cancer metabolism, genome-scale metabolic model, constraint-based model, multilevel optimization, fuzzy optimization, hybrid differential evolution
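The fuzzy-set conversion described above can be sketched with the classic max-min (maximizing decision) formulation: each objective gets a membership function, a candidate's overall satisfaction is its minimum membership, and the decision maximizes that minimum. This is a single-level sketch of the underlying idea, not the paper's trilevel MDM formulation, and the membership shapes and bounds are illustrative:

```python
def membership(value, worst, best):
    """Linear membership: 0 at the worst acceptable value, 1 at the best."""
    if best == worst:
        return 1.0
    return max(0.0, min(1.0, (value - worst) / (best - worst)))

def fuzzy_decision(candidates, objectives):
    """Max-min (maximizing decision) rule: a candidate's overall
    satisfaction is its minimum membership across all objectives,
    and the decision maximizes that minimum.

    objectives: list of (evaluate_fn, worst, best) triples."""
    def score(c):
        return min(membership(f(c), worst, best)
                   for f, worst, best in objectives)
    return max(candidates, key=score)

# Toy trade-off between two conflicting objectives: c and 10 - c.
objectives = [(lambda c: c, 0.0, 10.0), (lambda c: 10.0 - c, 0.0, 10.0)]
print(fuzzy_decision([0, 5, 10], objectives))  # 5
```

The balanced candidate wins because maximizing the worst-satisfied objective rewards compromise, which is why fuzzy MDM suits conflicting goals like cancer-cell killing versus normal-cell perturbation.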
Procedia PDF Downloads 80
4560 The Use of the Matlab Software as the Best Way to Recognize Penumbra Region in Radiotherapy
Authors: Alireza Shayegan, Morteza Amirabadi
Abstract:
The γ tool was developed to quantitatively compare dose distributions, either measured or calculated. Before computing γ, the dose and distance scales of the two distributions, referred to as evaluated and reference, are re-normalized by dose and distance criteria, respectively. The re-normalization allows the dose distribution comparison to be conducted simultaneously along the dose and distance axes. Several two-dimensional images were acquired using a Scanning Liquid Ionization Chamber EPID and Extended Dose Range (EDR2) films for regular and irregular radiation fields. The raw images were then converted into two-dimensional dose maps. Translational and rotational manipulations were performed on the images using Matlab software. The manipulated maps, taken as the evaluated dose distributions, were then compared with the corresponding original dose maps, taken as the reference distributions.
Keywords: energetic electron, gamma function, penumbra, Matlab software
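The γ comparison combines a dose-difference criterion and a distance-to-agreement (DTA) criterion into a single index: for each reference point, γ is the minimum over evaluated points of sqrt((Δr/DTA)² + (ΔD/ΔD_crit)²), and γ ≤ 1 counts as a pass. A one-dimensional, brute-force sketch follows; the criterion values are illustrative assumptions, not those of the study.

```python
import math

def gamma_index(ref, evaluated, dose_crit=0.03, dist_crit=3.0, spacing=1.0):
    """1-D gamma index of an evaluated dose profile against a reference.

    ref, evaluated : dose values sampled on the same grid
    dose_crit      : dose-difference criterion, fraction of max reference dose
    dist_crit      : distance-to-agreement criterion in mm
    spacing        : grid spacing in mm
    Returns one gamma value per reference point (gamma <= 1 is a pass).
    """
    dd = dose_crit * max(ref)  # absolute dose criterion
    gammas = []
    for i, d_ref in enumerate(ref):
        best = float("inf")
        for j, d_ev in enumerate(evaluated):
            dist = (j - i) * spacing
            best = min(best, (dist / dist_crit) ** 2 + ((d_ev - d_ref) / dd) ** 2)
        gammas.append(math.sqrt(best))
    return gammas
```

Identical profiles yield γ = 0 everywhere; the pass rate is simply the fraction of points with γ ≤ 1.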
Procedia PDF Downloads 301
4559 Empowering Certificate Management with Blockchain Technology
Authors: Yash Ambekar, Kapil Vhatkar, Prathamesh Swami, Kartikey Singh, Yashovardhan Kaware
Abstract:
The rise of online courses and certifications has created new opportunities for individuals to enhance their skills. However, this digital transformation has also given rise to counterfeit certificates. To address this multifaceted issue, we present a comprehensive certificate management system founded on blockchain technology and strengthened by smart contracts. Our system comprises three pivotal components: certificate generation, authenticity verification, and a user-centric digital locker for certificate storage. Blockchain technology underpins the entire system, ensuring the immutability and integrity of each certificate. The inclusion of a cryptographic hash for each certificate is a fundamental aspect of our design: any alteration in the certificate’s data will yield a distinct hash, a powerful indicator of potential tampering. Furthermore, our system includes a secure, cloud-based digital locker that empowers users to efficiently manage and access all their certificates in one place. The project also provides features for certificate revocation and updating, enhancing the system’s flexibility and security. Hence, the blockchain and smart contract-based certificate management system offers a robust, one-stop solution to the escalating problem of counterfeit certificates in the digital era.
Keywords: blockchain technology, smart contracts, counterfeit certificates, authenticity verification, cryptographic hash, digital locker
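The tamper-evidence property described above rests on an ordinary cryptographic hash: serialize the certificate's fields deterministically, store the digest, and let verification recompute and compare. A minimal sketch with Python's standard library follows; SHA-256 and the field names are assumptions for illustration, as the abstract does not specify them.

```python
import hashlib
import json

def certificate_hash(cert: dict) -> str:
    """Deterministic SHA-256 fingerprint of a certificate record.
    Keys are sorted so the same data always yields the same digest."""
    canonical = json.dumps(cert, sort_keys=True, separators=(",", ":"))
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()

def verify(cert: dict, stored_hash: str) -> bool:
    """True if the certificate data still matches the stored digest."""
    return certificate_hash(cert) == stored_hash
```

In the full system the digest would live on-chain while the certificate document sits in the digital locker; any edit to the record changes the digest and fails verification.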
Procedia PDF Downloads 46
4558 A Multi-Cluster Enterprise Framework for Evolution of Knowledge System among Enterprises, Governments and Research Institutions
Authors: Sohail Ahmed, Ke Xing
Abstract:
This research theoretically explores the evolution mechanism of the enterprise technological innovation capability system (ETICS) from the perspective of complex adaptive systems (CAS). Starting from CAS theory, the study proposes an analytical framework for ETICS, its concepts and theory, by integrating CAS methodology into the management of enterprises’ technological innovation capability, and discusses how the principles of complexity can be used to analyze the composition, evolution and realization of technological innovation capabilities in a complex, dynamic environment. The paper introduces the concept and interaction of multiple agents, reviews the theoretical background of CAS, and summarizes the sources of technological innovation, the elements of each subject, and the main clusters of adaptive interactions and innovation activities. The multi-agent concept is applied through the linkages of enterprises, research institutions and government agencies with leading enterprises in industrial settings. The study was exploratory and grounded in CAS theory. A theoretical model is built by drawing on the technological innovation literature, from foundational work to state-of-the-art projects of technological enterprises, and is then developed to measure the evolution mechanism of the enterprise technological innovation capability system. The paper concludes that the main drivers of evolution in technological systems are an enterprise’s research and development personnel, investments in technological processes, and innovation resources, which together are responsible for the evolution of enterprise technological innovation performance. The research specifically enriches the application of technological innovation in institutional networks related to enterprises.
Keywords: complex adaptive system, echo model, enterprise knowledge system, research institutions, multi-agents
Procedia PDF Downloads 69
4557 Stochastic Pi Calculus in Financial Markets: An Alternate Approach to High Frequency Trading
Authors: Jerome Joshi
Abstract:
The paper presents the modelling of financial markets using the Stochastic Pi Calculus model. Stochastic Pi Calculus is mainly used for biological applications; however, its features promote its use in financial markets, most prominently in high frequency trading. The trading system can be broadly classified into the exchange, market makers or intermediary traders, and fundamental traders. The exchange is where the action of the trade is executed, and the two types of traders act as market participants in the exchange. High frequency trading, with its complex networks and numerous market participants (intermediary and fundamental traders), poses a difficulty for modelling. Participants exploit complex trading algorithms and high execution speeds to carry out large volumes of trades. To earn profits from each trade, a trader must be at the top of the order book quite frequently, executing or processing multiple trades simultaneously. This requires highly automated systems as well as the right sentiment to outperform other traders. However, always being at the top of the book is not best for the trader either, since this behaviour caused the outbreak of the ‘Hot-Potato Effect’, which in turn demands a better and more efficient model. The model chosen should be flexible and have diverse applications, so a model proven in a field characterized by similar difficulty is preferable. It should also be flexible in simulation, so that it can be extended and adapted for future research, and equipped with tools suited to the field of finance. On these grounds, the Stochastic Pi Calculus model seems an ideal fit for financial applications, owing to its track record in biology.
It is an extension of the original Pi Calculus model and acts as a solution and an alternative to the previously flawed algorithm, provided the application of this model is further extended. The model focuses on solving the problem that led to the ‘Flash Crash’, namely the ‘Hot-Potato Effect’. It consists of small sub-systems that can be integrated to form a large system, and it is designed in such a way that the behaviour of ‘noise traders’ is treated as a random process, or noise, in the system. While modelling, to get a better understanding of the problem, a broader picture is taken into consideration, covering the trader, the system, and the market participants. The paper goes on to explain trading in exchanges, types of traders, high frequency trading, the ‘Flash Crash’, the ‘Hot-Potato Effect’, evaluation of orders, and time delay in further detail. Future work needs to focus on calibrating the modules so that they interact correctly with one another. With its application extended, this model would provide a basis for further research in the fields of finance and computing.
Keywords: concurrent computing, high frequency trading, financial markets, stochastic pi calculus
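Stochastic Pi Calculus models are typically executed with a Gillespie-style simulation: each communication channel fires after an exponentially distributed waiting time proportional to its rate. As a toy illustration only (the channel names and rates are invented, and this is far simpler than the sub-systems described above), buy and sell order arrivals can be simulated as two competing channels.

```python
import random

def simulate_market(rates, t_end=10.0, seed=42):
    """Toy Gillespie simulation of competing 'channels' (here, buy and
    sell order arrivals) with exponential waiting times.
    rates: mapping channel name -> firing rate.
    Returns the event log as a list of (time, channel) pairs."""
    rng = random.Random(seed)
    channels = list(rates)
    weights = [rates[c] for c in channels]
    total = sum(weights)
    t, log = 0.0, []
    while True:
        t += rng.expovariate(total)  # waiting time to the next event
        if t > t_end:
            break
        # Pick which channel fired, proportionally to its rate
        log.append((t, rng.choices(channels, weights=weights)[0]))
    return log
```

A full model would attach continuations to each firing (order matching, queue position updates), which is where the compositional sub-systems of the calculus come in.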
Procedia PDF Downloads 77
4556 Configuring Systems to Be Viable in a Crisis: The Role of Intuitive Decision-Making
Authors: Ayham Fattoum, Simos Chari, Duncan Shaw
Abstract:
Volatile, uncertain, complex, and ambiguous (VUCA) conditions threaten systems’ viability with emerging and novel events requiring immediate and localized responses. Such responsiveness is only possible through devolved freedom and emancipated decision-making. The Viable System Model (VSM) recognizes this need and suggests maximizing autonomy to localize decision-making and minimize residual complexity. However, exercising delegated autonomy under VUCA requires confidence and knowledge to use intuition, and guidance to maintain systemic coherence. This paper explores the role of intuition as an enabler of emancipated decision-making and autonomy under VUCA. Intuition allows decision-makers to use their knowledge and experience to respond rapidly to novel events. This paper offers three contributions to VSM. First, it designs a system model that illustrates the role of intuitive decision-making in managing complexity and maintaining viability. Second, it takes a black-box approach to theory development in VSM to model the role of autonomy and intuition. Third, the study uses a multi-stage discovery-oriented approach (DOA) to develop theory, with each stage combining literature, data analysis, and model/theory development, and identifying further questions for the subsequent stage. We synthesize literature (e.g., VSM, complexity management) with seven months of field-based insights (interviews, workshops, and observation of a live disaster exercise) to develop an intuitive complexity management framework and VSM models. The results have practical implications for enhancing the resilience of organizations and communities.
Keywords: intuition, complexity management, decision-making, viable system model
Procedia PDF Downloads 67
4555 The Role of Privatization on the Formulation of Productive Supply Chain: The Case of Ethiopian Firms
Authors: Merhawit Fisseha Gebremariam, Yohannes Yebabe Tesfay
Abstract:
This study focuses on the formulation of a sustainable, effective, and efficient supply chain strategy framework for privatized Ethiopian firms. The study examined the role of privatization in productive sourcing, production, and delivery in Ethiopian firms’ performance. To test our hypotheses, the authors applied the concepts of Key Performance Indicators (KPIs), strategic outsourcing, purchasing portfolio analysis, and Porter’s marketing analysis. The authors selected ten privatized companies and compared their financial, market expansion, and sustainability performance. A Chi-Square Test showed that, at the 5% level of significance, privatization and outsourcing activities can improve the business performance of Ethiopian firms in terms of product promotion and new market expansion. At the 5% level of significance, an independent t-test showed that firms privatized by Ethiopian investors had stronger financial performance than those privatized by foreign investors. Furthermore, Ethiopian firms would benefit from applying both cost leadership and differentiation strategies to thrive in their business areas, and they need to implement the supply chain operations reference (SCOR) model as a framework that supports communication, links supply chain partners, and enhances productivity. The government of Ethiopia should be aware that the privatization of firms by Ethiopian investors will strengthen the economy; otherwise, the privatization process will be risky for the country, and such activities should be stopped.
Keywords: correlation analysis, market strategies, KPIs, privatization, risk, Ethiopia
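The chi-square test of independence used above compares observed cell counts with the counts expected under independence, then checks the statistic against the critical value at the chosen significance level (3.841 for one degree of freedom at 5%). A pure-Python sketch for a contingency table follows; the counts are invented for illustration, not the study's data.

```python
def chi_square_stat(table):
    """Pearson chi-square statistic for an r x c contingency table,
    given as a list of rows of observed counts."""
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    total = sum(row_totals)
    stat = 0.0
    for i, row in enumerate(table):
        for j, observed in enumerate(row):
            expected = row_totals[i] * col_totals[j] / total
            stat += (observed - expected) ** 2 / expected
    return stat

# Hypothetical 2x2 table: outsourcing (yes/no) vs. market expansion (yes/no)
stat = chi_square_stat([[30, 10], [15, 25]])
significant = stat > 3.841  # 5% critical value for 1 degree of freedom
```

For a 2x2 table the degrees of freedom are (2-1)(2-1) = 1, so exceeding 3.841 rejects independence at the 5% level.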
Procedia PDF Downloads 68
4554 Language Development and Learning about Violence
Authors: Karen V. Lee
Abstract:
The background and significance of this study involve research about a music teacher discovering how language development and learning can help her overcome the harmful and lasting consequences of sexual violence. Education about intervention resources grounded in language development helps her cope with consequences influencing her career as a teacher. The basic methodology involves qualitative research as the theoretical framework, in which the author is drawn into a deep, storied reflection on political issues surrounding teachers who need to overcome the social, psychological, and health risk behaviors stemming from violence. Sub-themes involve available learning resources that ensure teachers receive social, emotional, physical, spiritual, and intervention support evoking visceral, emotional responses from the audience. The major findings share how language development and learning provide helpful resources to victims of violence. It is hoped the research dramatizes an episodic yet incomplete story that highlights the circumstances surrounding the protagonist’s life. In conclusion, the research adopts a reflexive, storied framework that engages with the harmful and lasting consequences of sexual violence. The reflexive story of the sensory experience critically seeks verisimilitude by evoking lifelike and believable feelings in others. Thus, the scholarly importance of using language development and learning as intervention resources lies in their transformative aspects, which contribute to social change. Overall, the circumstance surrounding the story of sexual violence is not uncommon in society. Language development and learning support the moral mission to help teachers overcome sexual violence that socially impacts their professional lives as victims.
Keywords: intervention, language development and learning, sexual violence, story
Procedia PDF Downloads 331
4553 Evaluation of Easy-to-Use Energy Building Design Tools for Solar Access Analysis in Urban Contexts: Comparison of Friendly Simulation Design Tools for Architectural Practice in the Early Design Stage
Abstract:
The current building sector is focused on the reduction of energy requirements, on renewable energy generation, and on the regeneration of existing urban areas. These targets need to be addressed with a systemic approach that considers several aspects simultaneously, such as climate conditions, lighting conditions, solar radiation, and PV potential. Solar access analysis is a well-known method of analyzing solar potential, but in recent years simulation tools have provided more effective opportunities to perform this type of analysis, particularly in the early design stage. Nowadays, the study of solar access depends on how rapidly and easily simulation tools can be used during the design process. This study presents a comparison of three simulation tools from the point of view of the user, with the aim of highlighting differences in their ease of use. Using a real urban context as a case study, three tools, Ecotect, Townscope and Heliodon, are tested by building models, running simulations, and examining the capabilities and output of their solar access analysis. The evaluation of the ease of use of these tools is based on a set of parameters and features, such as the types of simulation, input data requirements, and types of results. As a result, a framework is provided in which the features and capabilities of each tool are shown, highlighting the differences among the tools in functions, features and capabilities. The aim of this study is to support users and to improve the integration of solar access simulation tools into the design process.
Keywords: energy building design tools, solar access analysis, solar potential, urban planning
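Behind every solar access tool sits the same solar-position geometry: the sun's altitude follows from site latitude, solar declination, and hour angle via sin(alt) = sin(lat)sin(δ) + cos(lat)cos(δ)cos(H). A minimal sketch of that relation follows; it is a single-point calculation for illustration, not a replacement for the full shading analysis these tools perform.

```python
import math

def solar_altitude(lat_deg, decl_deg, hour_angle_deg):
    """Solar altitude angle in degrees from site latitude, solar
    declination, and hour angle (H = 0 at solar noon), using
    sin(alt) = sin(lat)sin(decl) + cos(lat)cos(decl)cos(H)."""
    lat, decl, h = (math.radians(x) for x in (lat_deg, decl_deg, hour_angle_deg))
    s = (math.sin(lat) * math.sin(decl)
         + math.cos(lat) * math.cos(decl) * math.cos(h))
    return math.degrees(math.asin(s))
```

A solar access tool evaluates this position for every hour of the year and intersects the resulting sun vectors with the urban geometry to find obstruction.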
Procedia PDF Downloads 340
4552 On the Existence of Homotopic Mapping Between Knowledge Graphs and Graph Embeddings
Authors: Jude K. Safo
Abstract:
Knowledge Graphs (KG) and their relation to Graph Embeddings (GE) represent a unique data structure in the landscape of machine learning (relative to image, text and acoustic data). Unlike the latter, GEs are the only data structure sufficient for representing the hierarchically dense, semantic information needed for use-cases like supply chain data and protein folding, where the search space exceeds the limits of traditional search methods (e.g., PageRank, Dijkstra, etc.). While GEs are effective for compressing low-rank tensor data, at scale they begin to introduce a new problem of ‘data retrieval’, which we observe in Large Language Models. Notable attempts such as TransE, TransR and other prominent industry standards have shown peak performance just north of 57% on the WN18 and FB15K benchmarks, insufficient for practical industry applications. They are also limited in scope to next-node/link prediction. Traditional linear methods like Tucker, CP, PARAFAC and CANDECOMP quickly hit memory limits on tensors exceeding 6.4 million nodes. This paper outlines a topological framework for linear mappings between concepts in KG space and GE space that preserve cardinality. Most importantly, we introduce a traceable framework for composing dense linguistic structures, and we demonstrate the model’s performance on the WN18 benchmark. The model does not rely on Large Language Models (LLM), though the applications are certainly relevant there as well.
Keywords: representation theory, large language models, graph embeddings, applied algebraic topology, applied knot theory, combinatorics
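For reference, the TransE baseline mentioned above scores a triple (h, r, t) by how closely the tail embedding matches head plus relation, i.e., -||h + r - t||; link prediction then ranks candidate tails by this score. A minimal sketch follows, using toy two-dimensional embeddings invented for illustration.

```python
import math

def transe_score(h, r, t):
    """TransE plausibility score: negative L2 distance ||h + r - t||.
    Scores closer to 0 indicate a more plausible (head, relation, tail)."""
    return -math.sqrt(sum((a + b - c) ** 2 for a, b, c in zip(h, r, t)))

def rank_tails(h, r, candidates):
    """Rank candidate tail embeddings for link prediction, best first."""
    return sorted(candidates, key=lambda t: transe_score(h, r, t), reverse=True)
```

Benchmark metrics such as hits@10 on WN18 simply count how often the true tail lands near the top of this ranking over all test triples.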
Procedia PDF Downloads 68
4551 Educational Robotics with Easy Implementation and Low Cost
Authors: Maria R. A. R. Moreira, Francisco R. O. Da Silva, André O. A. Fontenele, Érick A. Ribeiro
Abstract:
This article deals with the influence of technology in education, presenting educational robotics as a pedagogical method for knowledge building. We propose the development and implementation of four robot models that can be used for teaching purposes across mechatronics, mechanics, electronics and computing, making them effective vehicles for learning other sciences and theories. One of the main reasons for adopting the developed educational kits is their low cost, which makes them accessible to a greater number of educational institutions. The technology supports the dissemination of knowledge through classroom experiments, so that pedagogical robotics promotes understanding, practice, solution-finding and critique of classroom challenges. We also present the relationship between education, science, technology and society through educational robotics, treated as an incentive toward technological careers.
Keywords: education, mechatronics, robotics, technology
Procedia PDF Downloads 383
4550 Ontology-Driven Knowledge Discovery and Validation from Admission Databases: A Structural Causal Model Approach for Polytechnic Education in Nigeria
Authors: Bernard Igoche Igoche, Olumuyiwa Matthew, Peter Bednar, Alexander Gegov
Abstract:
This study presents an ontology-driven approach for knowledge discovery and validation from admission databases in Nigerian polytechnic institutions. The research aims to address the challenges of extracting meaningful insights from vast amounts of admission data and utilizing them for decision-making and process improvement. The proposed methodology combines the knowledge discovery in databases (KDD) process with a structural causal model (SCM) ontological framework. The admission database of Benue State Polytechnic Ugbokolo (Benpoly) is used as a case study. The KDD process is employed to mine and distill knowledge from the database, while the SCM ontology is designed to identify and validate the important features of the admission process. The SCM validation is performed using conditional independence test (CIT) criteria, and an algorithm is developed to implement the validation process. The identified features are then used for machine learning (ML) modelling and prediction of admission status. The results demonstrate the adequacy of the SCM ontological framework in representing the admission process and the high predictive accuracy achieved by the ML models, with k-nearest neighbors (KNN) and support vector machine (SVM) achieving 92% accuracy. The study concludes that the proposed ontology-driven approach contributes to the advancement of educational data mining and provides a foundation for future research in this domain.
Keywords: admission databases, educational data mining, machine learning, ontology-driven knowledge discovery, polytechnic education, structural causal model
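Of the two best-performing classifiers, k-nearest neighbors is simple enough to sketch directly: predict the majority label among the k training records closest to the query. The sketch below is a generic illustration on toy points, not the study's pipeline, and the feature values and labels are invented.

```python
import math
from collections import Counter

def knn_predict(train, query, k=3):
    """train: list of (feature_tuple, label) pairs.
    Returns the majority label among the k nearest neighbours of query."""
    nearest = sorted(train, key=lambda pair: math.dist(pair[0], query))[:k]
    votes = Counter(label for _, label in nearest)
    return votes.most_common(1)[0][0]
```

In a pipeline like the one described above, the feature tuples would be the SCM-validated admission features rather than raw coordinates, and k would be chosen by cross-validation.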
Procedia PDF Downloads 64
4549 Interaction between Space Syntax and Agent-Based Approaches for Vehicle Volume Modelling
Authors: Chuan Yang, Jing Bie, Panagiotis Psimoulis, Zhong Wang
Abstract:
Modelling and understanding vehicle volume distribution over the urban network are essential for urban design and transport planning. The space syntax approach has been widely applied as the main conceptual and methodological framework for contemporary vehicle volume models, with the help of the statistical method of multiple regression analysis (MRA). However, the MRA model with space syntax variables is limited in predicting vehicle volume because it cannot account for the crossed effect of urban configurational characteristics and socio-economic factors. The aim of this paper is to construct models that capture the combined impact of street network structure and socio-economic factors. We present a multilevel linear (ML) and an agent-based (AB) vehicle volume model at the urban scale, both interacting with the space syntax theoretical framework. The ML model allows random effects of urban configurational characteristics across different urban contexts, and the AB model incorporates transformed space syntax components of the MRA models into the agents’ spatial behaviour. The three models were implemented in the same urban environment. The ML model exhibits superiority over the original MRA model in identifying the relative impacts of the configurational characteristics and macro-scale socio-economic factors that shape vehicle movement distribution over the city. Compared with the ML model, the suggested AB model is able to estimate vehicle volume in the urban network while considering the combined effects of configurational characteristics and land-use patterns at the street segment level.
Keywords: space syntax, vehicle volume modelling, multilevel model, agent-based model
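The MRA baseline that the ML and AB models are compared against is ordinary multiple regression: solve the normal equations (XᵀX)β = Xᵀy for the coefficients. A self-contained sketch follows, fitted on synthetic data; the real models use space syntax and socio-economic predictors, which are not reproduced here.

```python
def ols_fit(X, y):
    """Ordinary least squares via the normal equations (X^T X) b = X^T y.
    X: list of sample rows (include a leading 1.0 for the intercept).
    y: list of observed targets. Returns the coefficient vector."""
    k = len(X[0])
    # Build X^T X and X^T y
    A = [[sum(row[i] * row[j] for row in X) for j in range(k)] for i in range(k)]
    b = [sum(row[i] * yi for row, yi in zip(X, y)) for i in range(k)]
    # Gaussian elimination with partial pivoting
    for col in range(k):
        piv = max(range(col, k), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for row in range(col + 1, k):
            factor = A[row][col] / A[col][col]
            for j in range(col, k):
                A[row][j] -= factor * A[col][j]
            b[row] -= factor * b[col]
    # Back-substitution
    coef = [0.0] * k
    for i in range(k - 1, -1, -1):
        coef[i] = (b[i] - sum(A[i][j] * coef[j] for j in range(i + 1, k))) / A[i][i]
    return coef
```

A multilevel extension would additionally let selected coefficients vary by urban context (random effects), which is exactly the limitation of this flat model that the paper addresses.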
Procedia PDF Downloads 145
4548 Analytical Description of Disordered Structures in Continuum Models of Pattern Formation
Authors: Gyula I. Tóth, Shaho Abdalla
Abstract:
Even though numerical simulations have a significant precursory and supportive role in exploring the disordered phase, which displays no long-range order in pattern formation models, studying the stability properties of this phase and determining the order of the ordered-disordered phase transition in these models necessitate an analytical description of the disordered phase. First, we present the results of a comprehensive statistical analysis of a large number (1,000-10,000) of numerical simulations of the Swift-Hohenberg model, in which the bulk disordered (or amorphous) phase is stable. We show that the average free energy density (over configurations) converges, while the variance of the energy density vanishes with increasing system size, suggesting that the disordered phase is a thermodynamic phase (i.e., its properties are independent of the configuration in the macroscopic limit). Furthermore, structural analysis of this phase in Fourier space suggests that it can be modelled by a colored isotropic Gaussian noise, where any instant of the noise describes a possible configuration. Based on these results, we developed a general mathematical framework for finding a pool of solutions to partial differential equations in the sense of a continuous probability measure, which we present briefly. Applying the general idea to the Swift-Hohenberg model, we show that the amorphous phase can be found and its properties determined analytically. As the general mathematical framework is not restricted to continuum theories, we hope that the proposed methodology will open a new chapter in the study of disordered phases.
Keywords: fundamental theory, mathematical physics, continuum models, analytical description
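The Swift-Hohenberg model referenced above evolves a field u by ∂u/∂t = r u - (1 + ∇²)² u - u³; order or disorder emerges depending on r and the initial noise. A deliberately minimal one-dimensional explicit Euler step with periodic boundaries is sketched below; the parameter values are illustrative, and a production study would use spectral methods with far finer control.

```python
def laplacian(u, dx=1.0):
    """Second-order periodic finite-difference Laplacian of a 1-D field."""
    n = len(u)
    return [(u[i - 1] - 2.0 * u[i] + u[(i + 1) % n]) / dx ** 2
            for i in range(n)]

def swift_hohenberg_step(u, r=0.2, dx=1.0, dt=0.01):
    """One explicit Euler step of du/dt = r*u - (1 + lap)^2 u - u^3,
    with (1 + lap)^2 u expanded as u + 2*lap(u) + lap(lap(u))."""
    lap = laplacian(u, dx)
    lap2 = laplacian(lap, dx)
    return [u[i] + dt * (r * u[i] - (u[i] + 2.0 * lap[i] + lap2[i]) - u[i] ** 3)
            for i in range(len(u))]
```

Iterating this step from small random initial conditions over many independent realizations is the kind of ensemble from which the configuration averages discussed above are computed.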
Procedia PDF Downloads 134