Search results for: crisis computing
1555 WhatsApp as Part of a Blended Learning Model to Help Programming Novices
Authors: Tlou J. Ramabu
Abstract:
Programming is one of the most challenging subjects in the field of computing. In the higher education sphere, some programming novices’ performance, retention rate, and success rate are not improving. Most of the time, the problem is caused by the slow pace of learning, difficulty in grasping the syntax of the programming language, and poor logical skills. More importantly, programming forms part of the major subjects within the field of computing. As a result, specialized pedagogical methods and innovation are highly recommended. Little research has been done on the potential productivity of the WhatsApp platform as part of a blended learning model. In this article, the authors discuss a WhatsApp group as part of a blended learning model incorporated for a group of programming novices. We discuss possible administrative activities for productive utilisation of the WhatsApp group within the blended learning overview. The aim is to take advantage of the popularity of WhatsApp and the time students spend on it for educational purposes. We believe that blended learning featuring a WhatsApp group may ease novices’ cognitive load and strengthen their foundational programming knowledge and skills. This is a work in progress, as the proposed blended learning model with WhatsApp incorporated is yet to be implemented. Keywords: blended learning, higher education, WhatsApp, programming, novices, lecturers
Procedia PDF Downloads 172
1554 Reconceptualising Faculty Teaching Competence: The Role of Agency during the Pandemic
Authors: Ida Fatimawati Adi Badiozaman, Augustus Raymond Segar
Abstract:
The Covid-19 pandemic transformed teaching contexts at an unprecedented level. Although studies have focused mainly on its impact on students, little is known about how emergency online teaching affects faculty members in higher education. Given that the pandemic robbed teachers of opportunities for adequate preparation, it is vital to understand how teaching competencies were perceived in the crisis-response transition to online teaching and learning (OTL). Through a mixed-methods design, the study first explores via a survey how academics perceive their readiness for OTL and which competencies were perceived to be central. Emerging trends from the quantitative data of 330 academics (three public and three private higher learning institutions) led to the formulation of interview guides for the subsequent qualitative phase. The authors use critical sensemaking (CSM) to analyse interviews with twenty-two teachers (n = 22) (three public; three private HEs) toward understanding the interconnected layers of influences they draw from as they make sense of their teaching competence. The sensemaking process reframed competence and readiness in that agentic competency emerged as crucial in shaping resilience and adaptability during the transition to OTL. The findings also highlight professional learning critical to teacher competence: course design, communication, time management, technological competence, and identity (re)construction. The findings highlight opportunities for strategic orientation to change during crisis. Implications for pedagogy and policy are discussed. Keywords: online teaching, pedagogical competence, agentic competence, agency, technological competence
Procedia PDF Downloads 81
1553 Automated Facial Symmetry Assessment for Orthognathic Surgery: Utilizing 3D Contour Mapping and Hyperdimensional Computing-Based Machine Learning
Authors: Wen-Chung Chiang, Lun-Jou Lo, Hsiu-Hsia Lin
Abstract:
This study aimed to improve the evaluation of facial symmetry, which is crucial for planning and assessing outcomes in orthognathic surgery (OGS). Facial symmetry plays a key role in both aesthetic and functional aspects of OGS, making its accurate evaluation essential for optimal surgical results. To address the limitations of traditional methods, a different approach was developed, combining three-dimensional (3D) facial contour mapping with hyperdimensional (HD) computing to enhance precision and efficiency in symmetry assessments. The study was conducted at Chang Gung Memorial Hospital, where data were collected from 2018 to 2023 using 3D cone beam computed tomography (CBCT), a highly detailed imaging technique. A large and comprehensive dataset was compiled, consisting of 150 normal individuals and 2,800 patients, totaling 5,750 preoperative and postoperative facial images. These data were critical for training a machine learning model designed to analyze and quantify facial symmetry. The machine learning model was trained to process 3D contour data from the CBCT images, with HD computing employed to power the facial symmetry quantification system. This combination of technologies allowed for an objective and detailed analysis of facial features, surpassing the accuracy and reliability of traditional symmetry assessments, which often rely on subjective visual evaluations by clinicians. In addition to developing the system, the researchers conducted a retrospective review of 3D CBCT data from 300 patients who had undergone OGS. The patients’ facial images were analyzed both before and after surgery to assess the clinical utility of the proposed system. The results showed that the facial symmetry algorithm achieved an overall accuracy of 82.5%, indicating its robustness in real-world clinical applications. Postoperative analysis revealed a significant improvement in facial symmetry, with an average score increase of 51%. 
The mean symmetry score rose from 2.53 preoperatively to 3.89 postoperatively, demonstrating the system's effectiveness in quantifying improvements after OGS. These results underscore the system's potential for providing valuable feedback to surgeons and aiding in the refinement of surgical techniques. The study also led to the development of a web-based system that automates facial symmetry assessment. This system integrates HD computing and 3D contour mapping into a user-friendly platform that allows for rapid and accurate evaluations. Clinicians can easily access this system to perform detailed symmetry assessments, making it a practical tool for clinical settings. Additionally, the system facilitates better communication between clinicians and patients by providing objective, easy-to-understand symmetry scores, which can help patients visualize the expected outcomes of their surgery. In conclusion, this study introduced a highly effective approach to facial symmetry evaluation in OGS, combining 3D contour mapping, HD computing, and machine learning. The resulting system achieved high accuracy and offers a streamlined, automated solution for clinical use. The development of the web-based platform further enhances its practicality, making it a valuable tool for improving surgical outcomes and patient satisfaction in orthognathic surgery. Keywords: facial symmetry, orthognathic surgery, facial contour mapping, hyperdimensional computing
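The abstract does not detail its hyperdimensional encoding, but the core idea can be sketched: project contour features into very high-dimensional bipolar vectors and score symmetry by the similarity of the hypervectors of the two facial halves. The projection matrix, landmark count, and feature vectors below are illustrative assumptions, not the paper's actual pipeline; a minimal sketch in Python with NumPy:

```python
import numpy as np

rng = np.random.default_rng(0)
D = 10_000                      # hypervector dimensionality, typical for HD computing
N_FEATURES = 150                # hypothetical: 50 contour landmarks x 3 coordinates

# fixed random projection that maps contour features to bipolar hypervectors
P = rng.standard_normal((D, N_FEATURES))

def encode(features):
    """Encode a contour feature vector as a +/-1 hypervector."""
    return np.sign(P @ features)

def symmetry_score(left, right):
    """Cosine-style similarity of the hypervectors of the two facial halves."""
    return float(encode(left) @ encode(right)) / D

left = rng.standard_normal(N_FEATURES)                     # left-half contour features
mirrored = left + 0.05 * rng.standard_normal(N_FEATURES)   # near-symmetric right half
unrelated = rng.standard_normal(N_FEATURES)                # strongly asymmetric case
```

For the seeded data above, `symmetry_score(left, mirrored)` comes out close to 1 while `symmetry_score(left, unrelated)` stays near 0, which is the separating behaviour a symmetry quantifier needs.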
Procedia PDF Downloads 26
1552 Screen Method of Distributed Cooperative Navigation Factors for Unmanned Aerial Vehicle Swarm
Authors: Can Zhang, Qun Li, Yonglin Lei, Zhi Zhu, Dong Guo
Abstract:
Aiming at the problem of factor screening in the distributed collaborative navigation of dense UAV swarms, an efficient distributed collaborative navigation factor screening method is proposed. The method considers the balance between computing load and positioning accuracy. The proposed algorithm utilizes a factor graph model to implement a distributed collaborative navigation algorithm. The GNSS information of each UAV itself and the ranging information between UAVs are used as the positioning factors. In this distributed scheme, a local factor graph is established for each UAV. The positioning factors of nodes with good geometric position distribution and small variance are selected to participate in the navigation calculation. To demonstrate and verify the proposed methods, simulations and experiments in different scenarios are performed in this research. Simulation results show that the proposed scheme achieves a good balance between computing load and positioning accuracy in the distributed cooperative navigation calculation of a UAV swarm. The proposed algorithm has important theoretical and practical value for both industry and academia. Keywords: screen method, cooperative positioning system, UAV swarm, factor graph, cooperative navigation
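A minimal sketch of the kind of screening rule described, under the assumption that candidate ranging factors are ranked by the product of their measurement variance and a GDOP-style geometry metric (the paper's exact cost function is not given); the anchor layout, variances, and exhaustive search below are invented for illustration:

```python
import itertools
import numpy as np

def gdop(anchors, pos):
    """Geometric dilution of precision for ranging to the given anchor positions."""
    diffs = anchors - pos
    G = diffs / np.linalg.norm(diffs, axis=1, keepdims=True)  # unit line-of-sight rows
    return float(np.sqrt(np.trace(np.linalg.inv(G.T @ G))))

def select_factors(anchors, variances, pos, k):
    """Pick the k neighbouring UAVs whose combined geometry and ranging variance
    give the smallest cost -- an exhaustive stand-in for the screening rule."""
    best, best_cost = None, np.inf
    for idx in itertools.combinations(range(len(anchors)), k):
        idx = list(idx)
        cost = gdop(anchors[idx], pos) * variances[idx].mean()
        if cost < best_cost:
            best, best_cost = idx, cost
    return best

# four well-spread, low-variance neighbours plus one noisy neighbour near anchor 0
anchors = np.array([[10., 0.], [0., 10.], [-10., 0.], [0., -10.], [9., 1.]])
variances = np.array([1., 1., 1., 1., 5.])
chosen = select_factors(anchors, variances, np.zeros(2), k=4)
```

The selection correctly keeps the four well-spread, low-variance neighbours and drops the noisy, geometrically redundant one.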
Procedia PDF Downloads 79
1551 The Fifth Political Theory and Countering Terrorism in the Post 9/11 Era
Authors: Rana Eijaz Ahmad
Abstract:
This paper explains the Fifth Political Theory, which challenges all three existing theories plus one (Capitalism, Marxism, and Fascism + the Fourth Political Theory). It holds that it is the human ambiance that evolves any political system to survive, rather than borrowing imported thoughts to live in a specific environment, and that legitimacy leads to authority and promotes humanism. According to this theory, no state is allowed to dictate or install any political system upon other states. It is the born right of individuals to choose a political system or a set of values that will make their structures and functions efficient enough to support the system's harmony and counter negative forces successfully. In the post-9/11 era, it is observed that the existing theories of Capitalism, Marxism, Fascism, and the Fourth Political Theory have remained unsuccessful in resolving the global crisis. The so-called war against terrorism has proved to be a war for terrorism and has created a vacuum on the global stage, worsening the crisis. The Fifth Political Theory is an answer to countering terrorism in the twenty-first century. It calls for the accountability of the United Nations for its failure in sustaining peace at the global level. Therefore, the UN Charter is supposed to be implemented in its true letter and spirit. All independent sovereign states have the right to evolve their own system to carry out a political system that suits them best for sustaining harmony at home. This is the only way to counter terrorism. This paper comprises mixed methods: qualitative, quantitative, and comparative methods are used along with secondary sources. The objective of this paper is to create knowledge for the benefit of human beings with a logical and rational argument. It will help political scientists and scholars in conflict management and countering terrorism on pragmatic grounds. Keywords: capitalism, fourth political theory, fifth political theory, Marxism, fascism
Procedia PDF Downloads 380
1550 Fault Tolerant and Testable Designs of Reversible Sequential Building Blocks
Authors: Vishal Pareek, Shubham Gupta, Sushil Chandra Jain
Abstract:
With increasing demand for high-speed computation, power consumption, heat dissipation, and chip size issues are posing challenges for logic design with conventional technologies. Recovery from bit loss and bit errors is another issue that requires reversibility and fault tolerance in computation. Reversible computing is emerging as an alternative to conventional technologies to overcome the above problems and is helpful in diverse areas such as low-power design, nanotechnology, and quantum computing. The bit loss issue can be solved through a unique input-output mapping, which requires reversibility, and the bit error issue requires the capability of fault tolerance in the design. In order to incorporate reversibility, a number of combinational reversible logic based circuits have been developed. However, very few sequential reversible circuits have been reported in the literature. To make a circuit fault tolerant, a number of fault models and test approaches have been proposed for reversible logic. In this paper, we have attempted to incorporate fault tolerance in sequential reversible building blocks such as the D flip-flop, T flip-flop, JK flip-flop, R-S flip-flop, master-slave D flip-flop, and double edge triggered D flip-flop by making them parity preserving. The importance of this work lies in the fact that it provides designs of reversible sequential circuits completely testable for any stuck-at fault and single bit fault. In our opinion, our designs of reversible building blocks are superior to existing designs in terms of quantum cost, hardware complexity, constant inputs, garbage outputs, and number of gates, and a design of an online testable D flip-flop is proposed for the first time. We hope our work can be extended for building complex reversible sequential circuits. Keywords: parity preserving gate, quantum computing, fault tolerance, flip-flop, sequential reversible logic
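The flip-flop designs themselves are not reproduced here, but the parity-preserving property the paper builds on can be illustrated with the classic Fredkin (controlled-swap) gate, which is both reversible and conservative, so the parity of its inputs always equals the parity of its outputs:

```python
def fredkin(c, a, b):
    """Fredkin (controlled-swap) gate: if the control bit c is 1, swap a and b.
    The gate is conservative (1-count preserved), hence parity preserving."""
    return (c, b, a) if c == 1 else (c, a, b)

# exhaustive check over all 8 input patterns
patterns = [(c, a, b) for c in (0, 1) for a in (0, 1) for b in (0, 1)]
for bits in patterns:
    out = fredkin(*bits)
    assert sum(bits) % 2 == sum(out) % 2   # parity preserved
    assert sum(bits) == sum(out)           # conservative: number of 1s preserved

# reversibility: the mapping is a bijection on {0,1}^3
assert len({fredkin(*bits) for bits in patterns}) == 8
```

Building the listed flip-flops from such gates is what makes a single-bit or stuck-at fault observable as a parity mismatch at the outputs.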
Procedia PDF Downloads 545
1549 Imbalance on the Croatian Housing Market in the Aftermath of an Economic Crisis
Authors: Tamara Slišković, Tomislav Sekur
Abstract:
This manuscript examines factors that affect demand and supply on the housing market in Croatia. The period from the beginning of this century until 2008 was characterized by a strong expansion of construction, housing, and the real estate market in general. Demand for residential units was expanding, supported by the favorable lending conditions of banks. Indicators on the supply side, such as the number of newly built houses and the construction volume index, were also increasing. The rapid growth of demand, along with somewhat slower supply growth, led to a situation in which new apartments were sold before the completion of residential buildings. This resulted in a rise in housing prices, which was an indication of a clear link between housing prices and supply and demand on the housing market. However, after 2008, general economic conditions in Croatia worsened and demand for housing fell dramatically, while supply declined at a much slower pace. Given that there is a gap between supply and demand, it can be concluded that the housing market in Croatia is in imbalance. This trend is accompanied by a relatively small decrease in housing prices. The final result of such movements is a large number of unsold housing units at relatively high price levels. For this reason, it can be argued that housing prices are sticky and that, consequently, the price level in the aftermath of the crisis does not correspond to the discrepancy between supply and demand on the Croatian housing market. The degree of rigidity of housing prices can be determined by including the housing price as an explanatory variable in the housing demand function. Other independent variables are a demographic variable (e.g., the number of households), the interest rate on housing loans, households' disposable income, and rent.
The equilibrium price is reached when the demand for housing equals its supply, and the speed of adjustment of actual prices to equilibrium prices reveals the extent to which prices are rigid. The latter requires including the housing price with a time lag as an independent variable when estimating the demand function. We also observe the supply side of the housing market, in order to explain to what extent housing prices explain the movement of new construction activity, along with other variables that describe the supply. In this context, we test whether new construction on the Croatian market depends on current prices or on prices with a time lag. The number of dwellings is used to approximate new construction (a flow variable), while housing prices (current or lagged), the quantity of dwellings in the previous period (a stock variable), and a series of costs related to new construction are the independent variables. We conclude that the key reason for the imbalance on the Croatian housing market should be sought in the relative relationship of the price elasticities of supply and demand. Keywords: Croatian housing market, economic crisis, housing prices, supply imbalance, demand imbalance
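A minimal sketch of the partial-adjustment idea on synthetic data: the coefficient on the lagged price measures stickiness, and one minus that coefficient is the speed of adjustment toward the equilibrium price. All series and coefficients below are invented stand-ins, not the Croatian data:

```python
import numpy as np

rng = np.random.default_rng(1)
T = 200
# hypothetical fundamentals (stand-ins for disposable income and the interest rate)
income = rng.normal(100, 5, T)
rate = rng.normal(5, 0.5, T)

# generate prices with a known partial-adjustment coefficient lam:
# a high lam means sticky prices that close only (1 - lam) of the gap per period
lam = 0.8
eq_price = 0.5 * income - 2.0 * rate       # hypothetical equilibrium price
price = np.zeros(T)
for t in range(1, T):
    price[t] = lam * price[t - 1] + (1 - lam) * eq_price[t] + rng.normal(0, 0.1)

# regress the current price on the lagged price and the fundamentals
X = np.column_stack([np.ones(T - 1), price[:-1], income[1:], rate[1:]])
beta, *_ = np.linalg.lstsq(X, price[1:], rcond=None)

speed_of_adjustment = 1 - beta[1]          # fraction of the gap closed each period
```

With the simulated stickiness of 0.8, the regression recovers a lagged-price coefficient near 0.8 and hence a slow adjustment speed near 0.2, exactly the diagnostic the abstract describes for rigid prices.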
Procedia PDF Downloads 271
1548 Distributed Processing for Content Based Lecture Video Retrieval on Hadoop Framework
Authors: U. S. N. Raju, Kothuri Sai Kiran, Meena G. Kamal, Vinay Nikhil Pabba, Suresh Kanaparthi
Abstract:
There is a huge amount of lecture video data available for public use, and many more lecture videos are being created and uploaded every day. Searching for videos on required topics in this huge database is a challenging task. Therefore, an efficient method for video retrieval is needed. An approach for automated video indexing and video search in large lecture video archives is presented. As the amount of video lecture data is huge, it is very inefficient to do the processing in a centralized computation framework. Hence, the Hadoop framework for distributed computing on big video data is used. The first step in the process is automatic video segmentation and key-frame detection to offer a visual guideline for video content navigation. In the next step, we extract textual metadata by applying video Optical Character Recognition (OCR) technology on key-frames. The OCR output and detected slide text line types are adopted for keyword extraction, by which both video-level and segment-level keywords are extracted for content-based video browsing and search. The performance of the indexing process can be improved for a large database by using distributed computing on the Hadoop framework. Keywords: video lectures, big video data, video retrieval, hadoop
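The Hadoop pipeline itself is beyond a short example, but the segmentation step can be sketched: in a lecture recording, a slide change shows up as a large difference between consecutive frames. The threshold and the toy "frames" below are illustrative assumptions, not the paper's detector:

```python
import numpy as np

def detect_keyframes(frames, threshold=0.2):
    """Return indices where the mean absolute difference between consecutive
    frames exceeds `threshold` -- a simple stand-in for slide-change detection."""
    keys = [0]  # the first frame is always a key-frame
    for i in range(1, len(frames)):
        if np.abs(frames[i] - frames[i - 1]).mean() > threshold:
            keys.append(i)
    return keys

# synthetic 'lecture video': three slides, each held for a few frames
slide_a = np.zeros((4, 4))
slide_b = np.ones((4, 4))
slide_c = np.full((4, 4), 0.5)
frames = [slide_a] * 3 + [slide_b] * 3 + [slide_c] * 2
# detect_keyframes(frames) yields [0, 3, 6]: one key-frame per slide
```

In a MapReduce setting, each mapper would run this detector (and OCR) on one video segment, with reducers merging the per-segment keyword indexes.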
Procedia PDF Downloads 533
1547 IFRS Adoption, Enforcement, and the Value Relevance of Accounting Amounts: The Particular Case of South Africa
Authors: Edward Chamisa, Colin C. Smith, Hamutyinei H. Pamburai, Abdul C. Abdulla
Abstract:
South Africa (SA) adopted International Financial Reporting Standards (IFRS) for listed firms effective 1 January 2005. However, it was not until 2011 that substantial financial reporting enforcement changes were introduced, which were meant to ensure compliance with IFRS. This innovative setting allows us to examine the value relevance of accounting amounts during the (1) pre-IFRS adoption period (2002-2004); (2) post-IFRS adoption but pre-enforcement changes period (2006-2010); and (3) post-enforcement changes period (2011-2012). The results show that accounting amounts were most value relevant in the post-enforcement changes period (adjusted R2 of 75.5%) compared to both the pre-IFRS adoption period (adjusted R2 of 24.3%) and the period after IFRS adoption but before enforcement changes (adjusted R2 of 37.5%). Also, during the 2008 financial crisis, the equity book value per share was significantly value relevant (at 1%) but not earnings per share, whereas before the crisis, the opposite was true. We make two important contributions to the literature. First, we identify SA as an innovative setting that allows researchers to examine separately the effects of IFRS adoption and enforcement changes on capital markets and accounting quality. This is a departure from prior studies, which are dominated by the European Union setting, where IFRS adoption occurred contemporaneously with enforcement and other regulatory changes. Second, we provide preliminary findings which suggest that while the adoption of IFRS seems to have improved the financial reporting quality of accounting amounts of SA listed firms, its impact appears to be limited unless combined with effective enforcement. Keywords: international financial reporting standards (ifrs), ifrs adoption, financial reporting enforcement, value relevance, price model, equity book value, earnings per share
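Value-relevance tests of this kind are typically run with a price-model regression of share price on equity book value per share and earnings per share, with the adjusted R2 read as the degree of value relevance. A minimal sketch on synthetic data (the coefficients and sample below are invented, not the SA sample):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 150
bvps = rng.uniform(5, 50, n)       # equity book value per share (hypothetical)
eps = rng.uniform(0.5, 5, n)       # earnings per share (hypothetical)
# synthetic share prices generated from a known price model plus noise
price = 1.0 + 0.8 * bvps + 4.0 * eps + rng.normal(0, 2, n)

# price model: P = b0 + b1*BVPS + b2*EPS, estimated by OLS
X = np.column_stack([np.ones(n), bvps, eps])
beta, *_ = np.linalg.lstsq(X, price, rcond=None)

# value relevance is read off the model's explanatory power
resid = price - X @ beta
r2 = 1 - resid.var() / price.var()
adj_r2 = 1 - (1 - r2) * (n - 1) / (n - 3)
```

Comparing `adj_r2` across the three sub-periods (and the significance of `beta[1]` versus `beta[2]` around the crisis) is the kind of evidence the abstract reports.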
Procedia PDF Downloads 70
1546 A Real-World Roadmap and Exploration of Quantum Computers Capacity to Trivialise Internet Security
Authors: James Andrew Fitzjohn
Abstract:
This paper discusses and explores the practical aspects of cracking encrypted messages with quantum computers. The theory of this process has been well described both in academic papers and in headline-grabbing news articles, but amid the theory and hyperbole, we must be careful to assess the practicality of these claims. Therefore, we use real-world devices and proof-of-concept code to prove or disprove the notion that quantum computers will render the encryption technologies used by many websites unfit for purpose. It is time to discuss and implement the practical aspects of the process, as many advances in quantum computing hardware and software have recently been made. This paper sets expectations regarding the useful lifespan of RSA and cipher lengths and proposes alternative encryption technologies. We set out comprehensive roadmaps describing when and how encryption schemes can be used, including when they can no longer be trusted. Cost is also factored into our investigation; for example, it would make little financial sense to spend millions of dollars on a quantum computer to factor a private key in seconds when a commodity GPU could perform the same task in hours. It is hoped that the real-world results depicted in this paper will help influence the owners of websites, who can take appropriate actions to improve the security of their provisions. Keywords: quantum computing, encryption, RSA, roadmap, real world
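The point about RSA's lifespan can be illustrated with a toy instance: once the modulus is factored (trivially here by classical trial division; in polynomial time by Shor's algorithm on a large fault-tolerant quantum computer), the private key falls out immediately. The tiny primes below are for illustration only; real RSA moduli are 2048+ bits:

```python
import math

# toy RSA key generation with deliberately small primes
p, q = 1009, 1013
n, e = p * q, 65537
phi = (p - 1) * (q - 1)
d = pow(e, -1, phi)            # private exponent (Python 3.8+ modular inverse)

m = 42                         # plaintext
c = pow(m, e, n)               # ciphertext

# the attacker's step: factor n -- trivial at this size, infeasible classically
# at real key sizes, but efficient for a large quantum computer via Shor
def factor(n):
    for f in range(3, math.isqrt(n) + 1, 2):
        if n % f == 0:
            return f, n // f

fp, fq = factor(n)
d_recovered = pow(e, -1, (fp - 1) * (fq - 1))
recovered_m = pow(c, d_recovered, n)   # decrypts without ever seeing d
```

Everything after key generation uses only public values, which is exactly why factoring capability, classical or quantum, determines when RSA at a given length can no longer be trusted.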
Procedia PDF Downloads 131
1545 Planning and Management Options for Pastoral Resource: Case of Mecheria Region, Algeria
Authors: Driss Haddouche
Abstract:
The pastoral crisis in Algeria has its origins in rangeland degradation, which is the main factor in any activity in the steppe zones. Indeed, faced with an increasing human and animal population on an ever-smaller living space, there is an overuse of what remains of the steppe rangelands and, consequently, unsustainable biomass production. Knowing the amount of biomass available, the practice of grazing options taking into account the "use factor" remains an essential method for managing pastoral resources. This factor has three options: at 40%, conservative pasture; at 60%, the beginning of overgrazing; at 80%, destructive grazing. Accessibility of the pasture is based on our field observations of a typical flock along a grazing cycle. The main purpose of these observations is to establish the grazing speed of the herd. Several individuals from the herd were timed, arriving at an average duration of about 5 seconds to move between two tufts of grass separated by a distance of one metre. This gives a rate of 5 s/m (0.72 km/h) on flat ground. This speed varies depending on the angle of the slope. Knowing the speed and slope of each pixel of the study area, given by the digital elevation model derived from SPOT imagery, whose pitch is 15 metres, a map of pasture accessibility according to distance is generated. Knowing the stocking rate and biomass available, the examination of the commune of Mécheria at regular distances (8.64 km or 12 hours of grazing, 17.28 km or 24 hours of grazing, and 25.92 km or 36 hours of grazing) offers three different options (conservation of the grazing resource with utilization at 40%; overgrazing with use at 60%; and destructive grazing with use above 80%) for each distance traveled by the sheep from the starting point, which is the town of Mécheria. Keywords: pastoral crisis, biomass, animal charge, use factor, Algeria
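The distance thresholds quoted above follow directly from the observed rate of 5 seconds per metre on flat ground; a quick check of the arithmetic (slope corrections per DEM pixel are omitted here, since the abstract does not give the slope-speed function):

```python
# walking rate while grazing, from the field observations: 5 seconds per metre
seconds_per_metre = 5
speed_kmh = 3600 / (seconds_per_metre * 1000)   # 0.72 km/h on flat ground

# distance reached after each grazing duration, in km
distances = {hours: round(speed_kmh * hours, 2) for hours in (12, 24, 36)}
# distances == {12: 8.64, 24: 17.28, 36: 25.92}, matching the radii in the text
```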
Procedia PDF Downloads 531
1544 GPU Accelerated Fractal Image Compression for Medical Imaging in Parallel Computing Platform
Authors: Md. Enamul Haque, Abdullah Al Kaisan, Mahmudur R. Saniat, Aminur Rahman
Abstract:
In this paper, we have implemented both sequential and parallel versions of fractal image compression algorithms using the CUDA (Compute Unified Device Architecture) programming model, parallelizing the program on the Graphics Processing Unit, for medical images, as they are highly similar within the image itself. There are several improvements in the implementation of the algorithm as well. Fractal image compression is based on the self-similarity of an image, meaning an image has similarity in the majority of its regions. We take this opportunity to implement the compression algorithm and monitor its effect using both parallel and sequential implementations. Fractal compression has the properties of a high compression rate and a dimensionless scheme. The compression scheme for a fractal image is of two kinds: one is encoding and the other is decoding. Encoding is very computationally expensive, whereas decoding is computationally cheap. The application of fractal compression to medical images allows obtaining much higher compression ratios, while fractal magnification, an inseparable feature of fractal compression, is very useful in presenting the reconstructed image in a highly readable form. However, like all irreversible methods, fractal compression is connected with the problem of information loss, which is especially troublesome in medical imaging. A very time-consuming encoding process, which can last even several hours, is another bothersome drawback of fractal compression. Keywords: accelerated GPU, CUDA, parallel computing, fractal image compression
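A minimal 1-D sketch of the range-domain matching at the heart of fractal encoding: each range block is approximated by a contractive affine map of some larger domain block, and decoding simply iterates the stored maps from any starting signal. Real codecs work on 2-D blocks with rotations/reflections and run this search on the GPU; the block size and scale clamping here are illustrative assumptions:

```python
import numpy as np

def shrink(d):
    """Average adjacent pairs: domain block of length 2R -> length R."""
    return d.reshape(-1, 2).mean(axis=1)

def encode(signal, r=4):
    """For each range block, pick the domain block and contractive affine map
    (scale s, offset o) that minimise the reconstruction error -- this search
    is the expensive part that the paper parallelises with CUDA."""
    n = len(signal)
    starts = range(0, n - 2 * r + 1, r)
    code = []
    for k in range(0, n, r):
        rb = signal[k:k + r]
        best = None
        for di, dstart in enumerate(starts):
            d = shrink(signal[dstart:dstart + 2 * r])
            var = ((d - d.mean()) ** 2).sum()
            s = 0.0 if var == 0 else ((d - d.mean()) * (rb - rb.mean())).sum() / var
            s = float(np.clip(s, -0.9, 0.9))          # keep the map contractive
            o = float(rb.mean() - s * d.mean())
            err = float(((s * d + o - rb) ** 2).sum())
            if best is None or err < best[0]:
                best = (err, di, s, o)
        code.append(best[1:])
    return code

def decode(code, n, r=4, iters=12):
    """Iterating the stored maps from any start converges to the attractor."""
    sig = np.zeros(n)
    for _ in range(iters):
        out = np.empty(n)
        for k, (di, s, o) in enumerate(code):
            d = shrink(sig[di * r:di * r + 2 * r])
            out[k * r:(k + 1) * r] = s * d + o
        sig = out
    return sig
```

Note the encoder's nested search over all domain blocks against the decoder's cheap fixed-point iteration; this asymmetry is exactly the expensive-encoding, cheap-decoding property the abstract describes.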
Procedia PDF Downloads 335
1543 Strengthening Social and Psychological Resources - Project "Herausforderung" as a (Sports-) Pedagogical Concept in Adolescence
Authors: Kristof Grätz
Abstract:
Background: Coping with crisis situations (e.g., the identity crisis in adolescence) is omnipresent in today's socialization and should be encouraged from childhood. For this reason, students should be given the opportunity to create, endure, and manage such crisis situations in a sporting context within the project "Herausforderung". They prove themselves by working on a self-assigned task, accompanied by coaches, in a place outside of their hometown. The aim of the project is to observe this process from a resource-oriented perspective. Health promotion, as called for by the WHO in the Ottawa Charter since 1986, includes strengthening psychosocial resources. These include the cognitive, emotional, and social potentials that contribute to improving quality of life, provide favourable conditions for coping with health burdens, and enable people to influence their physical performance and well-being self-confidently and actively. A systematic strengthening of psychosocial resources leads to an improvement in mental health and contributes decisively to the regular implementation and long-term maintenance of this health behavior. Previous studies have already shown significant increases in self-concept following experiential educational measures [Fengler, 2007; Eberle & Fengler, 2018] and positive effects of experience-based school trips on the social competence of students [Reuker, 2009]. Method: The research project examines the influence of the project "Herausforderung" on psychosocial resources such as self-efficacy, self-concept, social support, and group cohesion. The students participating in the project are tested in a pre-post design in the context of the challenge. This test includes specific questions to capture the different psychosocial resources. For the measurement, modifications of existing scales with good item selectivity and reliability are largely used, so that acceptable item and scale values can be expected.
Where necessary, the scales were adapted or shortened to the specific context in order to ensure a balanced relationship between reliability and test economy. Specifically, these are already tested scales such as the FRKJ 8-16, FSKN, GEQ, and F-SozU. The aim is to achieve a sample size of n ≥ 100. Conclusion: The project will be reviewed with regard to its effectiveness, and implications for a resource-enhancing application in sports settings will be given. Conclusions are drawn as to the extent to which specific experiential educational content in physical education can have a health-promoting effect on the participants. Keywords: children, education, health promotion, psychosocial resources
Procedia PDF Downloads 146
1542 Africa and the Gas Supply Crisis to European Countries under the Russian-Ukrainian War: A Study on the Nigerian-Algerian Gas Pipeline Project Importance
Authors: Mohammed Lamine Benaouda
Abstract:
This paper seeks to shed light on the African continent's role in the crisis of natural gas supplies to European countries that resulted from the repercussions of the Russian-Ukrainian war, by examining the case of the re-launch of the Trans-Saharan Gas Pipeline project (Nigeria-Algeria) and clarifying the mutually beneficial long-run strategic importance of this project. The paper relies on analytical and statistical methods in order to establish the impact the project could have on the huge needs of the European gas market on the one hand, and to monitor the various economic gains for Algeria and Nigeria on the other, in addition to a comparative approach to assess the possible effects of the success and economic feasibility of the project for all its beneficiaries. The paper finds that complexity has multiplied in the global energy market in general, and the European one in particular, following the repercussions of the Russian-Ukrainian war, and that African countries hold extreme importance in the arena of the international struggle over resources, which allows them a margin for maneuvering and for regional and global influence in various fields. With regard to the research outcomes and future scope, the researcher believes that the African continent, in light of international competition and conflict, as well as the ongoing restoration of balances of power in the current international system, will play very important roles, especially given its enormous natural and human capabilities, which enable it to tip future conflicts over energy and spheres of influence. Keywords: algeria, nigeria, west africa, ECOWAS, gas supplies, russia, ukraine
Procedia PDF Downloads 80
1541 Role of Kerala’s Diaspora Philanthropy Engagement During Economic Crises
Authors: Shibinu S, Mohamed Haseeb N
Abstract:
In times of crisis, the diaspora's role and the help it offers are vital in determining how many countries recover, particularly low- and middle-income nations that rely significantly on remittances. Twenty-one lakh twenty thousand Keralites have emigrated abroad, with 81.2 percent of these outflows going to the Gulf Cooperation Council (GCC) countries. Most of them are semi-skilled or low-skilled laborers employed in GCC nations. Additionally, a sizeable portion of migrants are employed in industrialized nations like the UK and the US. A highly robust Indian diaspora has developed in these nations. India's development is largely dependent on the generosity of its diaspora, and the nation has benefited greatly from the substantial contributions made by several emigrant generations. Its strength was noticeable during COVID-19 and the Kerala floods. Millions of people were displaced, millions of properties were damaged, and many people died as a result of the 2018 Kerala floods. The Malayalee diaspora played a crucial role in the reconstruction of Kerala by supporting the rescue efforts underway on the ground through their extensive worldwide network. A similar outreach was also noted during COVID-19, when the diaspora assisted stranded migrants across the globe. Alongside the work the diaspora has done for the state's development and recovery, there has also been a recent outpouring of assistance during the COVID-19 pandemic. The study focuses on the subtleties of diaspora philanthropy and how Kerala was able to recover from the COVID-19 pandemic and the floods thanks to it. Semi-structured in-depth interviews with migrants, migrant organizations, and beneficiaries from the diaspora, recruited through snowball sampling, are used to better understand the role that diaspora philanthropy plays in times of crisis. Keywords: crises, diaspora, remittances, COVID-19, flood, economic development of Kerala
Procedia PDF Downloads 31
1540 Roasting Process of Sesame Seeds Modelling Using Gene Expression Programming: A Comparative Analysis with Response Surface Methodology
Authors: Alime Cengiz, Talip Kahyaoglu
Abstract:
Roasting is of major importance in obtaining the desired aromatic taste of nuts. In this study, two kinds of roasting process were applied to hulled sesame seeds: vacuum oven and hot air roasting. The efficiency of Gene Expression Programming (GEP), a soft computing technique of evolutionary algorithms that describes cause-and-effect relationships in a data modelling system, and of response surface methodology (RSM) was examined in modelling the roasting processes over a range of temperatures (120-180°C) for various times (30-60 min). Color attributes (L*, a*, b*, Browning Index (BI)), textural properties (hardness and fracturability) and moisture content were evaluated and modelled by RSM and GEP. The GEP-based formulations and the RSM approach were compared with experimental results and evaluated according to correlation coefficients. The results showed that both GEP and RSM were able to adequately learn the relation between roasting conditions and the physical and textural parameters of roasted seeds. However, GEP had better prediction performance than RSM, with high correlation coefficients (R² > 0.92) for all quality parameters. This result indicates that soft computing techniques have better capability for describing the physical changes occurring in sesame seeds during the roasting process.
Keywords: genetic expression programming, response surface methodology, roasting, sesame seed
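As an illustration of the RSM side of the comparison, a second-order response surface can be fitted by ordinary least squares and scored with the correlation-based criterion the abstract uses. This is a minimal sketch on invented roasting data, not the study's measurements or the authors' GEP models:

```python
import numpy as np

# Hypothetical roasting data: temperature (°C), time (min) and Browning Index (BI).
# Values are invented for illustration, not the study's measurements.
temps = np.repeat([120.0, 150.0, 180.0], 3)
times = np.tile([30.0, 45.0, 60.0], 3)
bi = np.array([15.0, 19.0, 24.0, 22.0, 28.0, 33.0, 30.0, 39.0, 48.0])

def rsm_design(x1, x2):
    """Second-order RSM design matrix: 1, x1, x2, x1*x2, x1^2, x2^2."""
    return np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1**2, x2**2])

A = rsm_design(temps, times)
coef, *_ = np.linalg.lstsq(A, bi, rcond=None)   # least-squares fit of the surface
pred = A @ coef

# R^2, the criterion by which the abstract compares RSM and GEP models
r2 = 1.0 - np.sum((bi - pred) ** 2) / np.sum((bi - bi.mean()) ** 2)
print(round(float(r2), 3))
```

The same R²-style comparison would then be made against a GEP-evolved expression on held-out data.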
Procedia PDF Downloads 418
1539 Development of Geo-computational Model for Analysis of Lassa Fever Dynamics and Lassa Fever Outbreak Prediction
Authors: Adekunle Taiwo Adenike, I. K. Ogundoyin
Abstract:
Lassa fever is a neglected tropical disease that has become a significant public health issue in Nigeria, the country with the greatest burden in Africa. This paper presents a geo-computational model for the analysis and prediction of Lassa fever dynamics and outbreaks in Nigeria. The model investigates the dynamics of the virus with respect to environmental factors and human populations. It confirms the role of the rodent host in virus transmission and identifies how climate and human population are affected. The proposed methodology is carried out on a Linux operating system using the OSGeoLive virtual machine for geographical computing, which serves as a base for spatial ecology computing. The model design uses the Unified Modeling Language (UML), and the performance evaluation uses machine learning algorithms such as random forest, fuzzy logic, and neural networks. The study aims to contribute to the control of Lassa fever, which is achievable through the combined efforts of public health professionals and geocomputational and machine learning tools. The research findings will potentially be more readily accepted and utilized by decision-makers for the attainment of Lassa fever elimination.
Keywords: geo-computational model, Lassa fever dynamics, Lassa fever, outbreak prediction, Nigeria
Procedia PDF Downloads 93
1538 Improved Multi-Objective Firefly Algorithms to Find Optimal Golomb Ruler Sequences for Optimal Golomb Ruler Channel Allocation
Authors: Shonak Bansal, Prince Jain, Arun Kumar Singh, Neena Gupta
Abstract:
Nature-inspired algorithms have recently found widespread use in tough, time-consuming multi-objective scientific and engineering design optimization problems. In this paper, we present extended forms of the firefly algorithm to find optimal Golomb ruler (OGR) sequences. One of the major applications of OGRs is as an unequally spaced channel-allocation algorithm in optical wavelength division multiplexing (WDM) systems, minimizing the adverse four-wave mixing (FWM) crosstalk effect. The simulation results conclude that the proposed optimization algorithm has superior performance compared to existing conventional computing and nature-inspired optimization algorithms for finding OGRs, in terms of ruler length, total optical channel bandwidth and computation time.
Keywords: channel allocation, conventional computing, four-wave mixing, nature-inspired algorithm, optimal Golomb ruler, Lévy flight distribution, optimization, improved multi-objective firefly algorithms, Pareto optimal
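For reference, the defining property being optimized, that every pairwise difference between marks is distinct, is easy to verify. The sketch below checks candidate rulers; it is a validity test only, not the firefly search itself:

```python
from itertools import combinations

def is_golomb_ruler(marks):
    """A ruler is Golomb if all pairwise differences between its marks are distinct."""
    diffs = [b - a for a, b in combinations(sorted(marks), 2)]
    return len(diffs) == len(set(diffs))

def ruler_length(marks):
    """Ruler length: distance between the first and last mark (one objective above)."""
    return max(marks) - min(marks)

# Known optimal Golomb ruler of order 5 (length 11), usable as WDM channel positions
ogr5 = [0, 1, 4, 9, 11]
print(is_golomb_ruler(ogr5), ruler_length(ogr5))   # True 11
print(is_golomb_ruler([0, 1, 2, 4]))               # False: differences 1 and 2 repeat
```

A search such as the firefly algorithm would call a check like this inside its fitness evaluation while also minimizing ruler length and total bandwidth.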
Procedia PDF Downloads 320
1537 DYVELOP Method Implementation for the Research Development in Small and Middle Enterprises
Authors: Jiří F. Urbánek, David Král
Abstract:
Small and Middle Enterprises (SMEs) have a specific mission, characteristics, and behavior in global, competitive business environments. They must respect policy, rules, requirements and standards in all their inherent and outer processes of supply-customer chains and networks. The aim of this paper is to introduce computational assistance that enables the prevailing MS Office operating environment (SmartArt, etc.) to be used for mathematical models, applying the DYVELOP (Dynamic Vector Logistics of Processes) method. For SMEs in a global environment, this provides the capability, and the profit, to meet commitments regarding the effectiveness of the quality management system in satisfying customer requirements, and to continually improve the overall performance and efficiency of the organization's and the SME's processes, as well as their societal security, via continual planning improvement. The DYVELOP model's maps, the Blazons, can express mathematically and graphically the relationships among entities, actors, and processes, including the discovery and modeling of cycling cases and their phases. The Blazons need a live PowerPoint presentation for better comprehension of this paper's mission: added value analysis. The crisis management of SMEs must use these cycles for successful coping with crisis situations. Cycling through these cases several times is a necessary condition for encompassing both the emergency event and the mitigation of the organization's damages. An uninterrupted and continuous cycling process is a good indicator of, and controlling actor for, SME continuity and its advanced possibilities for sustainable development.
Keywords: blazons, computational assistance, DYVELOP method, small and middle enterprises
Procedia PDF Downloads 340
1536 A Regional Analysis on Co-movement of Sovereign Credit Risk and Interbank Risks
Authors: Mehdi Janbaz
Abstract:
The global financial crisis and the credit crunch that followed magnified the importance of credit risk management and its crucial role in the stability of all financial sectors and of the system as a whole. Many believe that risks faced by the sovereign sector are highly interconnected with banking risks and are most likely to trigger and reinforce each other. This study aims to examine (1) the impact of banking and interbank risk factors on the sovereign credit risk of the Eurozone, and (2) how EU Credit Default Swap spread dynamics are affected by crude oil price fluctuations. The hypotheses are tested by employing fitting risk measures and a four-staged linear modeling approach. Sovereign senior 5-year Credit Default Swap (CDS) spreads are used as the core measure of credit risk. Monthly time-series data for the variables used in the study are gathered from the DataStream database for the period 2008-2019. First, a linear model tests the impact of regional macroeconomic and market-based factors (STOXX, VSTOXX, oil, sovereign debt, and slope) on CDS spread dynamics. Second, bank-specific factors, including the LIBOR-OIS spread (the difference between the Euro 3-month LIBOR rate and the Euro 3-month overnight index swap rate) and Euribor, are added to the most significant factors of the previous model. Third, global financial factors, including EUR/USD foreign exchange volatility, the TED spread (the difference between the 3-month T-bill and the 3-month LIBOR rate based in US dollars), and the Chicago Board Options Exchange (CBOE) Crude Oil Volatility Index (OVX), are added to the major significant factors of the first two models. Finally, a model is generated from a combination of the major factors of each variable set, in addition to a crisis dummy.
The findings show that (1) the explanatory power of LIBOR-OIS on the sovereign CDS spread of the Eurozone is very significant, and (2) there is a meaningful adverse co-movement between the crude oil price and the CDS price of the Eurozone. Surprisingly, adding the TED spread to the analysis beside the LIBOR-OIS spread in the third and fourth models increased the predictive power of LIBOR-OIS. Based on the results, LIBOR-OIS, STOXX, the TED spread, slope, the oil price, OVX, FX volatility, and Euribor are the determinants of CDS spread dynamics in the Eurozone. Moreover, the positive impact of the crisis period on the creditworthiness of the Eurozone is meaningful.
Keywords: CDS, crude oil, interbank risk, LIBOR-OIS, OVX, sovereign credit risk, TED
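The staged linear models described above amount to OLS regressions of the CDS spread on successive factor sets. A minimal sketch of the final-stage form, with a crisis dummy, on synthetic monthly data (not the DataStream series used in the study):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 120  # monthly observations, matching the 2008-2019 span of the study

# Synthetic illustrative regressors, not the actual DataStream series
libor_ois = rng.normal(0.5, 0.2, n)          # Euro 3M LIBOR minus 3M OIS, in %
oil = rng.normal(70.0, 15.0, n)              # crude oil price, USD
crisis = (np.arange(n) < 24).astype(float)   # dummy = 1 for the first 24 months

# Synthetic CDS spread with a positive loading on LIBOR-OIS, negative on oil
cds = 80 + 60 * libor_ois - 0.5 * oil + 10 * crisis + rng.normal(0, 5, n)

# OLS estimation of the linear model: intercept, LIBOR-OIS, oil, crisis
X = np.column_stack([np.ones(n), libor_ois, oil, crisis])
beta, *_ = np.linalg.lstsq(X, cds, rcond=None)
print(np.round(beta, 2))  # estimated coefficients, in the column order above
```

On this synthetic data the estimated LIBOR-OIS coefficient comes out positive and the oil coefficient negative, mirroring the signs the abstract reports.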
Procedia PDF Downloads 144
1535 The Impact of Artificial Intelligence on Food Nutrition
Authors: Antonyous Fawzy Boshra Girgis
Abstract:
Nutrition labels are diet-related health policies. They help individuals improve food-choice decisions and reduce intake of calories and unhealthy food elements, like cholesterol. However, many individuals do not pay attention to nutrition labels or fail to understand them appropriately. According to the literature, thinking and cognitive styles can have significant effects on attention to nutrition labels. To the author's knowledge, the effect of global/local processing on attention to nutrition labels has not been previously studied. Global/local processing encourages individuals to attend to the whole/specific parts of an object and can have a significant impact on people's visual attention. In this study, this effect was examined with an experimental design using the eye-tracking technique. The research hypothesis was that individuals with local processing would pay more attention to nutrition labels, including nutrition tables and traffic lights. An experiment was designed with two conditions: global and local information processing. Forty participants were randomly assigned to either the global or the local condition, and their processing style was manipulated accordingly. Results supported the hypothesis for nutrition tables but not for traffic lights.
Keywords: nutrition, public health, SA Harvest, food, eye-tracking, nutrition labelling, global/local information processing, individual differences, mobile computing, cloud computing, nutrition label use, nutrition management, barcode scanning
Procedia PDF Downloads 40
1534 Creating Energy Sustainability in an Enterprise
Authors: John Lamb, Robert Epstein, Vasundhara L. Bhupathi, Sanjeev Kumar Marimekala
Abstract:
As we enter the new era of Artificial Intelligence (AI) and Cloud Computing, we rely heavily on the Machine Learning and Natural Language Processing capabilities of AI, and on energy-efficient hardware and software devices, in almost every industry sector. In these industry sectors, much emphasis is on developing new and innovative methods for producing and conserving energy and stemming the depletion of natural resources. The core pillars of sustainability are economic, environmental, and social, informally referred to as the 3 P's (People, Planet and Profits). The 3 P's play a vital role in creating a core sustainability model in the enterprise. Natural resources are continually being depleted, so there is more focus on, and growing demand for, renewable energy. With this growing demand, there is also a growing concern in many industries about how to reduce carbon emissions and conserve natural resources while adopting sustainability in corporate business models and policies. In our paper, we discuss the driving forces, such as climate change, natural disasters, pandemics, disruptive technologies, corporate policies, scaled business models, and emerging social media and AI platforms, that influence the three main pillars of sustainability (3 P's). Through this paper, we would like to bring an overall perspective on enterprise strategies, with a primary focus on bringing about the cultural shifts needed to adopt energy-efficient operational models.
Overall, many industries across the globe are incorporating core sustainability principles such as reducing energy costs, reducing greenhouse gas (GHG) emissions, reducing waste and increasing recycling, adopting advanced monitoring and metering infrastructure, and reducing server footprint and compute resources (shared IT services, cloud computing, and application modernization), with the vision of a sustainable environment.
Keywords: climate change, pandemic, disruptive technology, government policies, business model, machine learning and natural language processing, AI, social media platform, cloud computing, advanced monitoring, metering infrastructure
Procedia PDF Downloads 111
1533 e-Learning Security: A Distributed Incident Response Generator
Authors: Bel G Raggad
Abstract:
An e-Learning setting is a distributed computing environment whose information resources can be connected to any public network. Public networks are very insecure, which can compromise the reliability of an e-Learning environment. This study is only concerned with the intrusion detection aspect of e-Learning security and how incident responses are planned. The literature reports great advances in intrusion detection systems (IDS) but has neglected an important IDS weakness: suspected events are detected, but an intrusion is not determined because it is not defined in the IDS databases. We propose a distributed incident response generator (DIRG) that produces incident responses when the working IDS suspects an event that does not correspond to a known intrusion. Data involved in intrusion detection when ample uncertainty is present are often not suitable for formal statistical models, including Bayesian ones. We instead adopt Dempster-Shafer theory to process intrusion data for the unknown event. The DIRG engine transforms data into a belief structure using incident scenarios deduced by the security administrator. Belief values associated with the various incident scenarios are then derived and evaluated to choose the most appropriate scenario, for which an automatic incident response is generated. This article provides a numerical example demonstrating the working of the DIRG system.
Keywords: decision support system, distributed computing, e-Learning security, incident response, intrusion detection, security risk, stateful inspection
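The belief-structure step can be sketched with Dempster's rule of combination, which fuses two basic mass assignments over incident scenarios and renormalizes away the conflicting mass. The scenario names and mass values below are hypothetical, and this is not the DIRG engine itself:

```python
from itertools import product

def dempster_combine(m1, m2):
    """Dempster's rule: combine two mass assignments over frozenset focal elements."""
    combined = {}
    conflict = 0.0
    for (a, w1), (b, w2) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + w1 * w2
        else:
            conflict += w1 * w2          # mass assigned to contradictory scenarios
    if conflict >= 1.0:
        raise ValueError("total conflict: sources fully contradict each other")
    k = 1.0 - conflict                   # normalization constant
    return {s: w / k for s, w in combined.items()}

# Hypothetical incident scenarios for an unknown suspected event
A, B = frozenset({"worm"}), frozenset({"probe"})
theta = A | B                            # frame of discernment (either scenario)
m_sensor = {A: 0.6, theta: 0.4}          # evidence from the IDS sensor
m_admin  = {B: 0.3, theta: 0.7}          # evidence encoded by the security admin

belief = dempster_combine(m_sensor, m_admin)
print(round(belief[A], 3))               # 0.512: mass committed to the "worm" scenario
```

The scenario with the highest combined belief would then drive the automatic incident response.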
Procedia PDF Downloads 437
1532 Innovation in PhD Training in the Interdisciplinary Research Institute
Authors: B. Shaw, K. Doherty
Abstract:
The Cultural Communication and Computing Research Institute (C3RI) is a diverse multidisciplinary research institute including art, design, media production, communication studies, computing and engineering. Across these disciplines it can seem like there are enormous differences of research practice and convention, including differing positions on objectivity and subjectivity, certainty and evidence, and different political and ethical parameters. These differences sit within, often unacknowledged, histories, codes, and communication styles of specific disciplines, and it is all these aspects that can make understanding of research practice across disciplines difficult. To explore this, a one day event was orchestrated, testing how a PhD community might communicate and share research in progress in a multi-disciplinary context. Instead of presenting results at a conference, research students were tasked to articulate their method of inquiry. A working party of students from across disciplines had to design a conference call, visual identity and an event framework that would work for students across all disciplines. The process of establishing the shape and identity of the conference was revealing. Even finding a linguistic frame that would meet the expectations of different disciplines for the conference call was challenging. The first abstracts submitted either resorted to reporting findings, or only described method briefly. It took several weeks of supported intervention for research students to get ‘inside’ their method and to understand their research practice as a process rich with philosophical and practical decisions and implications. In response to the abstracts the conference committee generated key methodological categories for conference sessions, including sampling, capturing ‘experience’, ‘making models’, researcher identities, and ‘constructing data’. 
Each session involved presentations by visual artists, communications students and computing researchers, with inter-disciplinary dialogue facilitated by alumni Chairs. The apparently simple focus on method illuminated the research process as a site of creativity, innovation and discovery, and also built epistemological awareness, drawing attention to what is being researched and how it can be known. It was surprisingly difficult to limit students to discussing method, and it was apparent that the vocabulary available for method is sometimes limited. However, by focusing on method rather than results, the genuine process of research, rather than one constructed for approval, could be captured. In unlocking the twists and turns of planning and implementing research, and the impact of circumstance and contingency, students had to reflect frankly on successes and failures. This level of self- and public critique emphasised the degree of critical thinking and rigour required in executing research and demonstrated that honest reportage of research, faults and all, is good, valid research. The process also revealed the degree to which disciplines can learn from each other: the computing students gained insights from the sensitive social contextualizing generated by the communications and art and design students, and the art and design students gained understanding from the greater 'distance' and emphasis on application that the computing students applied to their subjects. Finding the means to develop dialogue across disciplines makes researchers better equipped to devise and tackle research problems across disciplines, potentially laying the ground for more effective collaboration.
Keywords: interdisciplinary, method, research student, training
Procedia PDF Downloads 206
1531 Phenotype Prediction of DNA Sequence Data: A Machine and Statistical Learning Approach
Authors: Mpho Mokoatle, Darlington Mapiye, James Mashiyane, Stephanie Muller, Gciniwe Dlamini
Abstract:
Great advances in high-throughput sequencing technologies have resulted in the availability of huge amounts of sequencing data in public and private repositories, enabling a holistic understanding of complex biological phenomena. Sequence data are used for a wide range of applications such as gene annotation, expression studies, personalized treatment and precision medicine. However, this rapid growth in sequence data poses a great challenge, which calls for novel data processing and analytic methods, as well as huge computing resources. In this work, a machine and statistical learning approach for DNA sequence classification based on a k-mer representation of sequence data is proposed. The approach is tested using whole genome sequences of Mycobacterium tuberculosis (MTB) isolates to (i) reduce the size of genomic sequence data, (ii) identify an optimum size of k-mers and utilize it to build classification models, (iii) predict the phenotype from the whole genome sequence data of a given bacterial isolate, and (iv) demonstrate the computing challenges associated with the analysis of whole genome sequence data in producing interpretable and explainable insights. The classification models were trained on 104 whole genome sequences of MTB isolates. Cluster analysis showed that k-mers may be used to discriminate phenotypes, and the discrimination becomes more concise as the size of the k-mers increases. The best performing classification model had a k-mer size of 10 (the longest k-mer) and an accuracy, recall, precision, specificity, and Matthews correlation coefficient of 72.0%, 80.5%, 80.5%, 63.6%, and 0.4, respectively. This study provides a comprehensive approach for resampling whole genome sequencing data, objectively selecting a k-mer size, and performing classification for phenotype prediction.
The analysis also highlights the importance of increasing the k-mer size to produce more biologically explainable results, which brings to the fore the interplay that exists amongst accuracy, computing resources and the explainability of classification results. Moreover, the analysis provides a new way to elucidate genetic information from genomic data and to identify phenotype relationships, which is important especially in explaining complex biological mechanisms.
Keywords: AWD-LSTM, bootstrapping, k-mers, next generation sequencing
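The k-mer representation at the heart of this approach can be sketched in a few lines: slide a window of length k across a sequence, count each substring, and arrange the counts into a fixed-order feature vector for a classifier. The toy sequence below is illustrative, not an MTB genome, and the vector shown is a simplified stand-in for the study's pipeline:

```python
from collections import Counter

def kmer_counts(seq, k):
    """Slide a window of length k across the sequence and count each k-mer."""
    return Counter(seq[i:i + k] for i in range(len(seq) - k + 1))

def kmer_vector(seq, k, vocab):
    """Fixed-order count vector usable as features for a classifier."""
    counts = kmer_counts(seq, k)
    return [counts.get(kmer, 0) for kmer in vocab]

# Toy sequence for illustration only
seq = "ATGCGATGCA"
counts = kmer_counts(seq, 3)
print(counts["ATG"])                 # 2: "ATG" occurs at positions 0 and 5
vocab = sorted(counts)               # shared vocabulary across all sequences
print(kmer_vector(seq, 3, vocab))
```

Larger k gives sparser but more specific features, which is the accuracy-versus-computing-resources trade-off the abstract describes.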
Procedia PDF Downloads 167
1530 Phenotype Prediction of DNA Sequence Data: A Machine and Statistical Learning Approach
Authors: Darlington Mapiye, Mpho Mokoatle, James Mashiyane, Stephanie Muller, Gciniwe Dlamini
Abstract:
Great advances in high-throughput sequencing technologies have resulted in the availability of huge amounts of sequencing data in public and private repositories, enabling a holistic understanding of complex biological phenomena. Sequence data are used for a wide range of applications such as gene annotation, expression studies, personalized treatment and precision medicine. However, this rapid growth in sequence data poses a great challenge, which calls for novel data processing and analytic methods, as well as huge computing resources. In this work, a machine and statistical learning approach for DNA sequence classification based on a k-mer representation of sequence data is proposed. The approach is tested using whole genome sequences of Mycobacterium tuberculosis (MTB) isolates to (i) reduce the size of genomic sequence data, (ii) identify an optimum size of k-mers and utilize it to build classification models, (iii) predict the phenotype from the whole genome sequence data of a given bacterial isolate, and (iv) demonstrate the computing challenges associated with the analysis of whole genome sequence data in producing interpretable and explainable insights. The classification models were trained on 104 whole genome sequences of MTB isolates. Cluster analysis showed that k-mers may be used to discriminate phenotypes, and the discrimination becomes more concise as the size of the k-mers increases. The best performing classification model had a k-mer size of 10 (the longest k-mer) and an accuracy, recall, precision, specificity, and Matthews correlation coefficient of 72.0%, 80.5%, 80.5%, 63.6%, and 0.4, respectively. This study provides a comprehensive approach for resampling whole genome sequencing data, objectively selecting a k-mer size, and performing classification for phenotype prediction.
The analysis also highlights the importance of increasing the k-mer size to produce more biologically explainable results, which brings to the fore the interplay that exists amongst accuracy, computing resources and the explainability of classification results. Moreover, the analysis provides a new way to elucidate genetic information from genomic data and to identify phenotype relationships, which is important especially in explaining complex biological mechanisms.
Keywords: AWD-LSTM, bootstrapping, k-mers, next generation sequencing
Procedia PDF Downloads 159
1529 Robust Recognition of Locomotion Patterns via Data-Driven Machine Learning in the Cloud Environment
Authors: Shinoy Vengaramkode Bhaskaran, Kaushik Sathupadi, Sandesh Achar
Abstract:
Human locomotion recognition is important in a variety of sectors, such as robotics, security, healthcare, fitness tracking and cloud computing. With the increasing pervasiveness of peripheral devices, particularly Inertial Measurement Unit (IMU) sensors, researchers have attempted to exploit these advancements in order to precisely and efficiently identify and categorize human activities. This research paper introduces a state-of-the-art methodology for the recognition of human locomotion patterns in a cloud environment, based on a publicly available benchmark dataset. The investigation implements a denoising and windowing strategy to deal with the unprocessed data. Next, feature extraction is adopted to abstract the main cues from the data, and the SelectKBest strategy is used to select optimal features. Furthermore, state-of-the-art ML classifiers, including logistic regression, random forest, gradient boosting and SVM, are investigated to evaluate the performance of the system and accomplish precise locomotion classification. Finally, a detailed comparative analysis of the results is presented to reveal the performance of the recognition models.
Keywords: artificial intelligence, cloud computing, IoT, human locomotion, gradient boosting, random forest, neural networks, body-worn sensors
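The windowing and feature-extraction stages of such a pipeline can be sketched as below: an IMU stream is split into fixed-size overlapping windows, and each window is summarized by simple statistics before feature selection and classification. The synthetic trace, window size and step are illustrative choices, not those of the benchmark dataset:

```python
import numpy as np

def sliding_windows(signal, size, step):
    """Split a 1-D sensor stream into fixed-size overlapping windows."""
    return np.array([signal[i:i + size]
                     for i in range(0, len(signal) - size + 1, step)])

def window_features(windows):
    """Per-window summary statistics commonly fed to ML classifiers."""
    return np.column_stack([
        windows.mean(axis=1),
        windows.std(axis=1),
        windows.min(axis=1),
        windows.max(axis=1),
    ])

# Synthetic accelerometer trace standing in for real IMU data
rng = np.random.default_rng(1)
accel = np.sin(np.linspace(0, 20, 200)) + rng.normal(0, 0.1, 200)

wins = sliding_windows(accel, size=50, step=25)   # 50-sample windows, 50% overlap
feats = window_features(wins)
print(wins.shape, feats.shape)                    # (7, 50) (7, 4)
```

The resulting feature matrix would then pass through SelectKBest and on to the classifiers named in the abstract.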
Procedia PDF Downloads 11
1528 Hybrid Genetic Approach for Solving Economic Dispatch Problems with Valve-Point Effect
Authors: Mohamed I. Mahrous, Mohamed G. Ashmawy
Abstract:
A hybrid genetic algorithm (HGA) is proposed in this paper to determine the economic scheduling of electric power generation over a fixed time period under various system and operational constraints. The proposed technique can outperform conventional genetic algorithms (CGAs) in the sense that the HGA makes it possible both to improve the quality of the solution and to reduce the computing expense. In contrast, any carefully designed GA is only able to balance the exploration and the exploitation of the search effort, which means that an increase in the accuracy of a solution can only occur at the sacrifice of convergence speed, and vice versa; it is unlikely that both can be improved simultaneously. The proposed hybrid scheme is developed in such a way that a simple GA acts as a base-level search, which makes a quick decision to direct the search towards the optimal region, and a local search method (the pattern search technique) is then employed to do the fine tuning. The aim of the strategy is to achieve the cost reduction within a reasonable computing time. The effectiveness of the proposed hybrid technique is verified on two real public electricity supply systems with 13 and 40 generator units, respectively. The simulation results obtained with the HGA for the two real systems are very encouraging with regard to the computational expense and the cost reduction of power generation.
Keywords: genetic algorithms, economic dispatch, pattern search
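For context, the valve-point effect makes the per-unit fuel cost non-smooth: a rectified sine ripple is added to the usual quadratic curve, F(P) = a + bP + cP² + |e·sin(f·(Pmin − P))|, which is what motivates a derivative-free local search such as pattern search. A sketch with illustrative coefficients for a single unit (not necessarily those of the 13- or 40-unit test systems):

```python
import math

def fuel_cost(p, a, b, c, e, f, p_min):
    """Fuel cost with the valve-point effect: rectified-sine ripple on a quadratic."""
    return a + b * p + c * p * p + abs(e * math.sin(f * (p_min - p)))

# Illustrative coefficients for one generating unit (assumed values)
a, b, c = 550.0, 8.1, 0.00028
e, f = 300.0, 0.035
p_min, p_max = 0.0, 680.0

# The GA/pattern-search objective sums this cost over all units for a candidate
# dispatch; here we evaluate a single unit generating 300 MW.
cost = fuel_cost(300.0, a, b, c, e, f, p_min)
print(round(cost, 2))
```

The absolute-value sine term introduces many local minima, so a GA locates the promising region and pattern search refines within it.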
Procedia PDF Downloads 444
1527 An Analysis of Innovative Cloud Model as Bridging the Gap between Physical and Virtualized Business Environments: The Customer Perspective
Authors: Asim Majeed, Rehan Bhana, Mak Sharma, Rebecca Goode, Nizam Bolia, Mike Lloyd-Williams
Abstract:
This study aims to investigate and explore the underlying causes of the security concerns of customers that emerged when WHSmith transformed its physical system into a virtualized business model through NetSuite. NetSuite is essentially fully integrated software which helps transform a physical system into a virtualized business model. Modern organisations are moving away from traditional business models to cloud-based models, and consequently customers expect a better, more secure and more innovative environment. Security is the vital issue of the modern age when transforming to virtualized, cloud-based models, and designers of interactive systems often misunderstand privacy, or even ignore it, thus causing concerns for users. A content analysis approach is used to collect qualitative data from 120 online bloggers and reviewers, including those on TRUSTPILOT. The results and findings provide useful new insights into the nature and form of the security concerns of online users after they have used the WHSmith services offered online through its website. The findings have theoretical as well as practical implications for the successful adoption of the cloud computing Business-to-Business model and similar systems.
Keywords: innovation, virtualization, cloud computing, organizational flexibility
Procedia PDF Downloads 384
1526 Islamic Banking Recovery Process and Its Parameters: A Practitioner’s Viewpoints in the Light of Humanising Financial Services
Authors: Muhammad Izzam Bin Mohd Khazar, Nur Adibah Binti Zainudin
Abstract:
Islamic banks, like other financial institutions, are highly required to maintain a prudent approach to ensure that any financing given is able to generate income for their respective shareholders. As customer payment defaults can occur within a financing arrangement, a prudent approach in the recovery process is a must to ensure that financing losses stay within acceptable limits. The objective of this research is to provide best practices of recovery that are anticipated to benefit both the bank and its customers. This study addresses issues arising in the current practice of the recovery process and then offers humanising recovery solutions in the light of the Maqasid Shariah. The study identified main issues pertaining to the Islamic recovery process, which can be categorized into knowledge crisis, process issues, specific treatment cases, and system issues. The knowledge crisis relates to the direct parties involved, including judges, solicitors and salespersons, while the recovery process issues include the issuance of reminders, foreclosure, and repossession of assets. Furthermore, special treatment for particular cases should also be observed, since different contracts in Islamic banking products will need different treatment. Finally, issues in the systems used in the recovery process remain unresolved, since the existing technology in this area is still too young to embrace Islamic finance requirements and its nature of calculation. In order to humanise financial services in the Islamic banking recovery process, we highlight four main recommendations to be implemented by Islamic Financial Institutions, namely: 1) early deterrence by improving awareness, 2) improvement of the internal process, 3) a reward mechanism, and 4) creative penalties to provide awareness to all stakeholders.
Keywords: humanizing financial services, Islamic finance, Maqasid Syariah, recovery process
Procedia PDF Downloads 205