Search results for: computing paradigm
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 1816

736 Approximate-Based Estimation of Single Event Upset Effect on Static Random-Access Memory-Based Field-Programmable Gate Arrays

Authors: Mahsa Mousavi, Hamid Reza Pourshaghaghi, Mohammad Tahghighi, Henk Corporaal

Abstract:

Recently, Static Random-Access Memory-based (SRAM-based) Field-Programmable Gate Arrays (FPGAs) have been widely used in aeronautics and space systems, where high dependability is demanded and considered a mandatory requirement. Since the design’s circuit is stored in configuration memory in SRAM-based FPGAs, they are very sensitive to Single Event Upsets (SEUs). In addition, the adverse effects of SEUs on the electronics used in space are much greater than on Earth. Thus, developing fault-tolerant techniques plays a crucial role in the use of SRAM-based FPGAs in space. However, fault tolerance techniques introduce additional penalties in system parameters, e.g., area, power, performance, and design time. In this paper, an accurate estimation of configuration memory vulnerability to SEUs is proposed for approximate-tolerant applications. This vulnerability estimation is highly needed for trading off the overhead introduced by fault tolerance techniques against system robustness. We study applications in which the exact final output value is not necessarily always a concern, meaning that some of the SEU-induced changes in output values are negligible. We therefore define and propose an Approximate-based Configuration Memory Vulnerability Factor (ACMVF) estimation to avoid overestimating configuration memory vulnerability to SEUs. We assess the vulnerability of configuration memory by injecting SEUs into configuration memory bits and comparing the output values of a given circuit in the presence of SEUs with the expected correct output. Unlike conventional vulnerability factor calculation methods, which count any deviation from the expected value as a failure, our proposed method considers a threshold margin that depends on the use-case application. Given the proposed threshold margin in our model, a failure occurs only when the difference between the erroneous output value and the expected output value exceeds this margin. The ACMVF is subsequently calculated as the ratio of failures to the total number of SEU injections. A test bench for emulating SEUs and calculating the ACMVF is implemented on a Zynq-7000 FPGA platform. This system makes use of the Single Event Mitigation (SEM) IP core to inject SEUs into configuration memory bits of the target design implemented in the Zynq-7000 FPGA. Experimental results for a 32-bit adder show that, when 1% to 10% deviation from the correct output is considered acceptable, the number of counted failures is reduced by 41% to 59% compared with the number counted by conventional vulnerability factor calculation. This means that the estimation accuracy of the configuration memory vulnerability to SEUs is improved by up to 58% in the case that 10% deviation is acceptable in the output results. Note that less than 10% deviation in an addition result is reasonably tolerable for many applications in the approximate computing domain, such as Convolutional Neural Networks (CNNs).
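
A minimal sketch of the failure criterion and ACMVF ratio described above (not the authors’ test bench), assuming a list of SEU-affected outputs and a single expected output value:

```python
def acmvf(injected_outputs, expected_output, margin_fraction=0.10):
    """Approximate-based Configuration Memory Vulnerability Factor.

    A failure is counted only when the SEU-affected output deviates from
    the expected output by more than the user-defined threshold margin.
    """
    margin = abs(expected_output) * margin_fraction
    failures = sum(1 for out in injected_outputs
                   if abs(out - expected_output) > margin)
    return failures / len(injected_outputs)

# Example with made-up adder outputs collected after three SEU injections:
print(acmvf([1000, 1004, 1800], expected_output=1000, margin_fraction=0.10))  # 1/3
```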

Keywords: fault tolerance, FPGA, single event upset, approximate computing

Procedia PDF Downloads 198
735 Advanced Digital Manufacturing: Case Study

Authors: Abdelrahman Abdelazim

Abstract:

Most industries are looking for technologies that are easy to use, efficient, and fast. To achieve this, factories tend to use advanced systems that can turn complexity into simplicity and the rudimentary into the advanced. Cloud manufacturing is a new movement that aims to mirror and integrate cloud computing into manufacturing. Among the various advantages of cloud manufacturing are decreased human involvement and increased reliance on automated machines, which in turn decreases human errors and increases efficiency. Reliable, high-performing processes with minimal errors are highly desired by today’s manufacturers. At first glance it seems to be the best alternative; however, the implementation of a cloud system can be very challenging. This work investigates cloud manufacturing in detail, outlining its advantages and disadvantages by converting a local factory in Kuwait to a cloud-ready system. Initially, the flow of the factory’s manufacturing process was analyzed, identifying the bottlenecks and illustrating how cloud manufacturing can eliminate them. Following this, an automation process was analyzed and implemented. A comparison between the process before and after the adaptation was carried out, showing the effects on the cost, the output, and the efficiency of the process.

Keywords: cloud manufacturing, automation, Kuwait industrial sector, advanced digital manufacturing

Procedia PDF Downloads 771
734 From the Sharing Economy to Social Manufacturing: Analyzing Collaborative Service Networks in the Manufacturing Domain

Authors: Babak Mohajeri

Abstract:

In recent years, the conventional business model of ownership has shifted towards accessibility in a variety of markets. Two trends can be observed in the evolution of this rental-like business model. The first is technological development, which enables the emergence of new business models; these new business models are increasingly agile and flexible. For example, Spotify, an online music streaming company, provides consumers with access to millions of music tracks, conveniently through a smartphone, tablet, or computer. Similarly, Car2Go, a car-sharing company, provides its members with flexible access to nearby shared cars. The second trend is the increase in communication and connections via social networks. This trend enables a shift to peer-to-peer, accessibility-based business models. Conventionally, companies provide their customers with access to the companies’ own products or services. In the peer-to-peer model, however, companies facilitate access and connections among their customers so that they can use property, skills, competencies, or services owned by other customers. This is the so-called sharing economy business model. The aim of this study is to investigate a new and emerging type of sharing economy model in which the roles of customers and service providers may change dramatically. This new model is called Collaborative Service Networks. We propose a mechanism for the Collaborative Service Networks business model. Uber and Airbnb, two successful, growing companies, have been selected for our case studies, and their business models are analyzed. Finally, we study the emergence of collaborative service networks in the manufacturing domain. Our findings point to a new manufacturing paradigm called social manufacturing.

Keywords: sharing economy, collaborative service networks, social manufacturing, manufacturing development

Procedia PDF Downloads 317
733 Cybersecurity Challenges in the Era of Open Banking

Authors: Krish Batra

Abstract:

The advent of open banking has revolutionized the financial services industry by fostering innovation, enhancing customer experience, and promoting competition. However, this paradigm shift towards more open and interconnected banking ecosystems has introduced complex cybersecurity challenges. This research paper delves into the multifaceted cybersecurity landscape of open banking, highlighting the vulnerabilities and threats inherent in sharing financial data across a network of banks and third-party providers. Through a detailed analysis of recent data breaches, phishing attacks, and other cyber incidents, the paper assesses the current state of cybersecurity within the open banking framework. It examines the effectiveness of existing security measures, such as encryption, API security protocols, and authentication mechanisms, in protecting sensitive financial information. Furthermore, the paper explores the regulatory response to these challenges, including the implementation of standards such as PSD2 in Europe and similar initiatives globally. By identifying gaps in current cybersecurity practices, the research aims to propose a set of robust, forward-looking strategies that can enhance the security and resilience of open banking systems. This includes recommendations for banks, third-party providers, regulators, and consumers on how to mitigate risks and ensure a secure open banking environment. The ultimate goal is to provide stakeholders with a comprehensive understanding of the cybersecurity implications of open banking and to outline actionable steps for safeguarding the financial ecosystem in an increasingly interconnected world.

Keywords: open banking, financial services industry, cybersecurity challenges, data breaches, phishing attacks, encryption, API security protocols, authentication mechanisms, regulatory response, PSD2, cybersecurity practices

Procedia PDF Downloads 60
732 Analyzing the Quality of Cloud-Based E-Learning Systems on the Perception of the Learners and the Teachers

Authors: R. W. C. Devindi, S. M. Buddika Harshanath

Abstract:

E-learning is a widely used technology for learning in the modern world, and with the pandemic its popularity has increased considerably. E-learning systems usually require software as well as hardware resources, but it is hard for most educational institutions to afford them. Also, with a massive user load, e-learning systems have to expand their server-side resources. Therefore, cloud computing has been adopted in order to make e-learning systems more efficient. The researcher has analyzed the quality of e-learning systems from the perception of the learners and the teachers with the aid of hypothesis testing, and the analyzed results and discussion are given in this report. Future research will therefore be able to take steps to further increase the quality of online learning systems. In the case of e-learning, quality assurance and cost effectiveness are essential. A complex quality assurance system is used in the stated project. There are no well-defined standard evaluation measures in this field. As a result, accurately assessing an e-learning system’s overall quality is challenging. The researcher has done the analysis with the aid of standard methods and software.

Keywords: LMS–learning management system, SPSS–statistical package for social sciences (software), eigenvalue, hypothesis

Procedia PDF Downloads 107
731 Forming-Free Resistive Switching Effect in ZnₓTiᵧHfzOᵢ Nanocomposite Thin Films for Neuromorphic Systems Manufacturing

Authors: Vladimir Smirnov, Roman Tominov, Vadim Avilov, Oleg Ageev

Abstract:

The creation of a new generation of micro- and nanoelectronic elements opens up wide possibilities for improving the parameters of electronic devices, as well as for developing neuromorphic computing systems. Interest in the latter is growing every year, which is explained by the need to solve problems related to the classification of unstructured data, the construction of self-adaptive systems, and pattern recognition. However, for its technical implementation, a number of conditions on the basic parameters of electronic memory must be fulfilled, such as non-volatility, multi-bit operation, high integration density, and low power consumption. Several types of memory are present in the electronics industry (MRAM, FeRAM, PRAM, ReRAM), among which non-volatile resistive memory (ReRAM) is especially distinguished due to its multi-bit property, which is necessary for neuromorphic systems manufacturing. ReRAM is based on the effect of resistive switching: a change in the resistance of the oxide film between a low-resistance state (LRS) and a high-resistance state (HRS) under an applied electric field. One of the methods for the technical implementation of neuromorphic systems is crossbar structures, which consist of ReRAM cells interconnected by crossing data buses. Such a structure imitates the architecture of the biological brain, which contains low-power computing elements (neurons) connected by special channels (synapses). The choice of the ReRAM oxide film material is an important task that determines the characteristics of the future neuromorphic system. An analysis of the literature showed that many metal oxides (TiO2, ZnO, NiO, ZrO2, HfO2) exhibit a resistive switching effect. It is worth noting that the manufacture of nanocomposites based on these materials makes it possible to emphasize the advantages and suppress the disadvantages of each material. Therefore, the ZnₓTiᵧHfzOᵢ nanocomposite was chosen as the basis for manufacturing the neuromorphic structures. It is also worth noting that the ZnₓTiᵧHfzOᵢ nanocomposite does not need electroforming, a step which degrades the parameters of the formed ReRAM elements. Currently, this material is not well studied; therefore, the study of the resistive switching effect in the forming-free ZnₓTiᵧHfzOᵢ nanocomposite is an important task and the goal of this work. A forming-free nanocomposite ZnₓTiᵧHfzOᵢ thin film was grown by pulsed laser deposition (Pioneer 180, Neocera Co., USA) on a SiO2/TiN (40 nm) substrate. Electrical measurements were carried out using a semiconductor characterization system (Keithley 4200-SCS, USA) with W probes. During the measurements, the TiN film was grounded. The analysis of the obtained current-voltage characteristics showed resistive switching from the HRS to the LRS at +1.87±0.12 V, and from the LRS to the HRS at -2.71±0.28 V. An endurance test showed that the HRS was 283.21±32.12 kΩ and the LRS was 1.32±0.21 kΩ over 100 measurements. The HRS/LRS ratio was about 214.55 at a reading voltage of 0.6 V. The results can be useful for applying forming-free nanocomposite ZnₓTiᵧHfzOᵢ films in neuromorphic systems manufacturing. This work was supported by RFBR, according to the research project № 19-29-03041 mk. The results were obtained using the equipment of the Research and Education Center «Nanotechnologies» of Southern Federal University.
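
The reported memory window can be checked with simple arithmetic; the snippet below only reproduces the mean values quoted above:

```python
hrs_kohm = 283.21   # mean high-resistance state over 100 endurance measurements
lrs_kohm = 1.32     # mean low-resistance state over 100 endurance measurements

print(f"HRS/LRS window at 0.6 V read voltage: {hrs_kohm / lrs_kohm:.2f}")  # ~214.55
```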

Keywords: nanotechnology, nanocomposites, neuromorphic systems, RRAM, pulsed laser deposition, resistive switching effect

Procedia PDF Downloads 132
730 Development of Programmed Cell Death Protein 1 Pathway-Associated Prognostic Biomarkers for Bladder Cancer Using Transcriptomic Databases

Authors: Shu-Pin Huang, Pai-Chi Teng, Hao-Han Chang, Chia-Hsin Liu, Yung-Lun Lin, Shu-Chi Wang, Hsin-Chih Yeh, Chih-Pin Chuu, Jiun-Hung Geng, Li-Hsin Chang, Wei-Chung Cheng, Chia-Yang Li

Abstract:

The emergence of immune checkpoint inhibitors (ICIs) targeting proteins such as PD-1 and PD-L1 has changed the treatment paradigm of bladder cancer. However, not all patients benefit from ICIs, with some experiencing early death. There is a significant need for biomarkers associated with the PD-1 pathway in bladder cancer. Current biomarkers focus on tumor PD-L1 expression, but a more comprehensive understanding of PD-1-related biology is needed. Our study has developed a seven-gene risk score panel, employing a comprehensive bioinformatics strategy, which could serve as a potential prognostic and predictive biomarker for bladder cancer. This panel incorporates the FYN, GRAP2, TRIB3, MAP3K8, AKT3, CD274, and CD80 genes. Additionally, we examined the relationship between this panel and immune cell function, utilizing validated tools such as ESTIMATE, TIDE, and CIBERSORT. Our seven-gene panel has been found to be significantly associated with bladder cancer survival in two independent cohorts. The panel was also significantly correlated with tumor-infiltrating lymphocytes, immune scores, and tumor purity. These factors have been previously reported to have clinical implications for ICIs. The findings suggest the potential of a PD-1 pathway-based transcriptomic panel as a prognostic and predictive biomarker in bladder cancer, which could help optimize treatment strategies and improve patient outcomes.
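
A gene-expression risk score of this kind is typically a weighted sum of per-gene expression values. The sketch below is a generic illustration only; the coefficients and expression values are made up, since the abstract does not report the study’s actual weights:

```python
# Hypothetical coefficients for the seven PD-1 pathway genes; the study's
# actual weights are not reported in the abstract.
PANEL = {"FYN": 0.42, "GRAP2": -0.31, "TRIB3": 0.27, "MAP3K8": 0.18,
         "AKT3": -0.22, "CD274": 0.35, "CD80": -0.15}

def risk_score(expression):
    """Linear risk score: sum of coefficient * normalized expression per gene."""
    return sum(coef * expression[gene] for gene, coef in PANEL.items())

# A patient above the cohort's median score would fall into the high-risk group.
patient = {"FYN": 1.2, "GRAP2": 0.4, "TRIB3": 0.9, "MAP3K8": 0.1,
           "AKT3": 0.7, "CD274": 1.5, "CD80": 0.3}   # placeholder expression values
print(round(risk_score(patient), 3))
```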

Keywords: bladder cancer, programmed cell death protein 1, prognostic biomarker, immune checkpoint inhibitors, predictive biomarker

Procedia PDF Downloads 78
729 Beyond Classic Program Evaluation and Review Technique: A Generalized Model for Subjective Distributions with Flexible Variance

Authors: Byung Cheol Kim

Abstract:

The Program Evaluation and Review Technique (PERT) is widely used for project management, but it struggles with subjective distributions, particularly due to its assumptions of constant variance and light tails. To overcome these limitations, we propose the Generalized PERT (G-PERT) model, which enhances PERT by incorporating variability in three-point subjective estimates. Our methodology extends the original PERT model to cover the full range of unimodal beta distributions, enabling the model to handle thick-tailed distributions and offering formulas for computing mean and variance. This maintains the simplicity of PERT while providing a more accurate depiction of uncertainty. Our empirical analysis demonstrates that the G-PERT model significantly improves performance, particularly when dealing with heavy-tail subjective distributions. In comparative assessments with alternative models such as triangular and lognormal distributions, G-PERT shows superior accuracy and flexibility. These results suggest that G-PERT offers a more robust solution for project estimation while still retaining the user-friendliness of the classic PERT approach.
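
For reference, the classic PERT three-point estimate that G-PERT generalizes relies on the standard beta-approximation formulas sketched below; the abstract does not give the G-PERT formulas themselves, so only the classic case is shown:

```python
def pert_mean_variance(optimistic, most_likely, pessimistic):
    """Classic PERT estimates under the usual beta approximation:
    mean = (a + 4m + b) / 6, variance = ((b - a) / 6) ** 2."""
    mean = (optimistic + 4 * most_likely + pessimistic) / 6
    variance = ((pessimistic - optimistic) / 6) ** 2
    return mean, variance

# Example activity estimate in days.
print(pert_mean_variance(2, 5, 14))  # (6.0, 4.0)
```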

Keywords: PERT, subjective distribution, project management, flexible variance

Procedia PDF Downloads 18
728 Spatial-Temporal Awareness Approach for Extensive Re-Identification

Authors: Tyng-Rong Roan, Fuji Foo, Wenwey Hseush

Abstract:

Recent developments in AI and edge computing play a critical role in capturing meaningful events, such as the detection of an unattended bag. One of the core problems is re-identification across multiple CCTVs. Immediately following the detection of a meaningful event, the task is to track and trace the objects related to it. In an extensive environment, the challenge becomes severe when the number of CCTVs increases substantially, imposing difficulties in achieving high accuracy while maintaining real-time performance. The algorithm that re-identifies cross-boundary objects for extensive tracking is referred to as Extensive Re-Identification, which emphasizes the issues related to the complexity behind a great number of CCTVs. The Spatial-Temporal Awareness approach challenges the conventional thinking and concept of operations, which is labor-intensive and time-consuming. The ability to perform Extensive Re-Identification through a multi-sensory network provides next-level insights, creating value beyond traditional risk management.

Keywords: long-short-term memory, re-identification, security critical application, spatial-temporal awareness

Procedia PDF Downloads 112
727 Qualitative Case Studies in Reading Specialist Education

Authors: Carol Leroy

Abstract:

This presentation focuses on the analysis of qualitative case studies in the graduate education of reading specialists. The presentation describes the development and application of an integrated conceptual framework for reading specialist education, drawing on Robert Stake’s work on case study research, Kenneth Zeichner’s work on professional learning, and various tools for reading assessment (e.g. the Qualitative Reading Inventory). Social constructivist theory is used to provide intersecting links between the various influences on the processes used to assess and teach reading within the case study framework. Illustrative examples are described to show the application of the framework in reading specialist education in a teaching clinic at a large urban university. Central to the education of reading specialists in this teaching clinic is the collection, analysis, and interpretation of data for the design and implementation of reading and writing programs for struggling readers and writers. The case study process involves the integrated interpretation of data, which is central to qualitative case study inquiry. An emerging theme in this approach to graduate education is the ambiguity and uncertainty that governs work with the adults and children who attend the clinic for assistance. Tensions and contradictions are explored insofar as they reveal overlapping but intersecting frameworks for case study analysis in the area of literacy education. An additional theme is the interplay of multiple layers of data, with a resulting depth that goes beyond the practical need of the client and toward the deeper pedagogical growth of the reading specialist. The presentation makes a case for the value of qualitative case studies in reading specialist education. Further, the use of social constructivism as a unifying paradigm provides robustness to the conceptual framework as a tool for understanding the pedagogy involved.

Keywords: assessment, case study, professional education, reading

Procedia PDF Downloads 458
726 English as a Medium of Instruction in Algerian Higher Business Degree Programmes

Authors: Sidi Ahmed Berrabah

Abstract:

English as a Medium of Instruction (EMI) is expanding rapidly across the world. A growing volume of research has been dedicated to investigating its introduction, with findings that describe a complex picture and suggest that the practicality and effectiveness of EMI are still subjects of debate. However, considerably less attention has been given to understanding EMI in contexts where its introduction has been discussed but not yet put into practice. One such context is Algeria, where discourses about a potential introduction of EMI have been going on for some time. It is likely that the first courses in which EMI is introduced will be Business degree programmes. This study aims to examine the current discourses and attitudes towards the potential implementation of EMI and the language practices in Business degree programmes in three Algerian universities. The research is conducted in three different universities in three different regions of Algeria, with the aim of including both ‘centre’ and ‘periphery’ Algerian universities. In order to achieve these aims, a mixed research paradigm is used: questionnaires, semi-structured interviews, and classroom observations are used to gather data from three participant cohorts: university students of Business, lecturers of Business, and lecturers of English for specific purposes. The findings showed that students and lecturers of Business are in favour of the introduction of English, instead of French or Standard Arabic, as a medium of instruction. The reason is that English is seen as having internationalisation and instrumental benefits, while French is seen as too closely linked to the colonial history of the country. The favourable attitudes towards EMI, however, seem to contrast with the daily classroom practices at the departments of Business studies, where students and lecturers make practical choices about using their language repertoire based on their linguistic background and skills. Classrooms in the three Algerian universities featured fluid and translanguaging practices that cannot be reduced to a monolingual EMI policy.

Keywords: EMI, Algerian universities, business degree programmes, translanguaging

Procedia PDF Downloads 213
725 Impact of Lifelong-Learning Mindset on Career Success of the Accounting and Finance Professionals

Authors: R. W. A. V. A. Wijenayake, P. M. R. N. Fernando, S. Nilesh, M. D. G. M. S. Diddeniya, M. Weligodapola, P. Shamila

Abstract:

The study is designed to examine the impact of a lifelong learning mindset on the career success of accounting and finance professionals in the Western Province of Sri Lanka. The main objective of this study is to identify how the lifelong learning mindset impacts the career success of accounting and finance professionals. The lifelong learning mindset is the desire to learn new things; curiosity, resilience, and strategic thinking are the constructs selected to measure it. Career success refers to objective and emotional measures of improvement in one’s work life, and the related variable of career success is measured through the number of promotions granted during one’s work life. Positivism is the research paradigm, and a deductive approach is used, as this study relies on testing an existing theory. Accounting and finance professionals in the Western Province of Sri Lanka were selected because most reputed international and local companies, and specifically the headquarters of most companies, are located in the Western Province. Responses cannot be collected from the whole population; therefore, this study used a simple random sampling method with a sample size of 120. To identify the impact, a 5-point Likert scale was used to collect the quantitative data, which were gathered through an online questionnaire. The final outputs of the study will offer important recommendations to several parties, such as universities, undergraduates, companies, and policymakers, on how to support students and employees mentally and financially and motivate them to continue their studies after completing their degrees.

Keywords: career success, curiosity, lifelong learning mindset, resilience, strategic thinking

Procedia PDF Downloads 86
724 A Resource Optimization Strategy for CPU (Central Processing Unit) Intensive Applications

Authors: Junjie Peng, Jinbao Chen, Shuai Kong, Danxu Liu

Abstract:

Under traditional resource allocation strategies, the usage of resources on physical servers in a cloud data center is highly uncertain: resources are wasted if too few tasks are assigned, while servers are overloaded if too many tasks are assigned. This is especially obvious when the applications are of the same type, because of their shared resource preferences. Considering that CPU-intensive applications are one of the most common types of application in the cloud, we studied an optimization strategy for CPU-intensive applications running on the same server. We used resource preferences to analyze the case in which multiple CPU-intensive applications run simultaneously, and put forward a model that can predict the execution time of CPU-intensive applications running simultaneously. Based on the prediction model, we propose a method to select the appropriate number of applications for a machine. Experiments show that the model can predict the execution time accurately for CPU-intensive applications. To improve the execution efficiency of applications, we also propose a priority-based scheduling model for CPU-intensive applications. Extensive experiments verify the validity of the scheduling model.
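
The abstract does not state the form of the prediction model, so the sketch below assumes a simple linear interference model purely for illustration: each co-located CPU-intensive application stretches execution time by a fixed factor, and the selection method picks the largest co-location count that still meets a deadline:

```python
def predict_runtime(solo_runtime_s, n_apps, interference=0.15):
    """Hypothetical model: runtime grows linearly with the number of
    co-located CPU-intensive applications (interference is an assumed slope)."""
    return solo_runtime_s * (1 + interference * (n_apps - 1))

def max_apps_within_deadline(solo_runtime_s, deadline_s, interference=0.15):
    """Pick the largest number of co-located applications whose predicted
    runtime still meets the deadline."""
    n = 1
    while predict_runtime(solo_runtime_s, n + 1, interference) <= deadline_s:
        n += 1
    return n

print(max_apps_within_deadline(solo_runtime_s=100, deadline_s=160))  # 5
```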

Keywords: cloud computing, CPU intensive applications, resource optimization, strategy

Procedia PDF Downloads 278
723 A Genre Analysis of University Lectures

Authors: Lee Kok Yueh, Fatin Hamadah Rahman, David Hassell, Au Thien Wan

Abstract:

This work reports on a genre-based study of lectures at a university in Brunei, Universiti Teknologi Brunei, to explore their communicative functions and to gain insight into the discourse. It explores these in three different domains: Social Science, Engineering, and Computing. Audio recordings from four lecturers, comprising 20 lectures, were transcribed and analysed, with the duration of each lecture varying between 20 and 90 minutes. This qualitative study found patterns and functions of lectures similar to those found in existing research, among which are greetings, housekeeping, and recapping of previous lectures in the lecture introductions. In the lecture content, comprehension checks and the use of examples or analogies are very prevalent. However, the use of examples largely depends on the lecture content: the more technical the content, the harder it was for lecturers to provide examples or analogies. Three functional moves are identified in the lecture conclusions: announcement, summary, and future plan, all of which are optional. Despite the relatively small sample size, the present study shows that lectures are interactive and that there are some consistencies in lecture delivery in relation to the communicative functions and genre of the lecture.

Keywords: communicative functions, genre analysis, higher education, lectures

Procedia PDF Downloads 191
722 Design and Development of Data Mining Application for Medical Centers in Remote Areas

Authors: Grace Omowunmi Soyebi

Abstract:

Data mining is the extraction of information from a large database that helps in predicting a trend or behavior, thereby helping management make knowledge-driven decisions. One principal problem of most hospitals in rural areas is their use of a manual file management system for keeping records. A lot of time is wasted when a patient visits the hospital, possibly in an emergency, and the nurse or attendant has to search through voluminous files before the patient’s file can be retrieved; this delay may cause something unexpected to happen to the patient. This data mining application is to be designed using a structured systems analysis and design method, which will help in a well-articulated analysis of the existing file management system, a feasibility study, and proper documentation of the design and implementation of a computerized medical record system. This computerized system will replace the file management system and help to easily retrieve a patient’s record with increased data security, provide access to clinical records for decision-making, and reduce the time it takes for a patient to be attended to.

Keywords: data mining, medical record system, systems programming, computing

Procedia PDF Downloads 209
721 Flexible Cities: A Multisided Spatial Application of Tracking Livability of Urban Environment

Authors: Maria Christofi, George Plastiras, Rafaella Elia, Vaggelis Tsiourtis, Theocharis Theocharides, Miltiadis Katsaros

Abstract:

The rapidly expanding urban areas of the world constitute a challenge of how we make the transition to "the next urbanization", which will be defined by new analytical tools and new sources of data. This paper is about the production of a spatial application, the ‘FUMapp’, where space and its initiative will be available literally, in meters, but also abstractly, at a sensed level. While existing spatial applications typically focus on illustrations of the urban infrastructure, the suggested application goes beyond them: it investigates how our perception of the environment adapts to alterations of the built environment, through the construction of a dataset of biophysical measurements (eye tracking, heartbeat) and physical metrics (spatial characteristics, size of stimuli, rhythm of mobility). It explores the intersections between architecture, cognition, and computing, where future design can be improved, and identifies the flexibility and livability of the ‘available space’ of specific examined urban paths.

Keywords: biophysical data, flexibility of urban, livability, next urbanization, spatial application

Procedia PDF Downloads 142
720 Automatic Classification of Periodic Heart Sounds Using Convolutional Neural Network

Authors: Jia Xin Low, Keng Wah Choo

Abstract:

This paper presents an automatic normal and abnormal heart sound classification model developed based on a deep learning algorithm. MITHSDB heart sound datasets obtained from the 2016 PhysioNet/Computing in Cardiology Challenge database were used in this research, with the assumption that the electrocardiograms (ECG) were recorded simultaneously with the heart sounds (phonocardiogram, PCG). The PCG time series are segmented per heartbeat, and each sub-segment is converted to form a square intensity matrix and classified using convolutional neural network (CNN) models. This approach removes the need to provide classification features for the supervised machine learning algorithm; instead, the features are determined automatically through training, from the time series provided. The results show that the prediction model is able to provide reasonable and comparable classification accuracy despite its simple implementation. This approach can be used for real-time classification of heart sounds in the Internet of Medical Things (IoMT), e.g. remote monitoring applications of the PCG signal.
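
A minimal sketch of the described per-beat pipeline is shown below: each PCG beat segment is resampled into a square intensity matrix and passed to a small CNN. The 64x64 matrix size and the layer configuration are assumptions for illustration, not the authors’ architecture:

```python
import numpy as np
import torch
import torch.nn as nn

def beat_to_square(segment, side=64):
    """Resample one per-beat PCG segment to side*side samples and reshape
    it into a square intensity matrix."""
    resampled = np.interp(np.linspace(0, len(segment) - 1, side * side),
                          np.arange(len(segment)), segment)
    return resampled.reshape(side, side).astype(np.float32)

class HeartSoundCNN(nn.Module):
    """Small illustrative CNN for normal/abnormal classification."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 8, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(8, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.classifier = nn.Linear(16 * 16 * 16, 2)  # 64 -> 32 -> 16 after pooling

    def forward(self, x):
        x = self.features(x)
        return self.classifier(x.flatten(1))

segment = np.random.randn(1800)                             # placeholder PCG beat
x = torch.from_numpy(beat_to_square(segment))[None, None]   # shape (1, 1, 64, 64)
print(HeartSoundCNN()(x).shape)                             # torch.Size([1, 2])
```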

Keywords: convolutional neural network, discrete wavelet transform, deep learning, heart sound classification

Procedia PDF Downloads 348
719 Social Networks Global Impact on Protest Movements and Human Rights Activism

Authors: Marcya Burden, Savonna Greer

Abstract:

In the wake of social unrest around the world, protest movements have been captured like never before. As protest movements have evolved, so too have their visibility and sources of coverage. Long gone are the days of print media as our only glimpse into the action surrounding a protest. Now, with social networks such as Facebook, Instagram, and Snapchat, we have access to real-time video footage of protest movements and human rights activism that can reach millions of people within seconds. This research paper investigated the statistical usage data of various social media network platforms in the areas of human rights activism and protest movements, drawing parallels with past forms of media coverage. This research demonstrates that social networks are extremely important to protest movements and human rights activism. With over 2.9 billion users across social media networks globally, these platforms are at the heart of most recent protests and human rights activism. This research shows the paradigm shift from the Selma March of 1965 to the more recent protests of Ferguson in 2014, Ni Una Menos in 2015, and End SARS in 2018. The research findings demonstrate that today, almost anyone may use their social networks to become protest movement leaders and human rights activists. From a student to an 80-year-old professor, the possibility of reaching billions of people all over the world is limitless. Findings show that 82% of the world’s internet population is on social networks one in every five minutes. Over 65% of Americans believe social media highlights important issues. Thus, there is no need to have a formalized group of people or even to be known online. A person simply needs to be engaged on their respective social media networks (Facebook, Twitter, Instagram, Snapchat) regarding any cause they are passionate about. Information may be exchanged in real time around the world, and a successful protest can begin.

Keywords: activism, protests, human rights, networks

Procedia PDF Downloads 95
718 Research on the Aero-Heating Prediction Based on Hybrid Meshes and Hybrid Schemes

Authors: Qiming Zhang, Youda Ye, Qinxue Jiang

Abstract:

Accurate prediction of the external flowfield and the aero-heating at the wall of a hypersonic vehicle is crucial for aircraft design. Unstructured/hybrid meshes have more powerful advantages than structured meshes in terms of pre-processing, parallel computing, and mesh adaptation, so it is imperative to develop high-resolution numerical methods for the calculation of the aerothermal environment on unstructured/hybrid meshes. The inviscid flux scheme is one of the most important factors affecting the accuracy of heat flux calculation on unstructured/hybrid meshes. Here, a new hybrid flux scheme is developed and an approach for interface type selection is proposed: 1) the exact Riemann solution is used to calculate the flux on the faces parallel to the wall; 2) the Steger-Warming (S-W) scheme is employed to improve the stability of the numerical scheme at the other interfaces. The computed heat flux fits the experimentally observed values and shows little dependence on the grid, which indicates great application prospects for unstructured/hybrid meshes.
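
A minimal sketch of the interface-type selection logic described above; the two flux routines are placeholders standing in for the exact Riemann solver and the Steger-Warming scheme, and the alignment tolerance is an assumed parameter:

```python
import numpy as np

def exact_riemann_flux(left_state, right_state, normal):
    raise NotImplementedError  # placeholder for the exact Riemann solver

def steger_warming_flux(left_state, right_state, normal):
    raise NotImplementedError  # placeholder for the Steger-Warming (S-W) scheme

def hybrid_flux(left_state, right_state, face_normal, wall_normal, tol=1e-3):
    """Select the inviscid flux scheme per face: faces parallel to the wall
    (face normal aligned with the wall normal) use the exact Riemann solution;
    all other faces use the Steger-Warming scheme for robustness."""
    n = face_normal / np.linalg.norm(face_normal)
    w = wall_normal / np.linalg.norm(wall_normal)
    if abs(abs(np.dot(n, w)) - 1.0) < tol:      # face parallel to the wall
        return exact_riemann_flux(left_state, right_state, n)
    return steger_warming_flux(left_state, right_state, n)
```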

Keywords: aero-heating prediction, computational fluid dynamics, hybrid meshes, hybrid schemes

Procedia PDF Downloads 249
717 In Exploring Local Community Empowerment and Participation in Blue Tourism Activities

Authors: Philasande Runeli, Lynn Jonas

Abstract:

Empowerment suggests that participation, working collaboratively towards shared objectives, obtaining resources, and critically analysing one’s social and political differences are all necessary steps in the empowering process. The aim of leadership empowerment is to give a team the resources and encouragement they need to work more productively together. This study explores potential ways to increase local empowerment and participation in blue tourism activities in an urban coastal context in South Africa. Blue tourism, which refers to the application of sustainability practices to tourism activities in coastal and marine settings, has the potential to significantly improve socioeconomic conditions in coastal communities. However, people’s engagement in these activities remains restricted. The study uses a constructivist research paradigm and employs a qualitative method, conducting semi-structured interviews with community members from three different communities to gain in-depth perspectives from them. The study’s goal is to identify impediments to and potential for community participation in blue tourism, as well as to offer practical solutions for promoting long-term and inclusive participation. Initial key findings highlight critical barriers to participation, emphasising the importance of skills development, policy alignment with local needs, and public-private partnerships as key components of community empowerment. This study offers policymakers and stakeholders recommendations for promoting inclusive blue tourism initiatives. The recommended initiatives emphasise the significance of skills development, infrastructure investment, and sustainable tourism models in ensuring economic empowerment and environmental conservation in urban coastal communities in developing states.

Keywords: blue tourism, community empowerment and participation, sustainable tourism models, inclusive participation

Procedia PDF Downloads 19
716 Distributed Manufacturing (DM)- Smart Units and Collaborative Processes

Authors: Hermann Kuehnle

Abstract:

Developments in ICT are totally reshaping manufacturing, as machines, objects, and equipment on the shop floor become smart and online. Interactions with virtualizations and models of a manufacturing unit will appear exactly like interactions with the unit itself. These virtualizations may be driven by providers offering novel ICT services on demand, which might jeopardize even well-established business models. Context-aware equipment, autonomous orders, scalable machine capacity, and networkable manufacturing units will be the terminology to become familiar with in manufacturing and manufacturing management. Such newly appearing smart abilities, with their impact on network behavior, collaboration procedures, and human resource development, will make distributed manufacturing a preferred model of production. Computing miniaturization and smart devices revolutionize manufacturing setups, as virtualizations and the atomization of resources unwrap novel manufacturing principles. Processes and resources obey novel, specific laws, have a strategic impact on manufacturing, and carry major operational implications. Mechanisms from distributed manufacturing that engage interacting smart manufacturing units and decentralized planning and decision procedures already demonstrate important effects of this shift of focus towards collaboration and interoperability.

Keywords: autonomous unit, networkability, smart manufacturing unit, virtualization

Procedia PDF Downloads 526
715 Holistic and Naturalistic Traditions of British Hygiene and Medicine, Reflected in E. W. Lane's Hygienic Medicine, 1859

Authors: Min Bae

Abstract:

Hygiene had traditionally meant ways of healthy and right living. However, the nineteenth century was the time when a gradual shift in medical and hygienic paradigms took place, from holism to reductionism. Against this medical and social background, E. W. Lane (MD, Edinburgh, 1853) formulated his own medical philosophies in his book Hydropathy: Or Hygienic Medicine (1859). Until the 1880s, when he published his last book on hygienic medicine, he consistently intended to raise the importance of hygienic holism in medicine, while adopting hydropathy as his main therapeutic measure. Lane’s case reflects the mid-nineteenth-century trend in which, since the 1840s, the rational and holistic facets of medicine had significantly transferred to hydropathy, the most naturalistic healing system in the medical market. Hygiene for Lane was no longer the ancient form of the ‘six non-naturals’. He emphasised physiology as the rational grounds for his project of the medicalisation of hygiene. His medical philosophy was profoundly naturalistic and holistic, against the opposite trend in contemporary hygiene and medicine. Conflicting aspects may often be best embodied in persons who stood on the boundaries between inside and outside. Lane’s theories on hygienic medicine did not develop into the new medical system that he believed would reconcile orthodox medicine and hydropathy; hydropathy of his time had also adopted increasingly reductionist approaches since the 1860s. Nevertheless, the naturalistic philosophies and approaches in Lane’s hygienic medicine demonstrate a continuous effort at the theoretical reformulation of hydropathy, during its stagnant and declining period, to fit constantly into the holistic paradigm of medicine and hygiene. Considering the fact that the nature cure concept in hydropathy and its individualistic approach were succeeded by naturopathy at the end of the century, analysis of Lane’s medical thoughts reveals part of a ‘thin red line’ of naturalism in the battleground between reductionism and holism in the nineteenth-century history of medicine and hygiene.

Keywords: E. W. Lane, hygienic medicine, hydropathy, naturopath

Procedia PDF Downloads 334
714 Scalable Cloud-Based LEO Satellite Constellation Simulator

Authors: Karim Sobh, Khaled El-Ayat, Fady Morcos, Amr El-Kadi

Abstract:

Distributed applications deployed on LEO satellites and ground stations require substantial communication between different members in a constellation to overcome the earth coverage barriers imposed by GEOs. Applications running on LEO constellations suffer from the earth line-of-sight blockage effect and need adequate lab testing before launching into space. We propose a scalable cloud-based network simulation framework to simulate problems created by the earth line-of-sight blockage. The framework utilizes cloud IaaS virtual machines to simulate the distributed software of LEO satellites and ground stations. A factorial ANOVA statistical analysis is conducted to measure the simulator’s overhead on overall communication performance. The results showed a very low simulator communication overhead. Consequently, the simulation framework is proposed as a candidate for testing LEO constellations with distributed software in the lab before space launch.

Keywords: LEO, cloud computing, constellation, satellite, network simulation, netfilter

Procedia PDF Downloads 386
713 Application of the Concept of Comonotonicity in Option Pricing

Authors: A. Chateauneuf, M. Mostoufi, D. Vyncke

Abstract:

Monte Carlo (MC) simulation is a technique that provides approximate solutions to a broad range of mathematical problems. A drawback of the method is its high computational cost, especially in a high-dimensional setting, such as estimating the Tail Value-at-Risk for large portfolios or pricing basket options and Asian options. For these types of problems, one can construct an upper bound in the convex order by replacing the copula by the comonotonic copula. This comonotonic upper bound can be computed very quickly, but it gives only a rough approximation. In this paper we introduce the Comonotonic Monte Carlo (CoMC) simulation, by using the comonotonic approximation as a control variate. The CoMC is of broad applicability and numerical results show a remarkable speed improvement. We illustrate the method for estimating Tail Value-at-Risk and pricing basket options and Asian options when the logreturns follow a Black-Scholes model or a variance gamma model.
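
The control-variate idea behind CoMC can be sketched generically: subtract from the crude estimator a correlated quantity whose expectation is known, which in CoMC is the comonotonic upper bound evaluated in closed form. In the toy usage below, the payoff and the control are simple placeholders rather than the comonotonic formulas:

```python
import numpy as np

def control_variate_estimate(samples, payoff, control, control_mean):
    """Generic control-variate Monte Carlo estimator:
    mean(X - b * (Y - E[Y])) with the near-optimal b = Cov(X, Y) / Var(Y)."""
    x = np.array([payoff(s) for s in samples])
    y = np.array([control(s) for s in samples])
    cov = np.cov(x, y)
    b = cov[0, 1] / cov[1, 1]
    return np.mean(x - b * (y - control_mean))

# Toy illustration: X is a basket-style payoff on 4 lognormal components,
# Y is their average, whose expectation is known analytically.
rng = np.random.default_rng(0)
draws = rng.lognormal(mean=0.0, sigma=0.2, size=(10_000, 4))
payoff = lambda s: max(s.mean() - 1.0, 0.0)   # placeholder payoff, not an Asian/basket pricer
control = lambda s: s.mean()                  # stand-in for the comonotonic upper bound
control_mean = np.exp(0.2 ** 2 / 2)           # E[lognormal(0, 0.2)]
print(control_variate_estimate(draws, payoff, control, control_mean))
```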

Keywords: control variate Monte Carlo, comonotonicity, option pricing, scientific computing

Procedia PDF Downloads 515
712 Knowledge Based Liability for ISPs’ Copyright and Trademark Infringement in the EU E-Commerce Directive: Two Steps Behind the Philosophy of Computing Mind

Authors: Mohammad Sadeghi

Abstract:

The subject matter of this article is the efficiency of the current knowledge standard in affording legal integration regarding the criteria and approaches to ISP knowledge standards, in order to shield ISPs and the rights of copyright, trademark, and other parties in the online information society. The EU recognizes knowledge-based liability for intermediaries in the European Directive on Electronic Commerce, but the implication of all parties’ responsibility for combating infringement has been sacrificed to the dominant attention on liability, due to the lack of an appropriate legal mechanism for assigning each party its responsibility. Moreover, there is a legal challenge concerning the applicability of knowledge-based liability to hosting services and information location tool services. The aim of this contribution is to discuss the advantages and disadvantages of the ECD knowledge standard through case law, with special emphasis on the duty of prevention and the role of constructive knowledge for internet service providers (ISPs), in order to achieve a fair balance between all parties’ rights.

Keywords: internet service providers, liability, copyright infringement, hosting, caching, mere conduit service, notice and takedown, E-commerce Directive

Procedia PDF Downloads 524
711 The Feminine Speech and the Ritual of Death in Albania

Authors: Aida Lamaj

Abstract:

Death is an inevitable phenomenon in our lives, and so are the rituals of death, accompanied by the dirge and by keening. Keening is a phenomenon common to all peoples; the instances in which the ritual of death and keening coincide, as a special phenomenon in its own right, are numerous, given that keening is the outcome of an extremely particular emotional state. However, even during the ritual of death, every people tries to display its qualities through words, a multitude of characteristics preserved and transmitted with fanaticism from one generation to the other. The ritual of death constitutes an important element of our tradition and, at the same time, material that is always interesting to study in minute detail. In this study, we have tried to limit ourselves to feminine speech, since keening in Albania has generally been carried out by women. Differences and similarities in keening on the national scale, from the diachronic and synchronic points of view, can be seen clearly if we compare Albanian creations in different regions. The similarities and differences within Albanian culture serve as a typical paradigm for studying how the ancient elements of the outlook that Albanians have had on death, history, and social organization in these regions have been preserved and transmitted and, above all, how these feelings have been clothed linguistically in the typologies of keening and of the whole ritual of death, which clearly show archaic forms as well as new developments. These data have been gathered not only by conducting various surveys but also by closely observing the linguistic behavior of women in Albania during the ritual of death. The study encompasses popular lyric poetry as well as new entries, whereas from the geographic point of view we focus mainly on the southern regions, although examples from other regions where Albanian-speaking people live are also present. The main results of the study show that women use dialect forms, peripheral language elements, and descriptive elements much more than men do in their speech during the ritual of death.

Keywords: feminine speech in Albania, linguistic characteristics of the dirge, ritual of death, the typologies of keening

Procedia PDF Downloads 162
710 An Integrated Web-Based Workflow System for Design of Computational Pipelines in the Cloud

Authors: Shuen-Tai Wang, Yu-Ching Lin

Abstract:

With more and more workflow systems adopting the cloud as their execution environment, various challenges need to be addressed in order for it to be utilized efficiently. This paper introduces a method for resource provisioning based on our previous research on dynamic allocation and its pipeline processes. We present an abstraction for workload scheduling in which independent tasks are scheduled among the various available processors of a distributed computing environment for optimization. We also propose an integrated web-based workflow designer that takes advantage of HTML5 technology and chains together multiple tools. To allow combinations of multiple pipelines to execute on the cloud in parallel, we developed a script translator and an execution engine for workflow management in the cloud. All information is known in advance by the workflow engine, and tasks are allocated according to the prior knowledge in the repository. This proposed effort has the potential to provide support for process definition, workflow enactment, and the monitoring of workflow processes. Users benefit from a web-based system that allows the creation and execution of pipelines without scripting knowledge.
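
A minimal sketch of the scheduling behaviour described above, in which tasks whose dependencies are satisfied run in parallel; this illustrates the idea only and is not the authors’ engine, and the example task graph is made up:

```python
from concurrent.futures import FIRST_COMPLETED, ThreadPoolExecutor, wait

def run_pipeline(tasks, deps):
    """Run an acyclic pipeline, executing independent tasks in parallel.

    tasks: mapping name -> callable; deps: mapping name -> set of prerequisite names.
    """
    done, futures = set(), {}
    with ThreadPoolExecutor() as pool:
        while len(done) < len(tasks):
            for name in tasks:
                if name not in done and name not in futures and deps[name] <= done:
                    futures[name] = pool.submit(tasks[name])
            finished, _ = wait(list(futures.values()), return_when=FIRST_COMPLETED)
            for name, fut in list(futures.items()):
                if fut in finished:
                    fut.result()          # propagate any task error
                    done.add(name)
                    del futures[name]
    return done

# Made-up example pipeline: fetch -> (build, lint) -> deploy
tasks = {n: (lambda n=n: print("running", n)) for n in ["fetch", "build", "lint", "deploy"]}
deps = {"fetch": set(), "build": {"fetch"}, "lint": {"fetch"}, "deploy": {"build", "lint"}}
run_pipeline(tasks, deps)
```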

Keywords: workflow systems, resources provisioning, workload scheduling, web-based, workflow engine

Procedia PDF Downloads 160
709 Play Based Practices in Early Childhood Curriculum: The Contribution of High Scope, Modern School Movement and Pedagogy of Participation

Authors: Dalila Lino

Abstract:

The power of play for learning and development in early childhood education is beyond question. The main goal of this study is to analyse how three contemporary early childhood pedagogical approaches, High Scope, the Modern School Movement (MEM), and the Pedagogy of Participation, integrate play in their curriculum development. From this main goal, the following objectives emerged: (i) to characterize how play is integrated in the daily routine of the pedagogical approaches under study; (ii) to analyse the teachers’ role during children’s play; (iii) to identify the types of play in which children are most often involved. The methodology used is a qualitative approach situated within the interpretative paradigm. Data are collected through semi-structured interviews with 30 preschool teachers and through observations of typical daily routines. The participants are 30 Portuguese preschool classrooms attending to children from 3 to 6 years and working with the High Scope curriculum (10 classrooms), the MEM (10 classrooms), and the Pedagogy of Participation (10 classrooms). The qualitative method of content analysis was used to analyse the data. To ensure confidentiality, no information is disclosed without participants’ consent, and the interviews were transcribed and sent to the participants for a final revision. The results show that there are differences in how play is integrated and promoted in the three pedagogical approaches. The teachers’ role when children are at play varies according to the pedagogical approach adopted, and also according to the teachers’ understanding of the meaning of play. The study highlights the key role that early childhood curriculum models have in promoting opportunities for children to play, and therefore to be involved in meaningful learning.

Keywords: curriculum models, early childhood education, pedagogy, play

Procedia PDF Downloads 207
708 Scalable CI/CD and Scalable Automation: Assisting in Optimizing Productivity and Fostering Delivery Expansion

Authors: Solanki Ravirajsinh, Kudo Kuniaki, Sharma Ankit, Devi Sherine, Kuboshima Misaki, Tachi Shuntaro

Abstract:

In software development life cycles, the absence of scalable CI/CD significantly impacts organizations, leading to increased overall maintenance costs, prolonged release delivery times, heightened manual efforts, and difficulties in meeting tight deadlines. Implementing CI/CD with standard serverless technologies using cloud services overcomes all the above-mentioned issues and helps organizations improve efficiency and deliver faster without the need to manage server maintenance and capacity. By integrating scalable CI/CD with scalable automation testing, productivity, quality, and agility are enhanced while reducing the need for repetitive work and manual effort. Implementing scalable CI/CD for development using cloud services like ECS (Container Management Service), AWS Fargate, ECR (to store Docker images with all dependencies), Serverless Computing (serverless virtual machines), Cloud Log (for monitoring errors and logs), Security Groups (for inside/outside access to the application), Docker Containerization (Docker-based images and container techniques), Jenkins (CI/CD build management tool), and code management tools (GitHub, Bitbucket, AWS CodeCommit) can efficiently handle the demands of diverse development environments and accommodate dynamic workloads, increasing efficiency for faster delivery with good quality. CI/CD pipelines encourage collaboration among development, operations, and quality assurance teams by providing a centralized platform for automated testing, deployment, and monitoring. Scalable CI/CD streamlines the development process by automatically fetching the latest code from the repository every time the process starts, building the application based on the branches, testing the application using a scalable automation testing framework, and deploying the builds. Developers can focus more on writing code and less on managing infrastructure, as it scales based on need. Serverless CI/CD eliminates the need to manage and maintain traditional CI/CD infrastructure, such as servers and build agents, reducing operational overhead and allowing teams to allocate resources more efficiently. Scalable CI/CD adjusts the application’s scale according to usage, thereby alleviating concerns about scalability, maintenance costs, and resource needs. Creating scalable automation testing using cloud services (ECR, ECS Fargate, Docker, EFS, Serverless Computing) helps organizations run more than 500 test cases in parallel, aiding in the detection of race conditions and performance issues and reducing execution time. Scalable CI/CD offers flexibility, dynamically adjusting to varying workloads and demands and allowing teams to scale resources up or down as needed. It optimizes costs by paying only for the resources that are used, and it increases reliability. Scalable CI/CD pipelines employ automated testing and validation processes to detect and prevent errors early in the development cycle.
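
The parallel test execution described above can be illustrated with a small dispatcher that fans test cases out across worker processes; this is a generic sketch rather than the framework described here, and in the cloud setup each shard would be launched as a container task instead of a local process:

```python
from concurrent.futures import ProcessPoolExecutor
import subprocess

def run_test(test_id):
    """Run one test case in its own process; in a cloud setup each shard
    would instead be submitted as a container task (not shown here)."""
    result = subprocess.run(["pytest", test_id, "-q"], capture_output=True, text=True)
    return test_id, result.returncode

def run_suite(test_ids, workers=8):
    """Fan the suite out across workers and collect pass/fail results."""
    with ProcessPoolExecutor(max_workers=workers) as pool:
        results = dict(pool.map(run_test, test_ids))
    return [t for t, code in results.items() if code != 0]   # failed tests

# Example: shard a large suite (placeholder test identifiers).
# failed = run_suite([f"tests/test_case_{i}.py" for i in range(500)], workers=32)
```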

Keywords: achieve parallel execution, cloud services, scalable automation testing, scalable continuous integration and deployment

Procedia PDF Downloads 43
707 Evaluation Means in English and Russian Academic Discourse: Through Comparative Analysis towards Translation

Authors: Albina Vodyanitskaya

Abstract:

Given the culture- and language-specific nature of evaluation, this phenomenon is widely studied around the linguistic world and may be regarded as a challenge for translators. Evaluation penetrates all levels of a scientific text, influences its composition, and shapes the reader’s attitude towards the information presented. One of the most challenging and rarely studied phenomena is the individual style of the scientific writer, which is mostly reflected in the use of evaluative language means. The evaluative and expressive potential of a scientific text is becoming a more and more welcoming area for researchers, which stems from the shift towards the anthropocentric paradigm in linguistics. Other reasons include the cognitive and psycholinguistic processes that accompany knowledge acquisition, the genre-determined nature of a scientific text, and the increasing public concern about the quality of scientific papers. One more important issue is the fact that linguists all over the world still argue about the definition of evaluation and its functions in the text. The author analyzes various approaches to the study of evaluation and scientific texts. A comparative analysis of English and Russian dissertations and other scientific papers with regard to evaluative language means reveals major differences and similarities between English and Russian scientific style. Though standardized and genre-specific, English scientific texts contain more figurative and expressive evaluative means than Russian ones, which should be taken into account when translating scientific papers. The processes that evaluation undergoes when expressed by means of a target language are also analyzed. The author offers a target-language-dependent strategy for the translation of evaluation in English and Russian scientific texts. The findings may contribute to the theory and practice of translation and can increase scientific writers’ awareness of inter-language and intercultural differences in evaluative language means.

Keywords: academic discourse, evaluation, scientific text, scientific writing, translation

Procedia PDF Downloads 354