Search results for: elliptic curve digital signature algorithm

5227 Developing Digital Competencies in Aboriginal Students through University-College Partnerships

Authors: W. S. Barber, S. L. King

Abstract:

This paper reports on a pilot project to develop a collaborative partnership between a community college in rural northern Ontario, Canada, and an urban university in Oshawa, in the greater Toronto area, Canada. The partner institutions will collaborate to address the learning needs of university applicants whose goal is to attain an undergraduate BA in Educational Studies and Digital Technology, but who may not live in a geographical location that facilitates this pathway. The UOIT BA degree is attained through a 2+2 program, in which students with a two-year college diploma or equivalent can attain a four-year undergraduate degree. The reported goals of the project are: 1. to expand the BA program to include an additional stream covering serious educational games, simulations and virtual environments; 2. to develop fully online learning modules (using both synchronous and asynchronous technologies) for use by university applicants who are not located close to a physical university site; 3. to assess the digital competencies of all students, including members of local, distance and Indigenous communities, using a validated tool developed and tested by UOIT across numerous populations. This tool, the General Technical Competency Use and Scale (GTCU), will provide the collaborating institutions with data for analyzing how well students are prepared to succeed in fully online learning communities. Philosophically, the UOIT BA program is based on a fully online learning community (FOLC) model that can be accessed from anywhere in the world through digital learning environments via audio-video conferencing tools such as Adobe Connect. It also follows models of adult learning and mobile learning, making a university degree accessible to the growing demographic of adult learners who may use mobile devices to learn anywhere, anytime. The program is based on key principles of problem-based learning, allowing students to build their own understandings through co-design of the learning environment in collaboration with the instructors and their peers. In this way, the degree allows students to personalize and individualize their learning based on their own culture, background and professional and personal experiences. Using modified flipped-classroom strategies, students can work through video modules on their own time in preparation for one-hour discussions held in video conferencing sessions. As a consequence of the program's flexibility, students may continue to work full or part time. All of the partner institutions will co-develop four new modules, administer the GTCU and share data, while creating a new stream of the UOIT BA degree. This will increase accessibility for students bridging from community colleges to university through a fully digital environment. We aim to work collaboratively with Indigenous elders, community members and distance education instructors to increase opportunities for more students to attain a university education.

Keywords: aboriginal, college, competencies, digital, universities

Procedia PDF Downloads 215
5226 An Enhanced Hybrid Backoff Technique for Minimizing the Occurrence of Collision in Mobile Ad Hoc Networks

Authors: N. Sabiyath Fatima, R. K. Shanmugasundaram

Abstract:

In Mobile Ad-hoc Networks (MANETs), every node acts as both transmitter and receiver. Existing backoff models do not accurately forecast the performance of the wireless network, and they suffer from elevated packet collisions. Every time a collision happens, the station's contention window (CW) is doubled until it reaches its maximum value. The main objective of this paper is to reduce collisions by means of a contention window Multiplicative Increase Decrease Backoff (CWMIDB) scheme. The intention of increasing CW is to reduce the collision probability by spreading transmissions over a longer period of time. Within wireless ad hoc networks, the CWMIDB algorithm dynamically controls the contention window of the nodes experiencing collisions. During packet communication, the backoff counter is uniformly selected from the range [0, CW-1], where CW denotes the contention window and its value depends on the number of unsuccessful transmissions of the packet. On the initial transmission attempt, CW is set to its minimum value (CWmin); if the transmission attempt fails, the value is doubled, and it is reset to the minimum on a successful transmission. CWMIDB is simulated in the NS2 environment and its performance is compared with the Binary Exponential Backoff algorithm. The simulation results show an improvement in transmission probability compared to the existing backoff algorithm.
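
The contention-window update rule described above can be made concrete in a few lines. The sketch below is illustrative only: CW_MIN and CW_MAX are assumed values, as the abstract does not state the limits used in the NS2 simulations.

```python
import random

CW_MIN, CW_MAX = 32, 1024  # assumed limits; the paper does not state them

def next_backoff(cw):
    """Draw the backoff counter uniformly from [0, CW-1]."""
    return random.randint(0, cw - 1)

def update_cw(cw, success):
    """Doubled on a failed transmission (capped at CW_MAX); reset to CW_MIN on success."""
    return CW_MIN if success else min(2 * cw, CW_MAX)

# A node suffering two collisions, then a successful transmission.
cw = CW_MIN
for outcome in (False, False, True):
    slot = next_backoff(cw)
    cw = update_cw(cw, outcome)
    print(f"waited {slot} slots, new CW = {cw}")
```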

Keywords: backoff, contention window, CWMIDB, MANET

Procedia PDF Downloads 277
5225 Insider Theft Detection in Organizations Using Keylogger and Machine Learning

Authors: Shamatha Shetty, Sakshi Dhabadi, Prerana M., Indushree B.

Abstract:

About 66% of firms report that insider attacks are likely to occur, and the frequency of insider incidents has increased by 47% in the last two years. The goal of this work is to prevent dangerous employee behavior by using keyloggers and a Machine Learning (ML) model. A keylogging program, also known as keystroke logging, records every keystroke the user enters; keyloggers are used to deter improper use of the system. This enables us to collect all textual data, save it in a CSV file, and analyze it using an ML algorithm and the VirusTotal API. Many large companies use keylogging to methodically monitor how their employees use computers, the internet, and email. We use the Support Vector Machine (SVM) algorithm together with the VirusTotal API to identify specific patterns and words efficiently and accurately, and to automate report generation for improved monitoring.
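
As an illustration of the SVM step, the sketch below classifies captured keystroke text with a TF-IDF plus linear SVM pipeline. The training phrases, labels, and pipeline settings are synthetic assumptions, not the authors' data or configuration.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.pipeline import make_pipeline
from sklearn.svm import LinearSVC

texts = ["quarterly report draft", "copy customer database to usb",
         "meeting notes for monday", "upload payroll file to personal email"]
labels = [0, 1, 0, 1]  # 1 = suspicious, 0 = benign (synthetic labels)

model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LinearSVC())
model.fit(texts, labels)
print(model.predict(["send customer database to personal email"]))  # likely [1]
```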

Keywords: cyber security, machine learning, cyclic process, email notification

Procedia PDF Downloads 57
5224 An Automatic Method for Building Learners’ Groups in Virtual Environment

Authors: O. Bourkoukou, Essaid El Bachari

Abstract:

Group composition is one of the key issues in collaborative learning for achieving a positive educational experience. The goal of this work is to offer teachers and tutors a method for creating effective collaborative learning groups in an e-learning environment based on the learner profile. For this purpose, a new function was defined to implicitly rate the learning objects used by each learner during the learning experience. This paper describes the proposed algorithm for building an adequate collaborative learning group. In order to verify the performance of the proposed algorithm, several experiments were conducted on a real data set in a virtual environment. The results show the effectiveness of the method and suggest that the proposed approach is promising for producing better outcomes.
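
The paper's own rating function is not given here, but a hedged sketch of profile-based group formation conveys the idea: learners, described by numeric profile vectors (for example, implicit ratings of learning objects), are ordered by similarity and dealt out round-robin so that each group stays heterogeneous. Everything below is an illustrative assumption.

```python
import numpy as np

def build_groups(profiles, group_size):
    profiles = np.asarray(profiles, dtype=float)
    # Rank learners along their first principal component ...
    centered = profiles - profiles.mean(axis=0)
    order = np.argsort(centered @ np.linalg.svd(centered, full_matrices=False)[2][0])
    # ... then deal them out round-robin so each group is heterogeneous.
    n_groups = int(np.ceil(len(profiles) / group_size))
    return [list(order[i::n_groups]) for i in range(n_groups)]

profiles = [[0.9, 0.1], [0.8, 0.2], [0.1, 0.9], [0.2, 0.8], [0.5, 0.5], [0.4, 0.6]]
print(build_groups(profiles, group_size=2))  # three mixed pairs
```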

Keywords: building groups, collaborative learning, e-learning, learning objects

Procedia PDF Downloads 297
5223 Mathematical Model to Quantify the Phenomenon of Democracy

Authors: Mechlouch Ridha Fethi

Abstract:

This paper presents a recent mathematical model in political science concerning democracy. The model is represented by a logarithmic equation linking the Relative Index of Democracy (RID) to the Participation Ratio (PR). First, the meanings of the model's parameters are presented, and the variation curve of RID as a function of PR, with its different critical areas, is discussed. Second, the model is applied to a virtual group, where we show that it can be applied by gender. Third, it is observed that the model can be extended to different models of democracy, and that it may be of some use for assessing the state of democracy in international organizations such as the UN.
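
The abstract states that the model is a logarithmic equation linking RID to PR but does not give the equation itself. Purely as a hypothetical illustration of such a form (the constants a and b are assumed fitting parameters, not values from the paper):

```latex
% Hypothetical form only: the abstract says "logarithmic" but gives no equation.
% a, b are fitted constants; PR is the participation ratio, 0 < PR <= 1.
\mathrm{RID} = a\,\ln(\mathrm{PR}) + b
```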

Keywords: democracy, mathematics, modeling, quantification

Procedia PDF Downloads 369
5222 [Keynote] Implementation of Quality Control Procedures in Radiotherapy CT Simulator

Authors: B. Petrović, L. Rutonjski, M. Baucal, M. Teodorović, O. Čudić, B. Basarić

Abstract:

Purpose/Objective: Radiotherapy treatment planning requires the use of a CT simulator in order to acquire CT images. The overall performance of the CT simulator determines the quality of the radiotherapy treatment plan and, ultimately, the outcome of treatment for every single patient. Therefore, international recommendations strongly advise setting up quality control procedures for every machine involved in the radiotherapy treatment planning process, including the CT scanner/simulator. The overall process requires a number of tests, performed on a daily, weekly, monthly or yearly basis, depending on the feature tested. Materials/Methods: Two phantoms were used: a dedicated CIRS 062QA phantom, and the QA phantom supplied with the CT simulator. The examined CT simulator was a Siemens Somatom Definition AS Open, dedicated to radiation therapy treatment planning. The CT simulator has built-in software that enables fast and simple evaluation of CT QA parameters using the phantom provided with the simulator. On the other hand, the recommendations contain additional tests, which were done with the CIRS phantom. Also, legislation on ionizing radiation protection requires CT testing at defined intervals. Taking into account the requirements of the law, the built-in tests of the CT simulator, and international recommendations, the institutional QC programme for the CT simulator was defined and implemented. Results: The CT simulator parameters evaluated in the study were the following: CT number accuracy, field uniformity, the complete CT-to-ED conversion curve, spatial and contrast resolution, image noise, slice thickness, and patient table stability. The following limits were established and implemented: CT number accuracy within +/- 5 HU of the value at commissioning; field uniformity within +/- 10 HU in selected ROIs; the complete CT-to-ED curve for each tube voltage must comply with the curve obtained at commissioning, with deviations of no more than 5%; spatial and contrast resolution tests must comply with the tests obtained at commissioning, otherwise the machine requires service; the image noise test result must fall within 20% of the base value; slice thickness must meet manufacturer specifications; and patient table stability with longitudinal transfer of the loaded table must not show more than 2 mm of vertical deviation. Conclusion: The implemented QA tests gave an overall basic understanding of the CT simulator's functionality and its clinical effectiveness in radiation treatment planning. The legal requirement on the clinic is to set up its own QA programme with minimum testing, but it remains the user's decision whether additional testing, as recommended by international organizations, will be implemented so as to improve the overall quality of the radiation treatment planning procedure, since the CT image quality used for radiation treatment planning influences the delineation of the tumor, the calculation accuracy of the treatment planning system, and finally the delivery of radiation treatment to the patient.
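
The tolerance limits quoted in the Results can be encoded directly, which is how such a QC programme is typically automated. The sketch below uses the limits stated above; the naming and structure are illustrative assumptions.

```python
# Baseline values are the commissioning measurements; deviations are measured
# against them. Limits follow the abstract; the code structure is illustrative.
TOLERANCES = {
    "ct_number_hu":      5.0,   # +/- 5 HU of commissioning value
    "uniformity_hu":    10.0,   # +/- 10 HU in selected ROIs
    "ct_to_ed_pct":      5.0,   # <= 5% deviation from commissioning curve
    "noise_pct":        20.0,   # <= 20% difference from base value
    "table_vertical_mm": 2.0,   # <= 2 mm vertical deviation under load
}

def qc_check(measured_deviation):
    """Return the tests whose measured deviation exceeds its tolerance."""
    return [test for test, dev in measured_deviation.items()
            if abs(dev) > TOLERANCES[test]]

print(qc_check({"ct_number_hu": 3.2, "noise_pct": 24.0}))  # -> ['noise_pct']
```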

Keywords: CT simulator, radiotherapy, quality control, QA programme

Procedia PDF Downloads 534
5221 Artificial Bee Colony Optimization for SNR Maximization through Relay Selection in Underlay Cognitive Radio Networks

Authors: Babar Sultan, Kiran Sultan, Waseem Khan, Ijaz Mansoor Qureshi

Abstract:

In this paper, a novel idea for the performance enhancement of the secondary network is proposed for Underlay Cognitive Radio Networks (CRNs). In underlay CRNs, primary users (PUs) impose strict interference constraints on the secondary users (SUs). The proposed scheme is based on Artificial Bee Colony (ABC) optimization for relay selection and power allocation, to handle this primary challenge of underlay CRNs. ABC is a simple, population-based optimization algorithm that attains a globally optimal solution by combining local search methods (employed and onlooker bees) with global search methods (scout bees). The proposed two-phase relay selection and power allocation algorithm aims to maximize the signal-to-noise ratio (SNR) at the destination while operating in underlay mode. The proposed algorithm has low computational complexity, and its performance is verified through simulation results for different numbers of potential relays, different interference threshold levels, and different transmit power thresholds for the selected relays.
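
To make the ABC loop concrete, the sketch below shows a compact variant that merges the employed and onlooker phases into one perturbation step and keeps the scout phase, maximizing a toy one-dimensional objective. In the paper the objective is the destination SNR under PU interference constraints; the quadratic objective, bounds, and parameters here are placeholders.

```python
import random

def abc_maximize(f, lo, hi, n_sources=10, limit=20, iters=200):
    """Artificial Bee Colony over one variable (employed/onlooker merged; scouts kept)."""
    xs = [random.uniform(lo, hi) for _ in range(n_sources)]
    trials = [0] * n_sources
    best = max(xs, key=f)
    for _ in range(iters):
        for i in range(n_sources):                 # perturb toward a random peer
            j = random.randrange(n_sources)
            cand = min(max(xs[i] + random.uniform(-1, 1) * (xs[i] - xs[j]), lo), hi)
            if f(cand) > f(xs[i]):
                xs[i], trials[i] = cand, 0
            else:
                trials[i] += 1
        for i in range(n_sources):                 # scout phase: abandon stale sources
            if trials[i] > limit:
                xs[i], trials[i] = random.uniform(lo, hi), 0
        best = max(best, max(xs, key=f), key=f)
    return best

# Toy stand-in for "SNR as a function of relay transmit power":
print(round(abc_maximize(lambda p: -(p - 3.0) ** 2, lo=0.0, hi=10.0), 3))  # ~3.0
```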

Keywords: artificial bee colony, underlay spectrum sharing, cognitive radio networks, amplify-and-forward

Procedia PDF Downloads 581
5220 The Impact of Digitalization and Sustainability on Professionals’ Performance in the Built Environment in Nigeria

Authors: Taiwo, Richard Oluseyi, Morakinyo, Kolawole O., Oyeniran, Demilade O.

Abstract:

This study examines the effects of digitalization and sustainability on professionals' performance within the built environment. By exploring the interplay between these two transformative forces, the study seeks to unravel the complexities and opportunities presented by digital technologies in fostering sustainable practices across various professional disciplines. Through an extensive analysis of the literature and expert interviews, this research explores how digitalization can enhance professionals' abilities to incorporate sustainability principles, optimize resource utilization, and promote resilient and inclusive built environments. Furthermore, it examines the challenges and barriers professionals face in adapting to and harnessing the potential of digital tools and processes. The findings will contribute to a better understanding of the beneficial interactions between digitalization and sustainable development and provide valuable insights for policymakers, practitioners, and educators in fostering an ecosystem that supports professionals' capacity building, collaboration, and innovation toward achieving sustainable goals in the built environment.

Keywords: digitisation, sustainability, professional performance, built environment

Procedia PDF Downloads 32
5219 Analysis of Knowledge Circulation in Digital Learning Environments: A Case Study of the MOOC 'Communication des Organisations'

Authors: Hasna Mekkaoui Alaoui, Mariem Mekkaoui Alaoui

Abstract:

In a context marked by a growing and pressing demand for online training within Moroccan universities, massive open online courses (MOOCs) are undergoing constant evolution, amplified by the widespread use of digital technology and accentuated by the coronavirus pandemic. However, despite their growing popularity and expansion, these courses still lack tools enabling teachers and researchers to carry out a fine-grained analysis of the learning processes taking place within them. Moreover, the circulation and sharing of knowledge within these environments is becoming increasingly important. The crucial aspect of traceability emerges here, as MOOCs record and generate traces ranging from the most minute to the most visible. This leads us to consider traceability as a valuable approach in the field of educational research, where the trace is treated as a research tool in its own right. In this exploratory research project, we look at aspects of community knowledge sharing based on traces observed in the 'Communication des organisations' MOOC. Focusing in particular on the mediating trace and its impact in identifying knowledge circulation processes in this learning space, we use the traces of video capsules as an index of knowledge circulation in the MOOC platform. Our study uses a methodological approach based on thematic analysis, and although the results show that learners reproduce knowledge from the different video vignettes in almost identical ways, they do not limit themselves to the knowledge provided to them. This research offers concrete perspectives for improving the dynamics of online platforms, with a potentially positive impact on the quality of online university teaching.

Keywords: circulation, index, digital environments, mediation, trace

Procedia PDF Downloads 64
5218 Digital Immunity System for Healthcare Data Security

Authors: Nihar Bheda

Abstract:

Protecting digital assets such as networks, systems, and data from advanced cyber threats is the aim of Digital Immunity Systems (DIS), which are a subset of cybersecurity. With features like continuous monitoring, coordinated reactions, and long-term adaptation, DIS seeks to mimic biological immunity. This minimizes downtime by automatically identifying and eliminating threats. Traditional security measures, such as firewalls and antivirus software, are insufficient for enterprises, such as healthcare providers, given the rapid evolution of cyber threats. The number of medical record breaches that have occurred in recent years is proof that attackers are finding healthcare data to be an increasingly valuable target. However, obstacles to enhancing security include outdated systems, financial limitations, and a lack of knowledge. DIS is an advancement in cyber defenses designed specifically for healthcare settings. Protection akin to an "immune system" is produced by core capabilities such as anomaly detection, access controls, and policy enforcement. Coordination of responses across IT infrastructure to contain attacks is made possible by automation and orchestration. Massive amounts of data are analyzed by AI and machine learning to find new threats. After an incident, self-healing enables services to resume quickly. The implementation of DIS is consistent with the healthcare industry's urgent requirement for resilient data security in light of evolving risks and strict guidelines. With resilient systems, it can help organizations lower business risk, minimize the effects of breaches, and preserve patient care continuity. DIS will be essential for protecting a variety of environments, including cloud computing and the Internet of medical devices, as healthcare providers quickly adopt new technologies. DIS lowers traditional security overhead for IT departments and offers automated protection, even though it requires an initial investment. In the near future, DIS may prove to be essential for small clinics, blood banks, imaging centers, large hospitals, and other healthcare organizations. Cyber resilience can become attainable for the whole healthcare ecosystem with customized DIS implementations.
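
One DIS building block named above, anomaly detection over access events, can be sketched with a standard isolation forest. The features and data below are synthetic assumptions; a real deployment would ingest EHR access logs, network flows, and similar telemetry.

```python
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)
# columns: records accessed per hour, distinct patients touched, hour of day
normal = rng.normal([20, 5, 14], [5, 2, 3], size=(500, 3))
detector = IsolationForest(contamination=0.01, random_state=0).fit(normal)

event = [[480, 200, 3]]  # bulk access at 3 a.m., far outside normal behavior
print(detector.predict(event))  # -> [-1], i.e. flagged as anomalous
```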

Keywords: digital immunity system, cybersecurity, healthcare data, emerging technology

Procedia PDF Downloads 67
5217 Relevance Feedback within CBIR Systems

Authors: Mawloud Mosbah, Bachir Boucheham

Abstract:

We present the results of a comparative study of techniques available in the literature for the relevance feedback mechanism in the case of short-term learning. Only one of the methods considered here belongs to the data mining field, the K-Nearest Neighbours (KNN) algorithm, while the rest are related purely to the information retrieval field and fall under three major axes: query shifting, feature weighting, and optimization of the parameters of the similarity metric. As a contribution, and in addition to the comparative purpose, we propose a new version of the KNN algorithm, referred to as incremental KNN, which is distinct from the original version in that, besides the influence of the seeds, the rating of the current target image is also influenced by the images already rated. The results presented here were obtained from experiments conducted on the Wang database for one iteration, using colour moments on the RGB space. This compact descriptor, colour moments, is adequate for the efficiency needed in interactive systems. The results obtained allow us to claim that the proposed algorithm performs well; it even outperforms a wide range of techniques available in the literature.
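
A hedged sketch of the incremental-KNN idea: a candidate image's relevance score depends on the seed images and also on images already rated during the session. The distance measure, weights, and k below are illustrative assumptions, not the paper's settings.

```python
import numpy as np

def score(candidate, rated, seed_weight=1.0, session_weight=0.5, k=3):
    """rated: list of (feature_vector, label, is_seed); label is +1/-1."""
    contrib = []
    for vec, label, is_seed in rated:
        d = np.linalg.norm(np.asarray(candidate) - np.asarray(vec))
        w = seed_weight if is_seed else session_weight
        contrib.append((d, w * label))
    contrib.sort(key=lambda t: t[0])              # keep the k nearest rated images
    return sum(v / (d + 1e-9) for d, v in contrib[:k])

rated = [([0.2, 0.4], +1, True),     # seed rated relevant
         ([0.8, 0.9], -1, True),     # seed rated irrelevant
         ([0.25, 0.35], +1, False)]  # rated during the current session
print(score([0.3, 0.4], rated) > 0)  # True: candidate sits near positive examples
```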

Keywords: CBIR, category search, relevance feedback, query point movement, standard Rocchio’s formula, adaptive shifting query, feature weighting, original KNN, incremental KNN

Procedia PDF Downloads 280
5216 Normalizing Scientometric Indicators of Individual Publications Using Local Cluster Detection Methods on Citation Networks

Authors: Levente Varga, Dávid Deritei, Mária Ercsey-Ravasz, Răzvan Florian, Zsolt I. Lázár, István Papp, Ferenc Járai-Szabó

Abstract:

One of the major shortcomings of widely used scientometric indicators is that different disciplines cannot be compared with each other. The issue of cross-disciplinary normalization has long been discussed, but even the classification of publications into scientific domains poses problems. Structural properties of citation networks offer new possibilities; however, the large size and constant growth of these networks call for caution. Here we present a new tool that performs cross-field normalization of scientometric indicators of individual publications by relying on the structural properties of citation networks. Due to the large size of the networks, a systematic procedure for identifying scientific domains based on a local community detection algorithm is proposed. The algorithm is tested on different benchmark and real-world networks. Then, using this algorithm, the mechanism of the scientometric indicator normalization process is shown for a few indicators, such as the citation number, the P-index, and a local version of the PageRank indicator. The fat-tailed trend of the article indicator distribution enables us to successfully perform the indicator normalization process.
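
A minimal sketch of local community detection by greedy expansion, the kind of procedure the tool relies on: grow a community around a seed publication by repeatedly adding the neighbour that most improves the ratio of internal to external edges. The quality function and toy graph are assumptions for illustration; real citation networks require far more scalable variants.

```python
def local_community(adj, seed):
    comm = {seed}
    while True:
        frontier = {n for v in comm for n in adj[v]} - comm
        def quality(c):
            internal = sum(1 for v in c for n in adj[v] if n in c) / 2
            external = sum(1 for v in c for n in adj[v] if n not in c)
            return internal / (internal + external + 1e-9)
        base = quality(comm)
        gains = [(quality(comm | {n}) - base, n) for n in frontier]
        if not gains or max(gains)[0] <= 0:
            return comm                       # no neighbour improves the community
        comm.add(max(gains)[1])

# Two triangles joined by a single edge; the seed's triangle is recovered.
adj = {1: {2, 3}, 2: {1, 3}, 3: {1, 2, 4}, 4: {3, 5, 6}, 5: {4, 6}, 6: {4, 5}}
print(local_community(adj, seed=1))  # -> {1, 2, 3}
```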

Keywords: citation networks, cross-field normalization, local cluster detection, scientometric indicators

Procedia PDF Downloads 204
5215 Novel GPU Approach in Predicting the Directional Trend of the S&P500

Authors: A. J. Regan, F. J. Lidgey, M. Betteridge, P. Georgiou, C. Toumazou, K. Hayatleh, J. R. Dibble

Abstract:

Our goal is the development of an algorithm capable of predicting the directional trend of the Standard and Poor's 500 index (S&P 500). Extensive research has been published attempting to predict different financial markets using historical data, testing on an in-sample and trend basis, with many authors employing excessively complex mathematical techniques. In reviewing and evaluating these in-sample methodologies, it became evident that this approach was unable to achieve sufficiently reliable prediction performance for commercial exploitation. For these reasons, we moved to an out-of-sample strategy based on linear regression analysis of an extensive set of financial data correlated with historical closing prices of the S&P 500. We are pleased to report a directional trend accuracy of greater than 55% for the next day (t+1) in predicting the S&P 500.
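
In the spirit of the out-of-sample strategy described above, the sketch below fits a linear regression on lagged returns and measures directional accuracy on a strict hold-out split. The synthetic random-walk series stands in for the S&P 500 and the correlated financial inputs used in the paper.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(1)
price = np.cumsum(rng.normal(0.05, 1.0, 1500)) + 100.0  # synthetic index level
rets = np.diff(price)

lags = 5
X = np.column_stack([rets[i:len(rets) - lags + i] for i in range(lags)])
y = rets[lags:]

split = int(0.8 * len(y))                       # strict train/test separation
model = LinearRegression().fit(X[:split], y[:split])
pred = model.predict(X[split:])
hit_rate = np.mean(np.sign(pred) == np.sign(y[split:]))
print(f"out-of-sample directional accuracy: {hit_rate:.2%}")
```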

Keywords: financial algorithm, GPU, S&P 500, stock market prediction

Procedia PDF Downloads 350
5214 Riemannian Geometries of Visual Space

Authors: Jacek Turski

Abstract:

The visual space geometries are constructed in the Riemannian geometry framework from simulated iso-disparity conics in the horizontal visual plane of the binocular system with asymmetric eyes (AEs). For the eyes fixating at the abathic distance, which depends on the AE's parameters, the iso-disparity conics are frontal straight lines in physical space. For all other fixations, the iso-disparity conics consist of families of ellipses or hyperbolas, depending on both the AE's parameters and the bifoveal fixation. However, the arcs of the iso-disparity conics are perceived in the gaze direction as frontal lines and are referred to as visual geodesics. Thus, the geometries of physical and visual spaces are different. A simple postulate that combines simulated iso-disparity conics with basic anatomy of the human visual system gives the relative depth for fixation at the abathic distance, which establishes the Riemannian metric tensor. The resulting geodesics are incomplete in the gaze direction and therefore give finite distances to the horizon that depend on the AE's parameters. Moreover, the curvature vanishes in this eye posture, such that visual space is flat. For all other fixations, only the sign of the curvature can be inferred from the global behavior of the simulated iso-disparity conics: the curvature is positive for the elliptic iso-disparity curves and negative for the hyperbolic iso-disparity curves.

Keywords: asymmetric eye model, iso-disparity conics, metric tensor, geodesics, curvature

Procedia PDF Downloads 145
5213 AI-based Digital Healthcare Application to Assess and Reduce Fall Risks in Residents of Nursing Homes in Germany

Authors: Knol Hester, Müller Swantje, Danchenko Natalya

Abstract:

Objective: Falls in older people cause a loss of autonomy and result in an economic burden. LCare is an AI-based application for managing fall risks. The aim of the study was to assess the effect of LCare use on patient outcomes in nursing homes in Germany. Methods: LCare identifies and monitors fall risks through a 3D gait analysis and a digital questionnaire, resulting in tailored recommendations on fall prevention. A study was conducted with AOK Baden-Württemberg (01.09.2019-31.05.2021) in 16 care facilities. Assessments at baseline and follow-up included: a fall risk score; falls (baseline: fall history in the past 12 months; follow-up: falls recorded since the last analysis); fall-related injuries and hospitalizations; gait speed; fear of falling; psychological stress; and nurses' experience with the app. Results: 94 seniors were aged 65-99 years at the initial analysis (average 84±7 years); 566 mobility analyses were carried out in total. On average, fall risk was reduced by 17.8% compared to baseline (p<0.05). The risk of falling decreased across all subgroups, including a trend in dementia patients (p=0.06), constituting 43% of analyzed patients, and a significant decrease in patients with walking aids (p<0.05), constituting 76% of analyzed patients. There was a trend (p<0.1) towards fewer falls and fewer fall-related injuries and hospitalizations (baseline: 23 seniors who fell, 13 injury consequences, 9 hospitalizations; follow-up: 14 seniors who fell, 2 injury consequences, 0 hospitalizations). There was a 16% improvement in gait speed (p<0.05). Residents reported less fear of falling and less psychological stress, with a 38% reduction in both outcomes (p<0.05). 81% of nurses found LCare effective. Conclusions: In the presented study, use of the LCare app was associated with a reduction in fall risk among nursing home residents, improvement in health-related outcomes, and a trend toward fewer injuries and hospitalizations. LCare may help to improve senior resident care and save healthcare costs.

Keywords: falls, digital healthcare, falls prevention, nursing homes, seniors, AI, digital assessment

Procedia PDF Downloads 131
5212 Phasor Measurement Unit Based on Particle Filtering

Authors: Rithvik Reddy Adapa, Xin Wang

Abstract:

Phasor Measurement Units (PMUs) are sophisticated measuring devices that estimate the amplitude, phase and frequency of various voltages and currents in a power system. The particle filter is a state estimation technique based on Bayesian inference. Particle filters are widely used in pose estimation and indoor navigation and are very reliable. This paper studies and compares four different particle filters as PMUs, namely the generic particle filter (GPF), the genetic algorithm particle filter (GAPF), the particle swarm optimization particle filter (PSOPF) and the adaptive particle filter (APF). Two different test signals are used to test the performance of the filters in terms of responsiveness and correctness of the estimates.
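
A generic (bootstrap) particle filter in the PMU setting can be sketched as follows: estimate the amplitude and phase of a noisy signal of known frequency. Particle counts, noise levels, and priors are illustrative assumptions; the GA, PSO, and adaptive variants compared in the paper differ in how particles are moved and resampled.

```python
import numpy as np

rng = np.random.default_rng(0)
f0, fs, n_obs = 50.0, 2000.0, 200
t = np.arange(n_obs) / fs
true_A, true_phi = 1.5, 0.7
y = true_A * np.sin(2 * np.pi * f0 * t + true_phi) + rng.normal(0, 0.2, n_obs)

n_p = 2000
A = rng.uniform(0.1, 3.0, n_p)            # particles over (amplitude, phase)
phi = rng.uniform(-np.pi, np.pi, n_p)
for k in range(n_obs):
    A += rng.normal(0, 0.01, n_p)         # small random walk as process noise
    phi += rng.normal(0, 0.01, n_p)
    pred = A * np.sin(2 * np.pi * f0 * t[k] + phi)
    w = np.exp(-0.5 * ((y[k] - pred) / 0.2) ** 2)   # Gaussian likelihood
    w /= w.sum()
    idx = rng.choice(n_p, n_p, p=w)       # multinomial resampling
    A, phi = A[idx], phi[idx]

# Posterior means should land near the true values.
print(f"A ~ {A.mean():.2f} (true 1.5), phi ~ {phi.mean():.2f} (true 0.7)")
```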

Keywords: phasor measurement unit, particle filter, genetic algorithm, particle swarm optimisation, state estimation

Procedia PDF Downloads 9
5211 A Study on the Effect of Different Climate Conditions on Time of Balance of Bleeding and Evaporation in Plastic Shrinkage Cracking of Concrete Pavements

Authors: Hasan Ziari, Hassan Fazaeli, Seyed Javad Vaziri Kang Olyaei, Asma Sadat Dabiri

Abstract:

Cracks in concrete pavements provide a path for the ingress of corrosive substances, acids, oils, and water into the pavement and reduce its long-term durability and level of service. One cause of early cracking in concrete pavements is plastic shrinkage. This shrinkage occurs due to the formation of negative capillary pressures after the bleeding and evaporation rates at the pavement surface reach equilibrium. The cracks form if the tensile stresses caused by the restrained shrinkage exceed the tensile strength of the concrete. Different climate conditions change the rate of evaporation and thus the balance time of bleeding and evaporation, which changes the severity of cracking in the concrete. The present study examined the relationship between the balance time of bleeding and evaporation and the cracking area in concrete slabs using the standard method ASTM C1579 under 27 different environmental conditions, using continuous video recording and digital image analysis. The results showed that as the evaporation rate increased and the balance time decreased, the crack severity increased significantly, such that reducing the balance time from its maximum to its minimum value increased the cracking area more than fourfold. It was also observed that the cracking area-balance time curve can be interpreted in three sections. An examination of these three parts showed that the combination of climate conditions has a significant effect on increasing or decreasing these two variables, and that a single critical factor alone cannot produce the critical conditions for plastic cracking. By combining two mild environmental factors with one severe climate factor (in terms of surface evaporation rate), a considerable reduction in balance time and a sharp increase in cracking severity can be prevented. The results of this study show that balance time can be an essential factor in controlling and predicting plastic shrinkage cracking in concrete pavements, and it is necessary to control this factor when constructing concrete pavements in different climate conditions.

Keywords: bleeding and cracking severity, concrete pavements, climate conditions, plastic shrinkage

Procedia PDF Downloads 146
5210 Transmit Power Optimization for Cooperative Beamforming in Reverse-Link MIMO Ad-Hoc Networks

Authors: Younghyun Jeon, Seungjoo Maeng

Abstract:

In ad-hoc networks, strong interest in MIMO schemes has led to their combination with, and use in, such networks. We frame the problem in the setting of the Reverse-link MIMO Ad-hoc Network (RMAN) and propose a methodology to maximize the data rate for a given power consumption using a node-cooperative beamforming technique. Based on the result of a mathematical optimization formulation, we design an algorithm that constructs the optimal orthogonal weight vector according to channel feedback and controls its transmission power according to a QoS-pricing value level. In simulation results, we show the validity of the proposed mathematical optimization result and algorithm, demonstrating that the sum-rate of each link converges.
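
The paper's weight construction is not reproduced here, but the classical matched (MRT) solution illustrates the core computation: with channel feedback h, the unit-norm weight vector maximizing receive SNR is w = h*/||h||. The sketch below verifies this against a random choice of weights; channel values and the noise figure are synthetic.

```python
import numpy as np

rng = np.random.default_rng(3)
h = (rng.normal(size=4) + 1j * rng.normal(size=4)) / np.sqrt(2)  # feedback, 4 nodes

P, noise = 1.0, 0.1
w = np.conj(h) / np.linalg.norm(h)               # matched (MRT) weights, unit norm
snr_opt = P * abs(np.sum(w * h)) ** 2 / noise    # equals P * ||h||^2 / noise

w_rand = rng.normal(size=4) + 1j * rng.normal(size=4)
w_rand /= np.linalg.norm(w_rand)                 # any other unit-norm weights
print(snr_opt >= P * abs(np.sum(w_rand * h)) ** 2 / noise)  # True (Cauchy-Schwarz)
```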

Keywords: ad-hoc network, MIMO, cooperative beamforming, transmit power

Procedia PDF Downloads 398
5209 The Morphing Avatar of Startup Sales - Destination Virtual Reality

Authors: Sruthi Kannan

Abstract:

The ongoing COVID-19 pandemic has accelerated digital transformation like never before. The physical barriers brought about by the pandemic are being bridged by digital alternatives. While basic collaborative activities like voice and video calling and screen sharing have been replicated in these alternatives, there are several others that require a more intimate setup. Pitching, showcasing, and providing demonstrations are an integral part of selling strategies for startups. Traditionally these have been in-person engagements, enabling a depth of understanding of the startups' offerings. In the new normal of virtual-only connections, startups are feeling the brunt of the lack of in-person contact with potential customers and investors. This poster demonstrates how a virtual reality platform has been conceptualized and custom-built for startups to engage with their stakeholders and redefine their selling strategies. The platform is intended to provide an immersive experience for startup showcases and offers the nearest possible alternative to physical meetings for the startup ecosystem, thereby opening new frontiers for entrepreneurial collaboration.

Keywords: collaboration, sales, startups, strategy, virtual reality

Procedia PDF Downloads 306
5208 Evaluation of Digital Marketing Strategies by Behavioral Economics

Authors: Sajjad Esmaeili Aghdam

Abstract:

Economics typically conceptualizes individual behavior as the consequence of external states, for example, budgets and prices (or beliefs about them) and choices. Our main goal is to examine the influence of a range of behavioral economics factors on digital marketing strategies, to evaluate those strategies, and to reshape them into highly promising marketing strategies. The different forms of behavioral prospects lead to the following two main results. First, the stability of economic dynamics in a currency union depends critically on the level of economic integration: more economic integration leads to more stable economic dynamics. Electronic word-of-mouth (eWOM) is 'all informal communications directed at consumers through Internet-based technology, related to the usage or characteristics of particular goods and services or their sellers.' eWOM can take many forms, the most significant one being online reviews. For this paper, 72 articles were gathered from research search engines such as Google Scholar, Web of Science, and PubMed, screened by title and aim. Recent research in strategic management and marketing proposes that markets should not be viewed as a given, deterministic setting exogenous to the firm; instead, firms are increasingly conceived as dynamic creators of market opportunities. The use of new technologies touches all spheres of the modern lifestyle, and social and economic life is hard to imagine without fast, relevant, high-quality and suitable information. Psychology and economics (together known as behavioral economics) are two prominent disciplines underlying many theories in marketing. The broad marketing literature documents consumers' non-rational behavior, even though behavioral biases may not always be consistently named or formally labeled.

Keywords: behavioral economics, digital marketing, marketing strategy, high impact strategies

Procedia PDF Downloads 183
5207 Secret Sharing in Visual Cryptography Using NVSS and Data Hiding Techniques

Authors: Misha Alexander, S. B. Waykar

Abstract:

Visual cryptography is a special unbreakable encryption technique that transforms the secret image into random noisy shares. These shares are transmitted over the network, and because of their noisy texture they attract the attention of hackers. To address this issue, a Natural Visual Secret Sharing (NVSS) scheme was introduced that uses natural shares, either in digital or printed form, to generate the noisy secret share. This scheme greatly reduces the transmission risk, but causes distortion in the retrieved secret image through variations in the settings and properties of the digital devices used to capture the natural image during the encryption/decryption phase. This paper proposes a new NVSS scheme that extracts the secret key from multiple randomly selected, unaltered natural images. To further improve the security of the shares, data hiding techniques such as steganography and alpha channel watermarking are proposed.
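
A much-simplified two-share sketch conveys why a share should look like pure noise: one share is random and the other is the secret XORed with it, a one-time pad over pixel bits. The NVSS scheme above goes further by deriving the noise-like share from an unaltered natural image; the code below is only the XOR illustration.

```python
import numpy as np

rng = np.random.default_rng(7)
secret = rng.integers(0, 2, size=(4, 4), dtype=np.uint8)  # toy binary image

share1 = rng.integers(0, 2, size=secret.shape, dtype=np.uint8)  # pure noise
share2 = secret ^ share1          # alone, also indistinguishable from noise

recovered = share1 ^ share2       # combining both shares restores the secret
assert (recovered == secret).all()
print(recovered)
```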

Keywords: decryption, encryption, natural visual secret sharing, natural images, noisy share, pixel swapping

Procedia PDF Downloads 405
5206 A Use Case-Oriented Performance Measurement Framework for AI and Big Data Solutions in the Banking Sector

Authors: Yassine Bouzouita, Oumaima Belghith, Cyrine Zitoun, Charles Bonneau

Abstract:

A performance measurement framework (PMF) is an essential tool in any organization for assessing the performance of its processes. It guides businesses to stay on track with their objectives and benchmark themselves against the market. With the growing trend toward the digital transformation of business processes, led by innovations in artificial intelligence (AI) and Big Data applications, developing a mature system capable of capturing the impact of digital solutions across different industries became a necessity. Based on the conducted research, no such system has been developed in academia or industry. In this context, this paper covers a variety of methodologies on performance measurement, overviews the major AI and Big Data applications in the banking sector, and covers an exhaustive list of relevant metrics. Consequently, this paper is of interest to both researchers and practitioners. From an academic perspective, it offers a comparative analysis of the reviewed performance measurement frameworks. From an industry perspective, it offers exhaustive research, drawn from market leaders, into the major applications of AI and Big Data technologies across the different departments of an organization. Moreover, it suggests a standardized classification model with a well-defined structure of intelligent digital solutions. This classification is mapped to a centralized library that contains an indexed collection of potential metrics for each application, arranged in a manner that facilitates the rapid search and retrieval of relevant metrics. The proposed framework is meant to guide professionals in identifying the most appropriate AI and Big Data applications to adopt, and to help them meet their business objectives by understanding the potential impact of such solutions on the entire organization.
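
The indexed metrics library can be pictured as a mapping from a (department, use case) classification to candidate metrics, searchable on either component. The entries below are invented examples, not the framework's actual catalogue.

```python
METRIC_LIBRARY = {
    ("risk", "credit-scoring-ml"): ["AUC", "Gini coefficient", "approval turnaround"],
    ("operations", "document-ocr"): ["field-level accuracy", "straight-through rate"],
    ("marketing", "churn-prediction"): ["recall@top-decile", "campaign uplift"],
}

def metrics_for(department=None, use_case=None):
    """Rapid search: filter the library on either index component."""
    return {k: v for k, v in METRIC_LIBRARY.items()
            if department in (None, k[0]) and use_case in (None, k[1])}

print(metrics_for(department="risk"))
```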

Keywords: AI and Big Data applications, impact assessment, metrics, performance measurement

Procedia PDF Downloads 198
5205 Distributed Optical Fiber Vibration Sensing Using Phase Generated Carrier Demodulation Algorithm

Authors: Zhihua Yu, Qi Zhang, Mingyu Zhang, Haolong Dai

Abstract:

Distributed fiber-optic vibration sensors are gaining extensive attention for their advantages of high sensitivity, accurate location, light weight, large-scale monitoring, and good concealment. In this paper, a novel distributed optical fiber vibration sensing system is proposed, based on self-interference of Rayleigh backscattering with a phase generated carrier (PGC) demodulation algorithm. Pulsed light is sent into the sensing fiber, and the Rayleigh backscattering light from a given position along the fiber interferes through an unbalanced Michelson interferometer (MI) to generate the interference light. An improved PGC demodulation algorithm is used to recover the phase information of the interference signal, which carries the sensing information. Three vibration events were applied simultaneously at different positions over 2000 m of sensing fiber and were demodulated correctly. Experiments show that the spatial resolution is 10 m, the noise level of the Φ-OTDR system is about 10^-3 rad/√Hz, and the signal-to-noise ratio (SNR) is about 30.34 dB. This vibration measurement scheme can be applied at the surface, on the seabed, or downhole for vibration measurements or distributed acoustic sensing (DAS).
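
A minimal sketch of PGC-arctangent demodulation, one common form of the algorithm named above: mix the interference signal with the carrier and its second harmonic, low-pass both products, and take the arctangent of their Bessel-weighted ratio. The carrier frequency, modulation depth C, and filter settings are illustrative assumptions, not the system's actual parameters.

```python
import numpy as np
from scipy.signal import butter, filtfilt
from scipy.special import jv

fs, f0, C = 100_000.0, 12_500.0, 2.63     # sample rate, carrier, modulation depth
t = np.arange(20_000) / fs
phi = 0.8 * np.sin(2 * np.pi * 200 * t)            # the "vibration" phase signal
s = np.cos(C * np.cos(2 * np.pi * f0 * t) + phi)   # synthetic interference signal

b, a = butter(4, 2_000 / (fs / 2))                 # low-pass well below the carrier
I = filtfilt(b, a, s * np.cos(2 * np.pi * f0 * t))      # ~ -J1(C) * sin(phi)
Q = filtfilt(b, a, s * np.cos(2 * np.pi * 2 * f0 * t))  # ~ -J2(C) * cos(phi)

phi_hat = np.unwrap(np.arctan2(I / jv(1, C), Q / jv(2, C)))
phi_hat -= phi_hat.mean()                          # remove the constant offset
print(np.corrcoef(phi_hat[2000:-2000], phi[2000:-2000])[0, 1])  # close to 1
```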

Keywords: fiber optic sensors, Michelson interferometry, MI, phase-sensitive optical time domain reflectometry, Φ-OTDR, phase generated carrier, PGC

Procedia PDF Downloads 190
5204 Factors Influencing Consumer Adoption of Digital Banking Apps in the UK

Authors: Sevelina Ndlovu

Abstract:

Financial technology (fintech) advancement is recognised as one of the most transformational innovations in the financial industry. Fintech has given rise to internet-only digital banking, a novel financial technology advancement and innovation that allows banking services through internet applications with no need for physical branches. This technology is becoming a new banking normal among consumers for its ubiquitous, real-time access advantages. There is evident switching and migration from traditional banking towards these fintech facilities, which could pose a systemic risk if not properly understood and monitored. Fintech advancement has also brought about the emergence and escalation of financial technology consumption themes such as trust, security, perceived risk, and sustainability within the banking industry, themes scarcely covered in the existing theoretical literature. To that end, the objective of this research is to investigate the factors that determine fintech adoption and to propose an integrated adoption model. This study aims to establish the significant drivers of adoption and to develop a conceptual model that integrates technological, behavioural, and environmental constructs by extending the Unified Theory of Acceptance and Use of Technology 2 (UTAUT2). It proposes integrating constructs that influence financial consumption themes such as trust, perceived risk, security, financial incentives, micro-investing opportunities, and environmental consciousness to determine the impact of these factors on the adoption of, and intention to use, digital banking apps. The main advantage of this conceptual model is the consolidation of a greater number of predictor variables, which can provide a fuller explanation of consumers' adoption of digital banking apps. Moderating variables of age, gender, and income are incorporated. To the best of the author's knowledge, this is the first study to extend the UTAUT2 model with this combination of constructs to investigate users' intention to adopt internet-only digital banking apps in the UK context. By investigating factors that are not included in the existing theories but are highly pertinent to the adoption of internet-only banking services, this research adds to existing knowledge and extends the generalisability of UTAUT2 in a financial services adoption context. This fills a gap in knowledge, as the 2016 review of UTAUT2 (the theory having originally been proposed in 2003) highlighted the need for further research. To achieve the objectives of this study, the research takes a quantitative approach to empirically test the hypotheses derived from the existing literature and pilot studies, providing statistical support for generalising the research findings to further possible applications in theory and practice. The research is explanatory, or causal, in nature and uses cross-sectional primary data collected through a survey. Convenience and purposive sampling with structured, self-administered online questionnaires is used for data collection. The proposed model is tested using Structural Equation Modelling (SEM), and the analysis of primary data collected through an online survey is processed using SmartPLS software with a sample size of 386 digital bank users. The results are expected to establish whether there are significant relationships between the dependent and independent variables and to identify the most influential factors.

Keywords: banking applications, digital banking, financial technology, technology adoption, UTAUT2

Procedia PDF Downloads 72
5203 Reliability Analysis of Geometric Performance of Onboard Satellite Sensors: A Study on Location Accuracy

Authors: Ch. Sridevi, A. Chalapathi Rao, P. Srinivasulu

Abstract:

The location accuracy of data products is a critical parameter in assessing the geometric performance of satellite sensors. This study focuses on reliability analysis of onboard sensors to evaluate their location accuracy over time. The analysis utilizes field failure data and employs the Weibull distribution to determine the reliability and, in turn, to understand improvements or degradations over a period of time. The analysis begins by scrutinizing the location accuracy error, which is the root mean square (RMS) error of the differences between ground control point coordinates observed on the product and on the map, and identifying the failure data with reference to time. A significant challenge in this study is to thoroughly analyze the possibility of an infant mortality phase in the data. To address this, the Weibull distribution is utilized to determine whether the data exhibit an infant stage or have transitioned into the operational phase; the shape parameter beta plays a crucial role in identifying this stage. Additionally, determining the exact start of the operational phase and the end of the infant stage poses another challenge, as it is crucial to eliminate residual infant mortality or wear-out from the model, since either can significantly increase the total failure rate. To address this, the well-established statistical Laplace test is applied to infer the behavior of the sensors and to accurately ascertain the duration of the different phases of the lifetime and the time required for stabilization. This approach also helps in understanding whether the bathtub curve model, which accounts for the different phases in the lifetime of a product, is appropriate for the data, and whether the thresholds for the infant period and the wear-out phase are accurately estimated, by validating the data in the individual phases with Weibull distribution curve fitting. Once the operational phase is determined, reliability is assessed using Weibull analysis. This analysis not only provides insights into the location-accuracy reliability of individual sensors over the required period of time, but also establishes a model that can be applied to automate similar analyses for various sensors and parameters using field failure data. Furthermore, the identification of the best-performing sensor through this analysis serves as a benchmark for future missions and designs, ensuring continuous improvement in sensor performance and reliability. Overall, this study provides a methodology to accurately determine the duration of the different phases in the life data of individual sensors. It enables an assessment of the time required for stabilization and provides insights into the reliability during the operational phase and the onset of the wear-out phase. By employing this methodology, designers can make informed decisions regarding sensor location-accuracy performance, contributing to enhanced accuracy in satellite-based applications.
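
The two statistical steps described above can be sketched briefly: a Laplace trend test on failure times to check for a trend (and hence locate the phases), followed by a Weibull fit whose shape parameter beta separates infant mortality (beta < 1) from wear-out (beta > 1). The failure times below are synthetic; real inputs would be the dates when the location-accuracy RMS error exceeded specification.

```python
import numpy as np
from scipy.stats import weibull_min

fail_times = np.array([30, 45, 120, 300, 480, 700, 910, 1100])  # days (synthetic)
T = 1200.0                                                      # observation window

n = len(fail_times)
laplace_u = (fail_times.sum() / n - T / 2) * np.sqrt(12 * n) / T
print(f"Laplace U = {laplace_u:.2f}")      # |U| < 1.96: no significant trend at 5%

beta, loc, eta = weibull_min.fit(fail_times, floc=0)
phase = "infant mortality" if beta < 1 else "wear-out" if beta > 1 else "constant rate"
print(f"Weibull shape beta = {beta:.2f} ({phase})")
print(f"R(365 days) = {weibull_min.sf(365, beta, loc, eta):.2f}")
```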

Keywords: bathtub curve, geometric performance, Laplace test, location accuracy, reliability analysis, Weibull analysis

Procedia PDF Downloads 65
5202 Pharmacokinetic Modeling of Valsartan in Dogs following a Single Oral Administration

Authors: In-Hwan Baek

Abstract:

Valsartan is a potent and highly selective antagonist of the angiotensin II type 1 receptor and is widely used for the treatment of hypertension. The aim of this study was to investigate the pharmacokinetic properties of valsartan in dogs following oral administration of a single dose, using quantitative modeling approaches. Forty beagle dogs were randomly divided into two groups. Group A (n=20) was administered a single oral dose of valsartan 80 mg (Diovan® 80 mg), and group B (n=20) was administered a single oral dose of valsartan 160 mg (Diovan® 160 mg) in the morning after an overnight fast. Blood samples were collected into heparinized tubes before and at 0.5, 1, 1.5, 2, 2.5, 3, 4, 6, 8, 12 and 24 h following oral administration. The plasma concentrations of valsartan were determined using LC-MS/MS. Non-compartmental pharmacokinetic analyses were performed using WinNonlin Standard Edition software, and modeling approaches were performed using maximum-likelihood estimation via the expectation maximization (MLEM) algorithm with sampling, using ADAPT 5 software. After a single dose of valsartan 80 mg, the mean maximum concentration (Cmax) was 2.68 ± 1.17 μg/mL at 1.83 ± 1.27 h. The area under the plasma concentration-versus-time curve from time zero to the last measurable concentration (AUC24h) was 13.21 ± 6.88 μg·h/mL. After dosing with valsartan 160 mg, the mean Cmax was 4.13 ± 1.49 μg/mL at 1.80 ± 1.53 h, and the AUC24h was 26.02 ± 12.07 μg·h/mL. The Cmax and AUC values increased in proportion to the increment in valsartan dose, while the pharmacokinetic parameters of elimination rate constant, half-life, apparent total clearance, and apparent volume of distribution were not significantly different between the doses. The valsartan pharmacokinetics fit a one-compartment model with first-order absorption and elimination following a single dose of valsartan 80 mg or 160 mg. In addition, high inter-individual variability was identified in the absorption rate constant. In conclusion, valsartan displays dose-dependent pharmacokinetics in dogs, and subsequent quantitative modeling approaches provided detailed pharmacokinetic information on valsartan. The current findings provide useful information in dogs that will aid the future development of improved formulations or fixed-dose combinations.
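
The fitted model named in the conclusion, one compartment with first-order absorption and elimination, has the closed form C(t) = D*ka/(V*(ka-ke)) * (e^(-ke*t) - e^(-ka*t)) after an oral dose. The parameter values in the sketch below are illustrative placeholders chosen to give a Tmax near the reported ~1.8 h; they are not the study's fitted estimates.

```python
import numpy as np

def conc(t, dose_ug, ka, ke, v_ml):
    """Plasma concentration (ug/mL) at time t (h): one compartment, oral dose."""
    return dose_ug * ka / (v_ml * (ka - ke)) * (np.exp(-ke * t) - np.exp(-ka * t))

t = np.array([0.5, 1, 1.5, 2, 2.5, 3, 4, 6, 8, 12, 24.0])  # the sampling times (h)
c = conc(t, dose_ug=80_000, ka=1.0, ke=0.3, v_ml=18_000)   # 80 mg dose, assumed params

tmax = np.log(1.0 / 0.3) / (1.0 - 0.3)    # analytic Tmax = ln(ka/ke) / (ka - ke)
print(f"Tmax = {tmax:.1f} h, Cmax = {c.max():.2f} ug/mL")  # ~1.7 h, ~2.6 ug/mL
```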

Keywords: dose-dependent, modeling, pharmacokinetics, valsartan

Procedia PDF Downloads 298
5201 Design and Development of an Algorithm to Predict Fluctuations of Currency Rates

Authors: Nuwan Kuruwitaarachchi, M. K. M. Peiris, C. N. Madawala, K. M. A. R. Perera, V. U. N Perera

Abstract:

Dealing with foreign markets has always held a special place in a country's economy. Political and social factors come into play, making currency rates fluctuate rapidly. Currency rate prediction has become an important factor for large international businesses, since large amounts of money are exchanged between countries. This research focuses on comparing the accuracy of three main models: Autoregressive Integrated Moving Average (ARIMA), Artificial Neural Networks (ANN) and Support Vector Machines (SVM). A data series of imports, exports, and the USD exchange rate with respect to the LKR was selected for training with the above-mentioned algorithms. After training on the data set and comparing the algorithms, it was observed that SVM predictions performed better than the other models. The results improved further by combining the SVM and SVR models.
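
A minimal sketch of the SVR step, the best performer reported above: predict the next day's USD/LKR rate from the previous few days' values. The synthetic series and hyperparameters are placeholders for the paper's import, export, and exchange-rate data.

```python
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(2)
rate = 150 + np.cumsum(rng.normal(0.02, 0.3, 600))   # synthetic USD/LKR series

lags = 5
X = np.column_stack([rate[i:len(rate) - lags + i] for i in range(lags)])
y = rate[lags:]

split = int(0.8 * len(y))                            # hold out the last 20%
model = SVR(kernel="rbf", C=100.0, epsilon=0.05).fit(X[:split], y[:split])
pred = model.predict(X[split:])
rmse = np.sqrt(np.mean((pred - y[split:]) ** 2))
print(f"out-of-sample RMSE: {rmse:.3f} LKR")
```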

Keywords: ARIMA, ANN, FFNN, RMSE, SVM, SVR

Procedia PDF Downloads 212
5200 Using the Smith-Waterman Algorithm to Extract Features in the Classification of Obesity Status

Authors: Rosa Figueroa, Christopher Flores

Abstract:

Text categorization is the problem of assigning a new document to a set of predetermined categories on the basis of a training set of free-text data containing documents whose category membership is known. To train a classification model, it is necessary to extract characteristics in the form of tokens that facilitate the learning and classification process. In text categorization, the feature extraction process involves the use of word sequences, also known as N-grams. In general, documents belonging to the same category are expected to share similar features. The Smith-Waterman (SW) algorithm is a dynamic programming algorithm that performs a local sequence alignment in order to determine similar regions between two strings or protein sequences. This work explores the use of the SW algorithm as an alternative to N-gram feature extraction in text categorization. The dataset used for this purpose contains 2,610 annotated documents with the classes Obese/Non-Obese. This dataset was represented in matrix form using the Bag of Words approach. The score selected to represent the occurrence of the tokens in each document was the term frequency-inverse document frequency (TF-IDF). In order to extract features for classification, four experiments were conducted: the first experiment used SW to extract features, the second used unigrams (single words), the third used bigrams (two-word sequences), and the last used a combination of unigrams and bigrams. To test the effectiveness of the extracted feature sets, a Support Vector Machine (SVM) classifier was tuned using 20% of the dataset. The remaining 80% of the dataset, together with 5-fold cross-validation, was used to evaluate and compare the performance of the four feature extraction experiments. Results from the tuning process suggest that SW performs better than the N-gram based feature extraction. These results were confirmed using the remaining 80% of the dataset, where SW performed best (accuracy = 97.10%, weighted average F-measure = 97.07%). The second best was obtained by the combination of unigrams and bigrams (accuracy = 96.04%, weighted average F-measure = 95.97%), closely followed by bigrams (accuracy = 94.56%, weighted average F-measure = 94.46%) and finally unigrams (accuracy = 92.96%, weighted average F-measure = 92.90%).
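
The Smith-Waterman recurrence over two token sequences is short enough to show in full. The match, mismatch, and gap scores below are common defaults, assumed rather than taken from the paper, and the two "documents" are toy examples.

```python
def smith_waterman(a, b, match=2, mismatch=-1, gap=-1):
    """Local alignment score between token sequences a and b (dynamic programming)."""
    rows, cols = len(a) + 1, len(b) + 1
    H = [[0] * cols for _ in range(rows)]
    best = 0
    for i in range(1, rows):
        for j in range(1, cols):
            diag = H[i-1][j-1] + (match if a[i-1] == b[j-1] else mismatch)
            H[i][j] = max(0, diag, H[i-1][j] + gap, H[i][j-1] + gap)
            best = max(best, H[i][j])
    return best  # peak score of the best shared local region

doc1 = "patient reports weight gain and high bmi".split()
doc2 = "history of weight gain with elevated bmi".split()
print(smith_waterman(doc1, doc2))  # 4: the shared local region "weight gain"
```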

Keywords: comorbidities, machine learning, obesity, Smith-Waterman algorithm

Procedia PDF Downloads 297
5199 Using the ε Value in Describing Regular Languages by Using Finite Automata, Operations on Languages and the Conversion Algorithm Implementation

Authors: Abdulmajid Mukhtar Afat

Abstract:

This paper introduces nondeterministic finite automata with ε transitions, which are used to perform certain operations on languages. A program was created to implement the algorithm that converts a nondeterministic finite automaton with ε transitions (ε-NFA) to a deterministic finite automaton (DFA). The program is written in the C++ programming language. The program reads an FA 5-tuple from a text file and classifies it as a DFA, an NFA, or an ε-NFA. For a DFA, the program takes a string w and decides whether it is accepted or rejected; the tracking path for an accepted string is saved by the program. In the case of an NFA or ε-NFA, the program converts the automaton to a DFA to enable tracking and to decide whether the string w belongs to the regular language or not.
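
The paper's converter is written in C++; for brevity, the sketch below shows the same two ideas in Python: the ε-closure and the subset construction that turns an ε-NFA into a DFA. The example automaton (for the language a*b) is an assumption for illustration.

```python
EPS = ""  # marker for epsilon transitions

def eclose(states, delta):
    """All states reachable from `states` via epsilon moves alone."""
    stack, seen = list(states), set(states)
    while stack:
        for nxt in delta.get((stack.pop(), EPS), ()):
            if nxt not in seen:
                seen.add(nxt)
                stack.append(nxt)
    return frozenset(seen)

def to_dfa(start, delta, alphabet):
    """Subset construction: each DFA state is an epsilon-closed set of NFA states."""
    q0 = eclose({start}, delta)
    dfa, work = {}, [q0]
    while work:
        S = work.pop()
        if S in dfa:
            continue
        dfa[S] = {}
        for ch in alphabet:
            T = eclose({n for s in S for n in delta.get((s, ch), ())}, delta)
            dfa[S][ch] = T
            work.append(T)
    return q0, dfa

# Example epsilon-NFA for a*b: 0 -eps-> 1, 0 -a-> 0, 1 -b-> 2 (state 2 is final)
delta = {(0, EPS): {1}, (0, "a"): {0}, (1, "b"): {2}}
q0, dfa = to_dfa(0, delta, alphabet="ab")
print(len(dfa), "DFA states; accepting:", [sorted(S) for S in dfa if 2 in S])
```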

Keywords: DFA, NFA, ε-NFA, eclose, finite automata, operations on languages

Procedia PDF Downloads 489
5198 Probability-Based Damage Detection of Structures Using Kriging Surrogates and Enhanced Ideal Gas Molecular Movement Algorithm

Authors: M. R. Ghasemi, R. Ghiasi, H. Varaee

Abstract:

Surrogate models have received increasing attention for use in detecting damage to structures based on vibration modal parameters. However, uncertainties in the measured vibration data may lead to false or unreliable output from such models. In this study, an efficient approach based on Monte Carlo simulation is proposed to take the effect of these uncertainties into account when developing a surrogate model. The probability of damage existence (PDE) is calculated based on the probability density functions of the undamaged and damaged states. The kriging technique allows one to genuinely quantify the surrogate error, and is therefore chosen as the metamodeling technique. An enhanced version of the ideal gas molecular movement (EIGMM) algorithm is used as the main algorithm for model updating. The developed approach is applied to detect simulated damage in numerical models of a 72-bar space truss and a 120-bar dome truss. The simulation results show that the proposed method performs well in probability-based damage detection of structures, with less computational effort than the direct finite element model.
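
The Monte Carlo step can be sketched compactly: propagate measurement noise through a surrogate of a modal indicator and estimate the probability of damage existence (PDE) as the fraction of samples whose identified damage exceeds zero. The linear toy surrogate below stands in for the kriging model, and the noise level is an assumed value.

```python
import numpy as np

rng = np.random.default_rng(5)

def surrogate_damage(freq_drop_pct):
    """Toy surrogate: maps a modal-frequency drop to identified stiffness loss (%)."""
    return 2.1 * freq_drop_pct - 0.4

measured_drop, noise_sd = 1.2, 0.3       # noisy modal measurement (%), assumed noise
samples = surrogate_damage(measured_drop + rng.normal(0, noise_sd, 100_000))
pde = np.mean(samples > 0.0)             # fraction of samples indicating damage
print(f"probability of damage existence: {pde:.3f}")
```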

Keywords: probability-based damage detection (PBDD), Kriging, surrogate modeling, uncertainty quantification, artificial intelligence, enhanced ideal gas molecular movement (EIGMM)

Procedia PDF Downloads 240