Search results for: elliptic curve digital signature algorithm
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 7385


3275 Assessing Effects of an Intervention on Bottle-Weaning and Reducing Daily Milk Intake from Bottles in Toddlers Using Two-Part Random Effects Models

Authors: Yungtai Lo

Abstract:

Two-part random effects models have been used to fit semi-continuous longitudinal data where the response variable has a point mass at 0 and a continuous right-skewed distribution for positive values. We review methods proposed in the literature for analyzing data with excess zeros. A two-part logit-log-normal random effects model, a two-part logit-truncated normal random effects model, a two-part logit-gamma random effects model, and a two-part logit-skew normal random effects model were used to examine effects of a bottle-weaning intervention on reducing bottle use and daily milk intake from bottles in toddlers aged 11 to 13 months in a randomized controlled trial. We show in all four two-part models that the intervention promoted bottle-weaning and reduced daily milk intake from bottles in toddlers drinking from a bottle. We also show that there are no differences in model fit using either the logit link function or the probit link function for modeling the probability of bottle-weaning in all four models. Furthermore, prediction accuracy of the logit or probit link function is not sensitive to the distribution assumption on daily milk intake from bottles in toddlers not off bottles.
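
To make the two-part structure concrete, the minimal sketch below fits one of the variants named above, a logit model for the probability of any bottle use plus a log-normal (OLS on the log scale) model for positive intakes, on simulated data; the data, variable names, and use of statsmodels are illustrative assumptions, and the random effects of the actual longitudinal analysis are omitted for brevity.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 400
treat = rng.integers(0, 2, n)                  # hypothetical intervention indicator

# Part 1: probability of any bottle use (zero vs. positive intake)
p_use = 1 / (1 + np.exp(-(0.8 - 1.0 * treat)))
use = rng.binomial(1, p_use)

# Part 2: log-normal intake for toddlers still drinking from a bottle
log_intake = 5.0 - 0.4 * treat + rng.normal(0, 0.5, n)
intake = np.where(use == 1, np.exp(log_intake), 0.0)

X = sm.add_constant(treat)

# Logit model for the zero part
logit_fit = sm.Logit(use, X).fit(disp=False)

# Log-normal model (OLS on the log scale) for the positive part
pos = intake > 0
ols_fit = sm.OLS(np.log(intake[pos]), X[pos]).fit()

print(logit_fit.params, ols_fit.params)
```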

Keywords: two-part model, semi-continuous variable, truncated normal, gamma regression, skew normal, Pearson residual, receiver operating characteristic curve

Procedia PDF Downloads 350
3274 The Aspect of the Digital Formation in the Solar Community as One Prototype to Find the Algorithmic Sustainable Conditions in the Global Environment

Authors: Kunihisa Kakumoto

Abstract:

Purpose: The global environmental problem is now posed on a global scale. Sprawl beyond natural limits should be forecast in advance in an algorithmic way so that the conditions of our social life can be kept within natural limitations. The sustainable condition for the globe is to be found by keeping the balance between the capacity of nature and the demands of our social lives. The amount of water on the earth is limited; sustainable conditions therefore depend strongly on the capacity of water. The amount of water can be considered in relation to the area of green planting, because a certain volume of water can be obtained from forests where green planting is preserved. We can thus find the sustainable conditions for water in relation to the green planting area. The reduction of CO₂ by green planting is also possible. Possible Measures and Methods: Through many international conferences, the concept of the solar community has been introduced as one prototype in technical papers. Algorithmic trial calculations based on the concept of the solar community can be taken into consideration; the concept is based on data collected from a solar model house. According to the algorithmic results for the prototype, simulation on a global scale can be performed as algorithmic conversion results. This algorithmic study can be simulated through the amount of water, also in relation to the green planting area. Additionally, the emission of CO₂ in the solar community and the reduction of CO₂ by green planting can be calculated. On the basis of these calculations for the solar community, the sustainable conditions for the globe can be simulated as conversion results in an algorithmic way. The digital formation of the solar community is also taken into consideration on this occasion. Conclusion: To find sustainable conditions for the globe, the solar community has been taken into consideration as one prototype. The role of water is very important because the capacity of the water supply is very limited, yet at present the cycle of the social community is not organized around this natural mechanism. The simulative calculation of this study is bounded by the limitation of the total water supply. Through this process, the total capacity of the water supply and the supportable number of residents and areas can be determined by algorithmic calculation. To secure enough water, green planting areas are very important; the planting area is also very important for keeping the balance of CO₂. The simulative calculation can be performed through the relation between the emission and the reduction of CO₂ in the solar community. To find this total balance and the sustainable conditions, the green planting area and the total amount of water can be determined by algorithmic simulative calculation. The study of sustainable conditions can thus be performed by simulative calculations on the algorithmic model of the solar community as one prototype. The example of this prototype can be in balance: the activity of social life must stay within the capacity of the natural mechanism, and the capable capacity of the natural environment in our world is very limited.

Keywords: the solar community, the sustainable condition, the natural limitation, the algorithmic calculation

Procedia PDF Downloads 110
3273 Development of Algorithms for the Study of the Image in Digital Form for Satellite Applications: Extraction of a Road Network and Its Nodes

Authors: Zineb Nougrara

Abstract:

In this paper, we propose a novel methodology for extracting a road network and its nodes from satellite images of Algeria. The technique is a continuation of our previous research work. It is founded on information theory and mathematical morphology, which are combined to extract and link road segments into a road network with its nodes. We therefore define objects as sets of pixels and study the shape of these objects and the relations that exist between them. In this approach, geometric and radiometric features of roads are integrated through a cost function and a set of selected points of a crossing road. The performance of the method was tested on satellite images of Algeria.
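
As a rough illustration of the morphological side of such a pipeline (not the authors' actual algorithm, which also integrates information theory, radiometric features, and a cost function), the sketch below thresholds an image, skeletonizes the candidate road mask, and locates nodes as skeleton pixels with three or more neighbours; the synthetic image and parameter values are assumptions.

```python
import numpy as np
from scipy import ndimage
from skimage.morphology import skeletonize

# Synthetic "satellite" band with two crossing bright roads (stand-in for real data)
img = np.zeros((200, 200))
img[98:102, :] = 1.0
img[:, 118:122] = 1.0
img += 0.1 * np.random.default_rng(1).random(img.shape)

road_mask = img > 0.5                 # simple radiometric threshold
skeleton = skeletonize(road_mask)     # one-pixel-wide road network

# A node is a skeleton pixel with 3 or more skeleton neighbours (junction/crossing)
kernel = np.ones((3, 3)); kernel[1, 1] = 0
neighbour_count = ndimage.convolve(skeleton.astype(int), kernel, mode="constant")
nodes = skeleton & (neighbour_count >= 3)

print("road pixels:", skeleton.sum(), "node pixels:", nodes.sum())
```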

Keywords: satellite image, road network, nodes, image analysis and processing

Procedia PDF Downloads 274
3272 Development of an Implicit Coupled Partitioned Model for the Prediction of the Behavior of a Flexible Slender Shaped Membrane in Interaction with Free Surface Flow under the Influence of a Moving Flotsam

Authors: Mahtab Makaremi Masouleh, Günter Wozniak

Abstract:

This research is part of an interdisciplinary project promoting the design of a light, temporarily installable textile defence system against floods. When river water levels rise abruptly, especially in winter, a textile protective structure can be expected to experience massive additional impact loads from floating debris and even tree trunks. Estimating this impulsive force on such structures is of great importance, as it can ensure the reliability of the design in critical cases. This fact motivates the numerical analysis of a fluid-structure interaction application comprising a flexible, slender-shaped membrane and free-surface water flow, in which an accelerated heavy piece of flotsam approaches the membrane. In this context, the behavior of the flexible membrane and its interaction with the moving flotsam are analyzed with the finite-element-based explicit and implicit Abaqus solvers available as products of SIMULIA software. In parallel, the response of the free-surface water flow to moving structures is investigated using the finite volume solver of Star-CCM+ from Siemens PLM Software. An automatic communication tool (CSE, the SIMULIA Co-Simulation Engine) and the implementation of an effective partitioned strategy in the form of an implicit coupling algorithm make it possible to couple the partitioned domains robustly. The applied procedure ensures stability and convergence in the solution of these complicated problems, albeit at high computational cost; a further complexity of this study stems from the mesh requirements in the fluid domain where the two structures approach each other. This contribution presents the approaches used to establish a convergent numerical solution and compares the results with experimental findings.
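
The implicit partitioned strategy mentioned above can be pictured as a fixed-point iteration per time step with under-relaxation, as in the hedged sketch below; the calls `fluid_solver` and `structure_solver` are hypothetical placeholders standing in for the Star-CCM+ and Abaqus solves that the co-simulation engine would exchange, not the project's actual interface.

```python
def fluid_solver(displacement):
    # Placeholder: interface traction caused by a given wall displacement
    return -2.0 * displacement + 0.1

def structure_solver(traction):
    # Placeholder: interface displacement caused by a given traction
    return 0.3 * traction

def implicit_coupling_step(d0, omega=0.5, tol=1e-10, max_iter=100):
    """One time step of an implicitly coupled partitioned FSI scheme
    with constant under-relaxation of the interface displacement."""
    d = d0
    for k in range(max_iter):
        t = fluid_solver(d)            # fluid solve with the current interface position
        d_new = structure_solver(t)    # structural solve with the resulting load
        residual = d_new - d
        if abs(residual) < tol:        # coupling iteration converged for this step
            return d_new, k
        d = d + omega * residual       # relaxed update stabilizes the iteration
    raise RuntimeError("coupling iterations did not converge")

print(implicit_coupling_step(0.0))
```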

Keywords: co-simulation, flexible thin structure, fluid-structure interaction, implicit coupling algorithm, moving flotsam

Procedia PDF Downloads 389
3271 A Real-time Classification of Lying Bodies for Care Application of Elderly Patients

Authors: E. Vazquez-Santacruz, M. Gamboa-Zuniga

Abstract:

In this paper, we present a methodology for classifying lying bodies using HOG descriptors and pressure sensors arranged in matrix form (14 x 32 sensors) on the surface where the body lies; classification is performed in real time. Our system is embedded in a care robot that can assist elderly patients and the medical staff around them to achieve a better quality of life in and out of hospitals. Due to current technology, a limited number of sensors is used, which results in a low-resolution data array that is treated as a 14 x 32 pixel image. Our work considers the problem of human posture classification with little information (few sensors), applying digital processing to expand the original sensor data and obtain more significant data for classification; however, this is done with low-cost algorithms to ensure real-time execution.
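
A minimal sketch of the HOG-plus-classifier idea on a 14 x 32 pressure map is shown below, with the map first upsampled as a stand-in for the data-expansion step mentioned above; the synthetic pressure maps, posture labels, and parameter choices are assumptions for illustration only, not the authors' pipeline.

```python
import numpy as np
from skimage.transform import resize
from skimage.feature import hog
from sklearn.svm import LinearSVC
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

def synthetic_pressure_map(posture):
    """Crude stand-in for a 14 x 32 pressure-sensor frame for two lying postures."""
    frame = rng.random((14, 32)) * 0.1
    if posture == 0:                       # "supine": broad central band
        frame[5:9, 4:28] += 1.0
    else:                                  # "side": narrower, offset band
        frame[3:6, 4:28] += 1.0
    return frame

X, y = [], []
for label in (0, 1):
    for _ in range(40):
        img = resize(synthetic_pressure_map(label), (56, 128))   # expand low-res data
        X.append(hog(img, orientations=8, pixels_per_cell=(8, 8),
                     cells_per_block=(2, 2)))
        y.append(label)

Xtr, Xte, ytr, yte = train_test_split(np.array(X), np.array(y),
                                      test_size=0.25, random_state=0)
clf = LinearSVC().fit(Xtr, ytr)          # cheap linear classifier for real-time use
print("held-out accuracy:", clf.score(Xte, yte))
```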

Keywords: real-time classification, sensors, robots, health care, elderly patients, artificial intelligence

Procedia PDF Downloads 866
3270 Construction and Validation of Allied Bank-Teller Aptitude Test

Authors: Muhammad Kashif Fida

Abstract:

In a bank, the teller's job (cash officer) is highly important and critical: on one hand it requires prompt, courteous customer service, and on the other, handling cash with integrity. It is always challenging for recruiters to hire competent and trustworthy tellers. To the author's knowledge, no comprehensive test is available that could assist recruitment in Pakistan, so there is a dire need for a psychometric battery to support the recruitment of potential candidates for the teller position. The aim of the present study was therefore to construct the ABL-Teller Aptitude Test (ABL-TApT). Three major phases were designed following the American Psychological Association's guidelines. The first phase was qualitative: indicators of the test were explored by content analysis of a) tellers' job descriptions (n=3), b) interviews with senior tellers (n=6), and c) interviews with HR personnel (n=4). This content analysis yielded three broad constructs: i) personality, ii) integrity/honesty, and iii) professional work aptitude. The identified indicators were operationalized, and statements (k=170) were generated from the verbatims. These were then forwarded to five experts for review of content validity, who finalized 156 items. In the second phase, the ABL-TApT (k=156) was administered to 323 participants through a computer application. The overall reliability of the test shows a significant alpha coefficient (α=.81), and the subscales also show significant alpha coefficients. Confirmatory Factor Analysis (CFA), performed to estimate construct validity, confirms four main factors comprising eight personality traits (confidence, organized, compliance, goal-oriented, persistent, forecasting, patience, caution), one integrity/honesty factor, four factors of professional work aptitude (basic numerical ability and perceptual accuracy of letters, numbers, and signatures), and two factors for customer services (customer services, emotional maturity). Values of GFI, AGFI, NNFI, CFI, RFI, and RMSEA are in the recommended ranges, indicating good model fit. In the third phase, concurrent validity evidence was gathered. The personality and integrity parts of the scale have significant correlations with the 'conscientiousness' factor of the NEO-PI-R, reflecting strong concurrent validity. Customer services and emotional maturity have significant correlations with the Bar-On EQ-i, providing further evidence of strong concurrent validity. It is concluded that the ABL-TApT is a significantly reliable and valid battery of tests that will assist in the objective recruitment of tellers and help recruiters find more suitable human resources.

Keywords: concurrent validity, construct validity, content validity, reliability, teller aptitude test, objective recruitment

Procedia PDF Downloads 226
3269 Implementation of a Web-Based Wireless ECG Measuring and Recording System

Authors: Onder Yakut, Serdar Solak, Emine Dogru Bolat

Abstract:

Measuring the electrocardiogram (ECG) signal is an essential process for the diagnosis of heart diseases, since the ECG signal carries information about how well the heart is performing its functions. In medical diagnosis and treatment systems, decision support systems that process the ECG signal are being developed for use by clinicians during medical examinations. In this study, a modular wireless ECG (WECG) measuring and recording system using a single-board computer and the e-Health sensor platform is developed. In this modular system, the ECG signal is first acquired from the body surface by the electrodes, then filtered and converted to digital form. It is then recorded to the health database using Wi-Fi communication technology. Real-time access to the ECG data is provided over the internet through the developed web interface.

Keywords: ECG, e-health sensor shield, Raspberry Pi, Wi-Fi technology

Procedia PDF Downloads 401
3268 Optimal Placement of the Unified Power Controller to Improve the Power System Restoration

Authors: Mohammad Reza Esmaili

Abstract:

One of the most important parts of the restoration process of a power network is the synchronization of its subsystems. In this situation, the biggest concern of the system operators is the reduction of the standing phase angle (SPA) between the endpoints of the two islands. To this end, the system operators perform various actions and maneuvers so that the synchronization of the subsystems is carried out successfully and the system finally reaches acceptable stability. The most common of these actions include load control, generation control and, in some cases, changing the network topology. Although these maneuvers are simple and common, the weak network and extreme load changes make the restoration slow. One of the best ways to control the SPA is to use FACTS devices. By applying a soft control signal, these tools can reduce the SPA between two subsystems with greater speed and accuracy, and the synchronization process can be completed in less time. The unified power flow controller (UPFC), a series-parallel compensator capable of changing the transmission-line power and properly adjusting the phase angle, is the option proposed to realize the subject of this research. With the optimal placement of the UPFC in a power system, in addition to improving the normal conditions of the system, it is expected to be effective in reducing the SPA during power system restoration. Therefore, this paper provides an optimal structure that coordinates the three problems of improving the division into subsystems, reducing the SPA, and optimal power flow, with the aim of determining the optimal location of the UPFC and the optimal subsystems. The objective functions proposed in this paper include maximizing the quality of the subsystems, reducing the SPA at the endpoints of the subsystems, and reducing the losses of the power system. Since the proposed objective functions may conflict when optimized simultaneously, the optimization problem is formulated as a non-linear multi-objective problem, and the Pareto optimization method is used to solve it. The technique proposed to carry out the optimization is the water cycle algorithm (WCA). To evaluate the proposed method, the IEEE 39-bus power system is used.
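
To make the Pareto optimization step concrete, the short sketch below filters a set of candidate solutions down to the non-dominated (Pareto) set for three minimized objectives; the candidate objective values are randomly generated stand-ins, since the paper's actual objectives come from power-flow and SPA calculations on the IEEE 39-bus system.

```python
import numpy as np

def pareto_front(costs):
    """Return a boolean mask of non-dominated rows.
    costs: (n_candidates, n_objectives) array; all objectives are minimized
    (e.g. negative subsystem quality, SPA at the tie points, total losses)."""
    n = costs.shape[0]
    efficient = np.ones(n, dtype=bool)
    for i in range(n):
        if not efficient[i]:
            continue
        # candidate i is dominated if some row is <= in every objective and < in one
        dominated_by = np.all(costs <= costs[i], axis=1) & np.any(costs < costs[i], axis=1)
        dominated_by[i] = False
        if dominated_by.any():
            efficient[i] = False
    return efficient

candidates = np.random.default_rng(2).random((50, 3))   # hypothetical objective values
mask = pareto_front(candidates)
print("Pareto-optimal candidates:", np.flatnonzero(mask))
```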

Keywords: UPFC, SPA, water cycle algorithm, multi-objective problem, Pareto

Procedia PDF Downloads 66
3267 ANAC-id - Facial Recognition to Detect Fraud

Authors: Giovanna Borges Bottino, Luis Felipe Freitas do Nascimento Alves Teixeira

Abstract:

This article presents a case study of ANAC-id, developed at the National Civil Aviation Agency (ANAC) in Brazil. ANAC-id is an artificial intelligence algorithm developed for image analysis that recognizes standard images of an unobstructed, upright face without sunglasses, making it possible to identify potential inconsistencies. It combines the YOLO architecture with three Python libraries (face recognition, face comparison, and DeepFace), providing robust analysis with a high level of accuracy.
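
A heavily simplified sketch of the face-matching portion is given below using the open-source face_recognition and DeepFace libraries named in the abstract; the file paths are placeholders, the thresholds are the libraries' defaults, and the ANAC-id pipeline itself (YOLO detection, image quality checks, and so on) is not reproduced here.

```python
import face_recognition
from deepface import DeepFace

# Placeholder paths for an ID document photo and a newly submitted selfie
DOC_IMG, SELFIE_IMG = "document_photo.jpg", "submitted_selfie.jpg"

# face_recognition: compare 128-dimensional face encodings
doc_enc = face_recognition.face_encodings(face_recognition.load_image_file(DOC_IMG))[0]
selfie_enc = face_recognition.face_encodings(face_recognition.load_image_file(SELFIE_IMG))[0]
same_person = face_recognition.compare_faces([doc_enc], selfie_enc)[0]

# DeepFace: independent verification as a second opinion
deepface_result = DeepFace.verify(img1_path=DOC_IMG, img2_path=SELFIE_IMG)

print("face_recognition match:", same_person)
print("DeepFace verified:", deepface_result["verified"])
```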

Keywords: artificial intelligence, deepface, face compare, face recognition, YOLO, computer vision

Procedia PDF Downloads 156
3266 Fabrication of Optical Tissue Phantoms Simulating Human Skin and Their Application

Authors: Jihoon Park, Sungkon Yu, Byungjo Jung

Abstract:

Although various optical tissue phantoms (OTPs) simulating human skin have been actively studied, their fidelity remains unclear because skin tissue has intricate optical properties and a complicated structure that complicate optical simulation. In this study, we designed a multilayer OTP mimicking the skin structure and fabricated OTP models simulating a skin-blood vessel combination and skin pigmentation, which are useful in the biomedical optics field. The OTPs were characterized in terms of their optical properties and cross-sectional structure and analyzed using various optical tools, such as a laser speckle imaging system, OCT, and a digital microscope, to demonstrate their practicality. The measured optical properties were within 5% error, and the thickness of each layer was uniform within 10% error at the micrometer scale.

Keywords: blood vessel, optical tissue phantom, optical property, skin tissue, pigmentation

Procedia PDF Downloads 455
3265 The Factors Constitute the Interaction between Teachers and Students: An Empirical Study at the Notion of Framing

Authors: Tien-Hui Chiang

Abstract:

Code theory, proposed by Basil Bernstein, indicates that framing can be viewed as the core element constituting the phenomenon of cultural reproduction because it regulates the transmission of pedagogical information. Strong framing widens the social-relation boundary between a teacher and pupils, which obstructs information transmission, so in order to improve underachieving students' academic performance, teachers need to reduce the strength of framing. Weak framing enables them to transform academic knowledge into commonsense knowledge expressed in daily-life language. This study posits that most teachers deliver strong framing because their beliefs are largely confined to instrumental rationality, which blunts their critical minds. This situation can make them view the normal-distribution bell curve of students' academic performance as a natural outcome. In order to examine the interplay between framing, instrumental rationality, and pedagogical action, questionnaires were completed by over 5,000 primary school teachers in Henan province, China, who formed a stratified sample. The statistical results show that most teachers employed psychological concepts to measure students' academic performance, and, in turn, educational inequity was legitimized as a natural outcome of the efficiency-led approach. Such efficiency-led mindsets made them act as agents of the mechanism of social control, in turn sustaining the phenomenon of cultural reproduction.

Keywords: code, cultural reproduction, framing, instrumental rationality, social relation and interaction

Procedia PDF Downloads 151
3264 Relevant LMA Features for Human Motion Recognition

Authors: Insaf Ajili, Malik Mallem, Jean-Yves Didier

Abstract:

Motion recognition from videos is a very complex task due to the high variability of motions. This paper describes the challenges of human motion recognition, especially the motion representation step with relevant features. Our descriptor vector is inspired by the Laban Movement Analysis method. We select discriminative features using the Random Forest algorithm in order to remove redundant features and make the learning algorithms operate faster and more effectively. We validate our method on the MSRC-12 and UTKinect datasets.
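
A small sketch of the feature-reduction idea described above is given below: a Random Forest ranks descriptor components by importance and the least informative ones are dropped before the final classifier. The synthetic feature matrix is a stand-in for real LMA descriptors from MSRC-12/UTKinect, and the thresholds are illustrative assumptions.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_selection import SelectFromModel
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

# Synthetic stand-in for LMA descriptor vectors (40 features, 10 of them informative)
X, y = make_classification(n_samples=600, n_features=40, n_informative=10,
                           n_redundant=20, n_classes=4, random_state=0)
Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)

# Rank features with a Random Forest and keep only those above median importance
forest = RandomForestClassifier(n_estimators=200, random_state=0).fit(Xtr, ytr)
selector = SelectFromModel(forest, threshold="median", prefit=True)
Xtr_red, Xte_red = selector.transform(Xtr), selector.transform(Xte)

# Downstream classifier trains faster on the reduced descriptor
clf = SVC().fit(Xtr_red, ytr)
print("kept features:", Xtr_red.shape[1], "accuracy:", clf.score(Xte_red, yte))
```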

Keywords: discriminative LMA features, features reduction, human motion recognition, random forest

Procedia PDF Downloads 195
3263 Closing the Gap: Efficient Voxelization with Equidistant Scanlines and Gap Detection

Authors: S. Delgado, C. Cerrada, R. S. Gómez

Abstract:

This research introduces an approach to voxelizing the surfaces of triangular meshes with efficiency and accuracy. Our method leverages parallel equidistant scan-lines and introduces a Gap Detection technique to address the limitations of existing approaches. We present a comprehensive study showcasing the method's effectiveness, scalability, and versatility in different scenarios. Voxelization is a fundamental process in computer graphics and simulations, playing a pivotal role in applications ranging from scientific visualization to virtual reality. Our algorithm focuses on enhancing the voxelization process, especially for complex models and high resolutions. One of the major challenges of voxelization on the Graphics Processing Unit (GPU) is the high cost of discovering the same voxels multiple times; these repeated voxels incur costly memory operations that add no useful information. Our scan-line-based method ensures that each voxel is detected exactly once when processing a triangle, enhancing performance without compromising the quality of the voxelization. The heart of our approach lies in the use of parallel, equidistant scan-lines to traverse the interiors of triangles. This minimizes redundant memory operations and avoids revisiting the same voxels, resulting in a significant performance boost. Moreover, our method's computational efficiency is complemented by its simplicity and portability. Written as a single compute shader in the OpenGL Shading Language (GLSL), it is highly adaptable to various rendering pipelines and hardware configurations. To validate our method, we conducted extensive experiments on a diverse set of models from the Stanford repository. Our results demonstrate not only the algorithm's efficiency but also its ability to produce accurate voxelizations free of 26-connectivity tunnels. The Gap Detection technique successfully identifies and addresses gaps, ensuring consistent and visually pleasing voxelized surfaces. Furthermore, we introduce the Slope Consistency Value metric, quantifying the alignment of each triangle with its primary axis. This metric provides insights into the impact of triangle orientation on scan-line-based voxelization methods. It also aids in understanding how the Gap Detection technique effectively improves results by targeting specific areas where simple scan-line-based methods might fail. Our research contributes to the field of voxelization by offering a robust and efficient approach that overcomes the limitations of existing methods. The Gap Detection technique fills a critical gap in the voxelization process; by addressing these gaps, our algorithm enhances the visual quality and accuracy of voxelized models, making it valuable for a wide range of applications. In conclusion, "Closing the Gap: Efficient Voxelization with Equidistant Scan-lines and Gap Detection" presents an effective solution to the challenges of voxelization. Our research combines computational efficiency, accuracy, and innovative techniques to elevate the quality of voxelized surfaces. With its adaptable nature and valuable innovations, this technique could have a positive influence on computer graphics and visualization.

Keywords: voxelization, GPU acceleration, computer graphics, compute shaders

Procedia PDF Downloads 73
3262 Towards a Strategic Framework for State-Level Epistemological Functions

Authors: Mark Darius Juszczak

Abstract:

While epistemology, as a sub-field of philosophy, is generally concerned with theoretical questions about the nature of knowledge, the explosion in digital media technologies has resulted in an exponential increase in the storage and transmission of human information. That increase has produced a particular non-linear dynamic: digital epistemological functions are radically altering how and what we know. Neither the rate of that change nor its consequences have been well studied or taken into account in developing state-level strategies for epistemological functions. At the current time, US federal policy, like that of virtually all other countries, maintains, at the national level, clearly defined boundaries between various epistemological agencies, that is, agencies that, in one way or another, mediate the functional use of knowledge. These agencies can take the form of patent and trademark offices, national library and archive systems, departments of education, departments such as the FTC, university systems and regulations, military research agencies such as DARPA, federal scientific research agencies, medical and pharmaceutical accreditation agencies, federal funding for scientific research, and legislative committees and subcommittees that attempt to alter the laws governing epistemological functions. All of these agencies are constantly creating, analyzing, and regulating knowledge. Those processes are, at the most general level, epistemological functions: they act upon and define what knowledge is. At the same time, however, there are no high-level strategic epistemological directives or frameworks that define those functions. The only time in US history when a proxy state-level epistemological strategy existed was between 1961 and 1969, when the Kennedy administration committed the United States to the Apollo program. While that program had a singular technical objective as its outcome, that objective was so technologically advanced for its day and so complex that it required a massive redirection of state-level epistemological functions; in essence, a broad and diverse set of state-level agencies suddenly found themselves working together towards a common epistemological goal. This paper does not call for a repeat of the Apollo program. Rather, its purpose is to investigate the minimum structural requirements for a national state-level epistemological strategy in the United States. In addition, this paper seeks to analyze how the epistemological work of the multitude of national agencies within the United States would be affected by such a high-level framework. This paper is an exploratory study of this type of framework. The primary hypothesis of the author is that such a function is possible but would require extensive re-framing and reclassification of traditional epistemological functions at the respective agency level. In much the same way that, for example, DHS (the Department of Homeland Security) evolved to respond to a new type of security threat to the United States, it is theorized that a lack of coordination and alignment in epistemological functions will equally result in a strategic threat to the United States.

Keywords: strategic security, epistemological functions, epistemological agencies, Apollo program

Procedia PDF Downloads 77
3261 A Time-Reducible Approach to Compute Determinant |I-X|

Authors: Wang Xingbo

Abstract:

Computation of determinants of the form |I-X| is primary and fundamental because it can help to compute many other determinants. This article puts forward a time-reducible approach to computing the determinant |I-X|. The approach is derived from Newton's identity, and its time complexity is no greater than that of computing the eigenvalues of the square matrix X. Mathematical derivations and a numerical example are presented in detail for the approach. Compared with classical approaches, the new approach is shown to be superior, and its computational time naturally decreases as the efficiency of computing the eigenvalues of the square matrix improves.
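
As a plain, unoptimized illustration of the underlying idea (not the paper's refined scheme), the sketch below uses Newton's identities to recover the elementary symmetric functions of the eigenvalues from the traces of matrix powers and then evaluates |I-X| as their alternating sum.

```python
import numpy as np

def det_I_minus_X(X):
    """Compute det(I - X) from power sums p_k = tr(X^k) via Newton's identities.
    det(I - X) = prod_i (1 - lambda_i) = sum_{k=0}^{n} (-1)^k e_k(lambda)."""
    n = X.shape[0]
    p = np.zeros(n + 1)
    Xk = np.eye(n)
    for k in range(1, n + 1):
        Xk = Xk @ X
        p[k] = np.trace(Xk)

    e = np.zeros(n + 1)
    e[0] = 1.0
    for k in range(1, n + 1):
        # Newton's identity: k * e_k = sum_{i=1}^{k} (-1)^(i-1) * e_{k-i} * p_i
        e[k] = sum((-1) ** (i - 1) * e[k - i] * p[i] for i in range(1, k + 1)) / k

    return sum((-1) ** k * e[k] for k in range(n + 1))

X = np.random.default_rng(3).random((6, 6)) * 0.3
print(det_I_minus_X(X), np.linalg.det(np.eye(6) - X))   # the two values agree
```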

Keywords: algorithm, determinant, computation, eigenvalue, time complexity

Procedia PDF Downloads 415
3260 Study of Circulatory MiR-122 and MiR-130a Expression among Chronic Hepatitis C Egyptian Patients

Authors: Hend K. Moosa, Eman A. Rashwan, Ezzat M. Hassan, Amany A. Ghazy, Amel G. Sheredy

Abstract:

The stability of microRNAs (miRs) in the circulation holds great promise for the discovery of non-invasive diagnostic and prognostic biomarkers in many diseases. In the present study, circulatory miR-122 and miR-130a were analysed in chronic hepatitis C Egyptian patients for their value in predicting the clinical outcome of interferon treatment. In addition, their expression levels were correlated with viral RNA levels, necro-inflammatory markers (AST, ALT), and each other. The study was conducted on 51 subjects: 36 chronic HCV patients, divided into naive and interferon-treated patients (responders and non-responders), and 15 matched healthy controls. Serum quantification of miR-122 and miR-130a was performed by quantitative real-time polymerase chain reaction (qRT-PCR). The results showed a significant upregulation of miR-122 in non-responder patients (P=0.049). By receiver operating characteristic curve analysis, miR-122 showed 65% sensitivity and 92.3% specificity in predicting non-responsiveness of patients to IFN treatment, while miR-130a showed a sensitivity of 100% and a specificity of 53.85%. Remarkably, there was a significant positive correlation between miR-122 and miR-130a in naive HCV patients (r=0.714, p=0.003). However, there was no significant correlation between serum miR-122 or miR-130a expression levels and the necro-inflammatory markers (AST, ALT). To conclude, miR-122 and miR-130a have a significant association with viral RNA levels, and accordingly they may act synergistically in promoting viral replication. Interestingly, miR-122 and miR-130a have predictive power for the clinical outcome of IFN treatment, which can be further studied with currently used drugs in order to reduce the socio-economic burden of potential non-responders.

Keywords: hepatitis C, microRNA, miR-122, miR-130a

Procedia PDF Downloads 170
3259 Semiotics of the New Commercial Music Paradigm

Authors: Mladen Milicevic

Abstract:

This presentation addresses how the statistical analysis of digitized popular music influences music creation and emotionally manipulates consumers. Furthermore, it deals with the semiological aspect of the uniformization of musical taste for the purpose of predicting the potential revenues generated by popular music sales. In the USA, we live in an age in which most of the popular music (i.e., music that generates substantial revenue) has been digitized. It is safe to say that almost everything produced in the last 10 years is already digitized (available on iTunes, Spotify, YouTube, or some other platform). Depending on marketing viability and its potential to generate additional revenue, most of the "older" music is still being digitized. Once the music is turned into a digital audio file, it can be computer-analyzed in all kinds of respects, and the same goes for the lyrics, because they also exist as digital text files to which any kind of NCapture-like analysis may be applied. So, by employing statistical examination of different popular music metrics such as tempo, form, pronouns, introduction length, song length, archetypes, subject matter, and repetition of the title, the commercial result may be predicted. Polyphonic HMI (Human Media Interface) introduced the concept of the hit song science computer program in 2003. The company asserted that machine learning could create a music profile to predict hit songs from their audio features. Thus, it has been established that a successful pop song must: have 100 bpm or more; have an 8-second intro; use the pronoun 'you' within 20 seconds of the start of the song; hit the bridge (middle 8) between 2 minutes and 2 minutes 30 seconds; average 7 repetitions of the title; and create some expectation and fulfill that expectation in the title. For a country song: 100 bpm or less for a male artist; a 14-second intro; use of the pronoun 'you' within the first 20 seconds of the intro; a bridge (middle 8) between 2 minutes and 2 minutes 30 seconds; 7 repetitions of the title; and the creation of an expectation that is fulfilled within 60 seconds. This approach to commercial popular music minimizes the human influence over which "artist" a record label is going to sign and market. Twenty years ago, music experts in the A&R (Artists and Repertoire) departments of the record labels made personal aesthetic judgments based on their extensive experience in the music industry. Now, computer music-analysis programs are replacing them in an attempt to minimize the investment risk of panicking record labels, in an environment where nobody can predict the future of the recording industry. The impact on consumers' taste through the narrow bottleneck of the record labels' music selection described above has created some very peculiar effects, not only on the taste of popular music consumers but also on the creative chops of music artists. What this semiological shift means is the main focus of this research and presentation.

Keywords: music, semiology, commercial, taste

Procedia PDF Downloads 393
3258 A Systematic Literature Review of the Influence of New Media-Based Interventions on Drug Abuse

Authors: Wen Huei Chou, Te Lung Pan, Tsu Wen Yeh

Abstract:

New media have recently received increasing attention as a new communication form. The COVID-19 outbreak has pushed people's lifestyles into the digital age, and the drug market has infiltrated formal e-commerce platforms. The self-media boom has fostered growth in online drug myths. To set the record straight, it is imperative to develop new media-based interventions. However, the usefulness of new media on this issue has not yet been fully examined. This study selected 13 articles on the development of new media-based interventions to prevent drug abuse from the Airiti Library and PubMed as of October 3, 2021. The key conclusions are that (1) new media have a significantly positive influence on skills, self-efficacy, and behavior; (2) most interventions package traditional course learning into new media formats; and (3) new media can create a covert, interactive environment that cannot be replicated offline, which may merit attention in future research.

Keywords: drug abuse, interventions, new media, systematic review

Procedia PDF Downloads 152
3257 Effect of Angles Collision, Absorption, Dash and Their Relationship with the Finale Results Case the Algerian Elite Team Triple Jump

Authors: Guebli Abdelkader, Zerf Mohammed, Mekkades Moulay Idriss, BenGoua Ali, Atouti Nouredinne, Habchi Nawel

Abstract:

The paper aims to show the influence of joint angles on the results of the triple jump, since a series of motions characterized by complex angles in the three phases (hop, step, and jump) combines with the push-off of each phase to shape the final result. For this purpose, our data were obtained from the 2013 National Athletics Championship, which was filmed and analyzed with the Kinovea software. Based on the statistical analysis, we confirm a positive relationship between the leg angle, hip angle, and trunk angle at the collision (touchdown) during the hop, step, and jump, and a negative correlation for the knee angle at the collision.

Keywords: kinematics variables, the triple jump, the finale results, digital achievement

Procedia PDF Downloads 327
3256 Vertical Uplift Capacity of a Group of Equally Spaced Helical Screw Anchors in Sand

Authors: Sanjeev Mukherjee, Satyendra Mittal

Abstract:

This paper presents experimental investigations on the behaviour of groups of single, double, and triple helical screw anchors embedded vertically at the same level in sand. The tests were carried out on one, two, three, and four anchors in sand at different embedment depths, covering both shallow and deep modes of behaviour. The testing program included 48 tests conducted on three model anchors installed in sand whose density was kept constant throughout the tests. It was observed that the ultimate pullout load varied significantly with the installation depth of the anchor and the number of anchors. The apparent coefficient of friction (f*) between anchor and soil was also calculated from the test results. It was found that the apparent coefficient of friction varies between 1.02 and 4.76 for groups of 1, 2, 3, and 4 single, double, and triple helical screw anchors. Plate load tests conducted on the model soil showed that the value of φ increases from 35° for virgin soil to 48° for soil with four double-helix screw anchors. Graphs of the ultimate pullout capacity of groups of two, three, and four anchors relative to a single anchor were plotted, and design equations correlating them have been proposed. Based on these findings, it is concluded that the load-displacement relationships for all groups can be reduced to a common curve. A 3-D finite element model in PLAXIS was used to confirm the results obtained from the laboratory tests, and the agreement is excellent.

Keywords: apparent coefficient of friction, helical screw anchor, installation depth, plate load test

Procedia PDF Downloads 555
3255 An Intelligent Text Independent Speaker Identification Using VQ-GMM Model Based Multiple Classifier System

Authors: Ben Soltane Cheima, Ittansa Yonas Kelbesa

Abstract:

Speaker Identification (SI) is the task of establishing the identity of an individual based on his or her voice characteristics. The SI task is typically achieved by two-stage signal processing: training and testing. The training process calculates speaker-specific feature parameters from the speech and generates speaker models accordingly. In the testing phase, speech samples from unknown speakers are compared with the models and classified. Even though the performance of speaker identification systems has improved due to recent advances in speech processing techniques, there is still room for improvement. In this paper, a Closed-Set Text-Independent Speaker Identification System (CISI) based on a Multiple Classifier System (MCS) is proposed, using Mel Frequency Cepstrum Coefficients (MFCC) for feature extraction and a suitable combination of vector quantization (VQ) and Gaussian Mixture Models (GMM), together with the Expectation Maximization (EM) algorithm, for speaker modeling. The use of a Voice Activity Detector (VAD) with a hybrid approach based on Short Time Energy (STE) and statistical modeling of background noise in the pre-processing step of the feature extraction yields a better and more robust automatic speaker identification system. Investigation of the Linde-Buzo-Gray (LBG) clustering algorithm for initialization of the GMM, whose underlying parameters are estimated in the EM step, also improved the convergence rate and system performance. The system further uses a relative index as a confidence measure when the GMM and VQ identification results contradict each other. Simulation results on the voxforge.org speech database, carried out in MATLAB, highlight the efficacy of the proposed method compared to earlier work.
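
The modeling chain described above (MFCC features, GMMs trained with EM, maximum-likelihood scoring) can be sketched in a few lines; librosa and scikit-learn are used here purely for illustration, the signals are synthetic stand-ins for real enrollment and test utterances, and the VQ/LBG initialization, VAD, and confidence-index stages are omitted.

```python
import numpy as np
import librosa
from sklearn.mixture import GaussianMixture

SR = 16000
rng = np.random.default_rng(0)

def fake_utterance(f0, seconds=2.0):
    """Synthetic stand-in for one speaker's utterance (a noisy harmonic tone)."""
    t = np.arange(int(SR * seconds)) / SR
    return (np.sin(2 * np.pi * f0 * t)
            + 0.3 * np.sin(2 * np.pi * 2 * f0 * t)
            + 0.05 * rng.standard_normal(t.size))

def mfcc_features(signal):
    return librosa.feature.mfcc(y=signal, sr=SR, n_mfcc=13).T   # frames x 13

# Train one GMM speaker model per enrolled "speaker" (EM runs inside fit())
models = {}
for spk, f0 in {"speaker_A": 120.0, "speaker_B": 210.0}.items():
    feats = mfcc_features(fake_utterance(f0))
    models[spk] = GaussianMixture(n_components=4, covariance_type="diag",
                                  random_state=0).fit(feats)

# Identify an unknown utterance by the highest average log-likelihood
test = mfcc_features(fake_utterance(210.0))
scores = {spk: gmm.score(test) for spk, gmm in models.items()}
print(max(scores, key=scores.get), scores)
```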

Keywords: feature extraction, speaker modeling, feature matching, Mel frequency cepstrum coefficient (MFCC), Gaussian mixture model (GMM), vector quantization (VQ), Linde-Buzo-Gray (LBG), expectation maximization (EM), pre-processing, voice activity detection (VAD), short time energy (STE), background noise statistical modeling, closed-set text-independent speaker identification system (CISI)

Procedia PDF Downloads 309
3254 Fabric Drapemeter Development towards the Analysis of Its Behavior in 3-D Design

Authors: Aida Sheeta, M. Nashat Fors, Sherwet El Gholmy, Marwa Issa

Abstract:

Globalization has raised customer preferences not only for high-quality garments but also for well-fitting, comfortable, and aesthetic apparel. This can only be accomplished through a good interaction between the fabric's mechanical and physical properties and the required style. Consequently, this paper provides an integrated review of fabric drape terminology, since drape is an essential feature by which a fabric forms folds under gravity. Moreover, an instrument has been fabricated in order to analyze the static and dynamic drape behaviors of different fabric types. In addition, the obtained results identify the parameters affecting the drape coefficient, determined using digital image processing, for various kinds of commercial fabrics. This is an essential first step toward analyzing the behavior of a fabric when it is made up into a particular 3-D garment design.

Keywords: cloth fitting, fabric drape nodes, garment silhouette, image processing

Procedia PDF Downloads 188
3253 Wikipedia World: A Computerized Process for Cultural Heritage Data Dissemination

Authors: L. Rajaonarivo, M. N. Bessagnet, C. Sallaberry, A. Le Parc Lacayrelle, L. Leveque

Abstract:

TCVPYR is a European FEDER (European Regional Development Fund) project which aims to promote tourism in the French Pyrenees region by leveraging its cultural heritage. It involves scientists from various domains (geographers, historians, anthropologists, computer scientists...). This paper presents a fully automated process to publish any dataset as Wikipedia articles as well as the corresponding linked information on Wikidata and Wikimedia Commons. We validate this process on a sample of geo-referenced cultural heritage data collected by TCVPYR researchers in different regions of the Pyrenees. The main result concerns the technological prerequisites, which are now in place. Moreover, we demonstrated that we can automatically publish cultural heritage data on Wikimedia.

Keywords: cultural heritage dissemination, digital humanities, open data, Wikimedia automated publishing

Procedia PDF Downloads 127
3252 Model Evaluation of Action Potential Block in Whole-Animal Nerves Induced by Ultrashort, High-Intensity Electric Pulses

Authors: Jiahui Song

Abstract:

There have been decades of research into action potential block in nerves. To the best of our knowledge, applied electrical voltages can reversibly block the conduction of action potentials across whole-animal nerves. Blocking biological electrical signaling pathways has a variety of applications in muscular and sensory incapacitation and in clinical research, including urethral pressure reduction and relief of chronic pain arising from peripheral nerve injury. The ability to arrest conduction has also been used in muscle activation and fatigue reduction. Ultrashort, high-intensity electric pulses modulate the membrane conductivity through the electroporation process, blocking nerve conduction. Nanopore formation on the membrane surface increases the local membrane conductivity and effectively "shorts out" the trans-membrane potential of the nerve, which inhibits action potential propagation. This block is similar in concept to stopping the propagation of an air-pressure wave down a "leaky" pipe. This research focuses on a distributed electrical model with an additional time-dependent membrane conductance to calculate the poration induced by the ultrashort, high-intensity electric pulses. The changes in membrane conductivity are used to predict changes in action potential transmission. A strength-duration (SD) curve is generated for action potential blockage and can be used as a design guide for benchmarking safety thresholds or setting the pulse voltage and/or duration necessary for neuromuscular incapacitation.
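
The "shorting-out" effect of poration on the trans-membrane potential can be illustrated with a single-patch passive membrane model, as in the sketch below: when a large poration conductance is switched on, the same stimulating current can no longer depolarize the patch to a firing threshold. All parameter values are generic illustrative choices, and this single compartment is a drastic simplification of the paper's distributed nerve model.

```python
import numpy as np

# Passive membrane patch: C_m dV/dt = -(g_m + g_ep)*(V - V_rest) + I_stim
C_m, g_m, V_rest = 1.0, 0.1, -70.0        # uF/cm^2, mS/cm^2, mV (illustrative)
I_stim, threshold = 2.0, -55.0            # uA/cm^2, mV
dt, steps = 0.01, 5000                    # ms

def peak_depolarization(g_ep):
    """Peak membrane potential for a given added poration conductance (mS/cm^2)."""
    V = V_rest
    peak = V
    for _ in range(steps):
        dV = (-(g_m + g_ep) * (V - V_rest) + I_stim) / C_m
        V += dt * dV
        peak = max(peak, V)
    return peak

for g_ep in (0.0, 1.0, 10.0):
    peak = peak_depolarization(g_ep)
    fires = peak >= threshold             # crude stand-in for "action potential possible"
    print(f"g_ep={g_ep:5.1f} mS/cm^2  peak V={peak:6.1f} mV  reaches threshold: {fires}")
```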

Keywords: action potential, ultrashort, high-intensity, nerve, strength-duration

Procedia PDF Downloads 18
3251 Media Literacy Development: A Methodology to Systematically Integrate Post-Contemporary Challenges in Early Childhood Education

Authors: Ana Mouta, Ana Paulino

Abstract:

The following text presents the ik.model, a theoretical framework that has guided the pedagogical implementation of meaningful educational technology-based projects in formal education worldwide. In this paper, we focus on how this framework has enabled the development of media literacy projects for early childhood education over the last three years. The methodology that guided educators through the challenge of systematically merging analogue and digital means into dialogic, high-quality opportunities for world exploration is explained in the following lines. The effects of this methodology on early-age media literacy development are considered, as is the relevance of this skill for the post-contemporary challenges posed to learning.

Keywords: early learning, ik.model, media literacy, pedagogy

Procedia PDF Downloads 324
3250 Five Years Analysis and Mitigation Plans on Adjustment Orders Impacts on Projects in Kuwait's Oil and Gas Sector

Authors: Rawan K. Al-Duaij, Salem A. Al-Salem

Abstract:

Projects, the unique and temporary processes of achieving a set of requirements, have always been challenging: planning the schedule and budget and managing resources and risks are mostly driven by similar past experience or technical consultation with experts in the matter. Given the complexity of projects in scope, time, and execution environment, Adjustment Orders are tools for reflecting changes to the original project parameters after contract signature. Adjustment Orders are the official, legal amendments to the terms and conditions of a live contract. Reasons for issuing Adjustment Orders arise from changes in contract scope, technical requirements, and specifications, resulting in scope addition, deletion, or alteration; they can also combine several of these parameters, resulting in an increase or decrease in time and/or cost. Most business leaders (handling projects in the interest of the owner) refrain from using Adjustment Orders, given their main objectives of staying within budget and on schedule. Success in managing the changes results in uninterrupted execution and agreed project costs and schedule; nevertheless, this is not always practically achievable. In this paper, a detailed study using Industrial Engineering and Systems Management tools such as Six Sigma, data analysis, and quality control was carried out on the organization's five-year record of issued Adjustment Orders in order to investigate their prevalence and their time and cost impacts. The analysis helped identify and categorize the predominant causes with the highest impacts, which informed the recommended corrective measures aimed at minimizing the impact of Adjustment Orders. Data analysis demonstrated no specific trend in Adjustment Order frequency over the past five years; however, the time impact is greater than the cost impact. Although Adjustment Orders may never be entirely avoidable, this analysis offers some insight into the procedural gaps and where they most affect the organization. Possible solutions are proposed, such as improving the project handling team's coordination and communication, utilizing a blanket service contract, and modifying the project gate system procedures, in order to minimize the possibility of similar struggles in the future. Projects in the oil and gas sector are always evolving and demand a certain amount of flexibility to sustain the goals of the field. As will be demonstrated, the uncertainty of project parameters, inadequate project definition, operational constraints, and stringent procedures are the main factors creating the need for Adjustment Orders, and accordingly the recommendation is to address that challenge.

Keywords: adjustment orders, data analysis, oil and gas sector, systems management

Procedia PDF Downloads 165
3249 Arabic Handwriting Recognition Using Local Approach

Authors: Mohammed Arif, Abdessalam Kifouche

Abstract:

Optical character recognition (OCR) plays a major role at the present time. It is capable of solving many serious problems and simplifying human activities. OCR dates back to the 1970s, and since then many solutions have been proposed, but unfortunately most supported nothing but Latin scripts. This work proposes a recognition system for off-line Arabic handwriting. The system is based on a structural segmentation method and uses support vector machines (SVM) in the classification phase. We present a state of the art of character segmentation methods, followed by an overview of the OCR area, and we also address the normalization problems we encountered. After a comparison between Arabic handwritten characters and the segmentation methods, we introduce our contribution in the form of a segmentation algorithm.
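
For the classification phase, a hedged sketch of an SVM character classifier is shown below; scikit-learn's digits dataset is used as a stand-in for segmented, normalized Arabic character images, since the paper's own dataset and structural segmentation step are not reproduced here.

```python
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Stand-in data: 8x8 glyph images flattened to feature vectors,
# playing the role of normalized, segmented handwritten characters
digits = load_digits()
X_train, X_test, y_train, y_test = train_test_split(
    digits.data, digits.target, test_size=0.25, random_state=0)

# RBF-kernel SVM on standardized features, as in the classification phase above
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=10, gamma="scale"))
clf.fit(X_train, y_train)
print("test accuracy:", clf.score(X_test, y_test))
```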

Keywords: OCR, segmentation, Arabic characters, PAW, post-processing, SVM

Procedia PDF Downloads 72
3248 Plant Disease Detection Using Image Processing and Machine Learning

Authors: Sanskar, Abhinav Pal, Aryush Gupta, Sushil Kumar Mishra

Abstract:

One of the critical and tedious tasks in agricultural practice is the detection of diseases on vegetation. Plant diseases are common, and their early detection is important because it reduces control efforts on large productive farms; agricultural production, in turn, is very important in today's economy. Using digital image processing and machine learning algorithms, this paper presents a method for plant disease detection, with the disease detected on different leaves of the plant. The proposed system for plant disease detection is simple and computationally efficient, requiring less time than learning-based approaches. The detection accuracy for various plant and foliar diseases is calculated and presented in this paper.
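
A minimal sketch of one common image-processing step in such pipelines is shown below: segmenting a leaf in HSV space and measuring the fraction of non-green (potentially diseased) leaf area as a simple feature; the synthetic leaf image and thresholds are illustrative assumptions, not the paper's exact method.

```python
import numpy as np
from skimage.color import rgb2hsv

# Synthetic RGB leaf image: green leaf on a dark background with a brown lesion
img = np.zeros((100, 100, 3))
img[20:80, 20:80] = (0.2, 0.6, 0.2)      # healthy green tissue
img[40:55, 45:60] = (0.45, 0.3, 0.1)     # brown diseased spot

hsv = rgb2hsv(img)
leaf = hsv[..., 2] > 0.1                                     # brighter than background
green = (hsv[..., 0] > 0.2) & (hsv[..., 0] < 0.45) & leaf    # hue band for healthy tissue
diseased = leaf & ~green

severity = diseased.sum() / leaf.sum()    # fraction of leaf area flagged as diseased
print(f"diseased leaf fraction: {severity:.2%}")
```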

Keywords: plant diseases, machine learning, image processing, deep learning

Procedia PDF Downloads 10
3247 Speaker Identification by Atomic Decomposition of Learned Features Using Computational Auditory Scene Analysis Principals in Noisy Environments

Authors: Thomas Bryan, Veton Kepuska, Ivica Kostanic

Abstract:

Speaker recognition is performed in high Additive White Gaussian Noise (AWGN) environments using principles of Computational Auditory Scene Analysis (CASA). CASA methods often classify sounds from images in the time-frequency (T-F) plane, using spectrograms or cochleagrams as the image. In this paper, atomic decomposition implemented by matching pursuit performs a transform from time-series speech signals to the T-F plane. The atomic decomposition creates a sparsely populated T-F vector in "weight space" where each populated T-F position contains an amplitude weight. The weight-space vector, along with the atomic dictionary, represents a denoised, compressed version of the original signal. The arrangement of the atomic indices in the T-F vector is used for classification. Unsupervised feature learning implemented by a sparse autoencoder learns a single dictionary of basis features from a collection of envelope samples from all speakers. The approach is demonstrated using pairs of speakers from the TIMIT data set. Pairs of speakers are selected randomly from a single district. Each speaker has 10 sentences: two are used for training and 8 for testing. Atomic index probabilities are created for each training sentence and also for each test sentence. Classification is performed by finding the lowest Euclidean distance between the probabilities from the training sentences and those from the test sentences. Training is done at a 30 dB Signal-to-Noise Ratio (SNR). Testing is performed at SNRs of 0 dB, 5 dB, 10 dB, and 30 dB. The algorithm has a baseline classification accuracy of ~93%, averaged over 10 pairs of speakers from the TIMIT data set. The baseline accuracy is attributable to the short sequences of training and test data as well as the overall simplicity of the classification algorithm. The accuracy is not affected by AWGN, remaining at ~93% at 0 dB SNR.
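
To illustrate the atomic-decomposition step, the sketch below runs a basic matching pursuit over a small dictionary of Gabor-like atoms, producing the kind of sparse index/weight representation described above; the dictionary size, test signal, and stopping rule are illustrative assumptions and much smaller than a real T-F decomposition of speech would use.

```python
import numpy as np

N = 256
t = np.arange(N)

# Small dictionary of Gabor-like atoms: Gaussian-windowed sinusoids on a coarse T-F grid
atoms = []
for center in range(0, N, 32):
    for freq in np.linspace(0.02, 0.4, 8):
        atom = np.exp(-0.5 * ((t - center) / 16.0) ** 2) * np.cos(2 * np.pi * freq * t)
        atoms.append(atom / np.linalg.norm(atom))
D = np.array(atoms)                     # (n_atoms, N), unit-norm rows

def matching_pursuit(signal, D, n_iter=10):
    """Greedy matching pursuit: repeatedly pick the atom most correlated
    with the residual and subtract its contribution."""
    residual = signal.copy()
    decomposition = []                  # list of (atom index, amplitude weight)
    for _ in range(n_iter):
        corr = D @ residual
        k = int(np.argmax(np.abs(corr)))
        w = corr[k]
        decomposition.append((k, w))
        residual = residual - w * D[k]
    return decomposition, residual

# Test signal built from two dictionary atoms plus noise
x = 1.5 * D[10] + 0.8 * D[40] + 0.05 * np.random.default_rng(4).standard_normal(N)
indices, residual = matching_pursuit(x, D)
print(indices[:4], "residual energy:", float(residual @ residual))
```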

Keywords: time-frequency plane, atomic decomposition, envelope sampling, Gabor atoms, matching pursuit, sparse dictionary learning, sparse autoencoder

Procedia PDF Downloads 290
3246 Truthful or Untruthful Social Media Posts: Applying Statement Analysis to Decode online Deception

Authors: Christa L. Arnold, Margaret C. Stewart

Abstract:

This research shares the results of an exploratory study examining Statement Analysis (SA) as a means of detecting deception in truthful and untruthful social media posts. Applying SA, a law enforcement methodology used in criminal interview statements, this research analyzes what is stated in order to assist in evaluating written deceptive information. Preliminary findings reveal qualitative and quantitative nuances for SA in online deception detection and uncover insights regarding digital deceptive behavior. Thus far, findings reveal that truthful statements tend to differ from untruthful statements in both content and quality.

Keywords: deception detection, online deception, social media content, statement analysis

Procedia PDF Downloads 65