Search results for: ad-hoc mesh networks
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 3175

715 Multimodal Sentiment Analysis With Web Based Application

Authors: Shreyansh Singh, Afroz Ahmed

Abstract:

Sentiment analysis aims to automatically uncover the underlying attitude that we hold towards an entity. The aggregate of such sentiment over a population represents opinion polling and has numerous applications. Current text-based sentiment analysis relies on word embeddings and machine learning models that learn sentiment from large text corpora. Sentiment analysis from text is now widely used for customer satisfaction assessment and brand perception analysis. With the expansion of social media, multimodal sentiment analysis is set to bring new opportunities through the arrival of complementary data streams that improve on and go beyond text-based sentiment analysis, using recent transformer methods. Since sentiment can be detected through the affective traces it leaves, such as facial and vocal displays, multimodal sentiment analysis offers promising avenues for analyzing facial and vocal expressions in addition to the transcript or textual content. These approaches use recurrent neural networks (RNNs) with LSTM units to increase their performance. In this study, we define sentiment and the problem of multimodal sentiment analysis and review recent developments in multimodal sentiment analysis across several domains, including spoken reviews, images, video blogs, and human-machine and human-human interactions. Challenges and opportunities of this emerging field are also discussed, supporting our thesis that multimodal sentiment analysis holds significant untapped potential.
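As a sketch of how the modalities discussed above can be combined (the scores and weights below are hypothetical illustrations, not the authors' model), late fusion takes a weighted average of per-modality sentiment predictions:

```python
# Hypothetical per-modality sentiment scores in [-1, 1] for one utterance,
# e.g. produced by separate text, audio, and visual LSTM models.
modality_scores = {"text": 0.6, "audio": 0.2, "visual": -0.1}

# Hypothetical fusion weights reflecting how much each modality is trusted.
fusion_weights = {"text": 0.5, "audio": 0.3, "visual": 0.2}

def late_fusion(scores, weights):
    """Weighted late fusion of per-modality sentiment scores."""
    total_w = sum(weights.values())
    return sum(weights[m] * scores[m] for m in scores) / total_w

fused = late_fusion(modality_scores, fusion_weights)
print(round(fused, 3))  # 0.5*0.6 + 0.3*0.2 + 0.2*(-0.1) = 0.34
```

In practice the per-modality scores would come from trained models, and the fusion weights themselves can be learned.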

Keywords: sentiment analysis, RNN, LSTM, word embeddings

Procedia PDF Downloads 112
714 The Analysis of Internet and Social Media Behaviors of the Students in Vocational High School

Authors: Mehmet Balci, Sakir Tasdemir, Mustafa Altin, Ozlem Bozok

Abstract:

Our globalizing world has become almost a small village, and everyone can access any information at any time. People let each other know who does what and where, and we can learn which social events occur anywhere in the world. From the perspective of education, the course notes that a lecturer uses in a university in any state of America can be examined by a student studying in a city in Africa or the Far East. This dizzying communication has happened thanks to fast developments in computer technologies and, in parallel, internet technology. Turkey, which has a very large young population and a rapidly evolving electronic communications infrastructure, has been affected by these worldwide developments. Research has shown that almost all young people in Turkey have an account in a social network. In particular, the spread of mobile devices has caused data traffic in social networks to increase. In this study, students of different age groups at the Selcuk University Vocational School of Technical Sciences, Department of Computer Technology, were surveyed, and their opinions about the use of the internet and social media were obtained. Internet and social media skills, purposes of use, frequency of use, access facilities and tools, and effects on social life and vocational education were explored. By evaluating the findings from various aspects, both the positive and negative effects of internet and social media use on the students of this department were identified, and relations and differences were determined statistically.

Keywords: computer technologies, internet use, social network, higher vocational school

Procedia PDF Downloads 534
713 Computer-Integrated Surgery of the Human Brain, New Possibilities

Authors: Ugo Galvanetto, Pirto G. Pavan, Mirco Zaccariotto

Abstract:

The discipline of computer-integrated surgery (CIS) will provide equipment able to improve the efficiency of healthcare systems and, which is more important, clinical results. Surgeons and machines will cooperate in new ways that will extend surgeons’ ability to train, plan and carry out surgery. Patient-specific CIS of the brain requires several steps: 1 – Fast generation of brain models: based on MR images and equipped with artificial intelligence, image recognition techniques should differentiate among all brain tissues and segment them. After that, automatic mesh generation should create the mathematical model of the brain in which the various tissues (white matter, grey matter, cerebrospinal fluid …) are clearly located in the correct positions. 2 – Reliable and fast simulation of the surgical process: computational mechanics will be the crucial aspect of the entire procedure, and new algorithms will be used to simulate the mechanical behaviour of cutting through cerebral tissues. 3 – Real-time provision of visual and haptic feedback: a sophisticated human-machine interface based on ergonomics and psychology will provide the feedback to the surgeon. The present work addresses point 2 in particular. Modelling the cutting of soft tissue in a structure as complex as the human brain is an extremely challenging problem in computational mechanics. The finite element method (FEM), which accurately represents complex geometries and accounts for material and geometrical nonlinearities, is the most used computational tool to simulate the mechanical response of soft tissues. However, the main drawback of FEM lies in the mechanics theory on which it is based, classical continuum mechanics, which assumes matter is a continuum with no discontinuities. FEM must resort to complex tools such as pre-defined cohesive zones, external phase-field variables, and demanding remeshing techniques to include discontinuities.
However, all approaches to equip FEM computational methods with the capability to describe material separation, such as interface elements with cohesive zone models, XFEM, element erosion, and phase-field, have drawbacks that make them unsuitable for surgery simulation. Interface elements require a priori knowledge of crack paths. The use of XFEM in 3D is cumbersome. Element erosion does not conserve mass. The phase-field approach adopts a diffusive crack model instead of describing the true tissue separation typical of surgical procedures. Modelling discontinuities, so difficult for computational approaches based on classical continuum mechanics, is instead easy for novel computational methods based on peridynamics (PD). PD is a non-local theory of mechanics formulated without spatial derivatives. Its governing equations are valid at points or surfaces of discontinuity, and it is therefore especially suited to describe crack propagation and fragmentation problems. Moreover, PD does not require any criterion to decide the direction of crack propagation or the conditions for crack branching or coalescence; in PD-based computational methods, cracks develop spontaneously in the way that is most convenient from an energy point of view. Therefore, in PD computational methods, crack propagation in 3D is as easy as it is in 2D, a remarkable advantage with respect to all other computational techniques.
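As a sketch of why PD copes naturally with discontinuities: in bond-based peridynamics (the standard textbook form, with the usual symbols rather than any notation from this abstract), the acceleration of a material point is driven by an integral of pairwise bond forces over its neighborhood (horizon), not by spatial derivatives, so the governing equation remains valid at a crack surface:

```latex
% Bond-based peridynamic equation of motion: rho is density, u displacement,
% b the body force density, f the pairwise bond-force function, and
% H_x the horizon (neighborhood) of the point x.
\rho(\mathbf{x})\,\ddot{\mathbf{u}}(\mathbf{x},t)
  = \int_{H_{\mathbf{x}}}
      \mathbf{f}\big(\mathbf{u}(\mathbf{x}',t)-\mathbf{u}(\mathbf{x},t),\;
                     \mathbf{x}'-\mathbf{x}\big)\, dV_{\mathbf{x}'}
  + \mathbf{b}(\mathbf{x},t)
```

Because the right-hand side only integrates bond forces, breaking a bond (removing it from the integrand) models material separation directly, with no remeshing or crack-path criterion.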

Keywords: computational mechanics, peridynamics, finite element, biomechanics

Procedia PDF Downloads 70
712 The Information-Seeking Behaviour of Kuwaiti Judges (KJs)

Authors: Essam Mansour

Abstract:

The key purpose of this study is to show the information-seeking behaviour of Kuwaiti Judges (KJs). Being one of the few studies about information needs and information-seeking behaviour conducted in Arab and developing countries, this study is a pioneer among studies of information seeking, especially with this significant group of information users. The authors investigated this behaviour in terms of KJs' thoughts, perceptions, motivations, techniques, preferences, tools, and barriers met when seeking information. The authors employed a questionnaire, with a response rate of 77.2 percent. This study showed that most KJs tended to be older and educated, with work experience ranging from novice to highly experienced. There is a statistically significant difference between KJs' demographic characteristics and some sources of information, such as books, encyclopedias, references, and mass media. KJs were using information moderately to make decisions, to keep up with current events, to collect statistics, and to conduct specific or general research. The office and the home were the locations from which KJs most frequently accessed information. KJs' proficiency in English was described as moderately good, and a small number of them reported fair proficiency in French. Assistance provided by colleagues, followed by consultants, translators, secretaries, and librarians, was found to be the kind of assistance most needed when seeking information. Mobile apps, followed by PCs, information networks (the internet), and information databases, were the technology tools most used by KJs. Printed materials, followed by non-printed and audiovisual materials, were the information formats KJs most preferred. Language, the recency and location of information, and the library's limited role in delivering information were significant barriers to KJs when seeking information.

Keywords: information users, information-seeking behaviour, information needs, judges, Kuwait

Procedia PDF Downloads 301
711 Survival Strategies of Street Children Using the Urban Space: A Case Study at Sealdah Railway Station Area, Kolkata, West Bengal, India

Authors: Sibnath Sarkar

Abstract:

Developing countries are facing many social problems, and in India, too, there are several such problems; the problem of street children is one of them. No country or city anywhere in the world today is without the presence of street children, but the problem is most acute in developing countries. Thousands of street children can be seen in populous Indian cities like Mumbai, Kolkata, Delhi, and Chennai, most of them in the age group of 5-15 years, and their number is increasing gradually. Poverty, unemployment, rapid urbanization, and rural-urban migration are the root causes of the street children phenomenon. Deprived of many of their basic rights, these children have escaped to the street as a safe place for living. Street children are always associated with urban spaces in the developing world, and they represent a sad outcome of the rapid urbanization process. After coming to the streets, these children have to cope with new situations every day. They also adopt or develop many complex survival strategies and a variety of informal or even illegal activities in public space, and they form supportive social networks in order to survive street life. Street children use various suitable urban spaces as spots for earning, living, and entertainment. Therefore, the livelihoods of young people on the street should be analyzed in relation to the spaces they use, as well as their age and length of stay on the streets. This paper tries to explore the livelihood strategies and coping situation of street children in the Sealdah station area. One hundred seventy-five street-living children living in and around the railway station are included in the study.

Keywords: strategies, street children, survive, urban-space

Procedia PDF Downloads 349
710 A Qualitative Study of a Workplace International Employee Health Program

Authors: Jennifer Bradley

Abstract:

With opportunities to live and work abroad on the rise, effective preparation and support for international employees need to be addressed within the worksite. International employees must build new habits, routines, and social networks in an unfamiliar culture. Culture shock typically occurs within the first year and can affect both physical and psychological health. Employers have the opportunity to support staff through the adaptation process and foster healthy habits and routines. Cross-cultural training that includes a combination of instructional teaching, cultural experiences, and practice is shown to improve the international employee adaptation process. However, little evidence demonstrates that organizations provide all of these aspects for international employees. The occupational therapy practitioner (OTP) offers a unique perspective, focusing on the employee's transactional relationship with the environment and engagement in meaningful occupations to enhance and enable participation in roles, habits, and routines within new cultural contexts. This paper examines one such program developed and implemented by an OTP at the New England Center for Children in Abu Dhabi, United Arab Emirates. The effectiveness of the program was assessed via participant feedback, and it was concluded that an international employee support program that focuses on a variety of meaningful experiences and knowledge can empower employees to navigate healthy practices, develop habits and routines, and foster positive inter-cultural relationships in the organization and community.

Keywords: occupational therapy practitioner, cross cultural training, international employee health, international employee support

Procedia PDF Downloads 153
709 Ant Lion Optimization in a Fuzzy System for Benchmark Control Problem

Authors: Leticia Cervantes, Edith Garcia, Oscar Castillo

Abstract:

Today, there are many control problems in which the main objective is to obtain the best control for the case under study and to decrease the error in the application. Many techniques can be used to address these problems, such as neural networks, PID control, fuzzy logic, optimization techniques, and many more. In this work, fuzzy logic, in the form of a fuzzy system, and an optimization technique are used: Ant Lion Optimization (ALO) is applied to optimize a fuzzy system that controls the velocity of a simple treadmill. The main objective is to achieve velocity control in this problem using ALO. First, a simple fuzzy system with two inputs (error and error change) and one output (desired speed) was used to control the velocity of the treadmill. Results were obtained, but to decrease the error, ALO was then developed to optimize the fuzzy system of the treadmill. With the optimized system, the simulation was performed, and the results show that with ALO the velocity control was better than with the conventional fuzzy system. This paper describes some basic concepts to help understand the idea of the work and the methodology of the investigation (control problem, fuzzy system design, optimization); the results are then presented for both systems. A comparison between the simple fuzzy system and the optimized fuzzy system shows, with good results, that the optimization improved the control. The major finding of the study is that ALO is a good alternative for improving control, because it helped to decrease the error in control applications regardless of the underlying control technique being optimized. Finally, it is important to mention that the selected methodology proved suitable, because the control of the treadmill was improved using the optimization technique.
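As a sketch of the kind of two-input fuzzy controller described above (the membership functions, rule base, and defuzzification below are hypothetical illustrations, not the authors' design or the ALO-tuned parameters):

```python
def tri(x, a, b, c):
    """Triangular membership function with feet a, c and peak b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

# Hypothetical fuzzy sets over normalized error and error-change in [-1, 1].
NEG = (-1.0, -1.0, 0.0)   # negative
ZER = (-1.0, 0.0, 1.0)    # zero
POS = (0.0, 1.0, 1.0)     # positive

# Rule consequents as crisp speed adjustments (a zero-order Sugeno shortcut).
RULES = [
    (NEG, NEG, -0.8), (NEG, ZER, -0.4), (NEG, POS, 0.0),
    (ZER, NEG, -0.4), (ZER, ZER, 0.0),  (ZER, POS, 0.4),
    (POS, NEG, 0.0),  (POS, ZER, 0.4),  (POS, POS, 0.8),
]

def fuzzy_speed(error, d_error):
    """Weighted-average defuzzification over the nine rules."""
    num = den = 0.0
    for e_set, de_set, out in RULES:
        w = min(tri(error, *e_set), tri(d_error, *de_set))
        num += w * out
        den += w
    return num / den if den else 0.0

print(fuzzy_speed(0.0, 0.0))  # 0.0: no error, no speed adjustment
```

In an ALO-optimized version, the membership-function breakpoints (and possibly the rule consequents) would be the decision variables tuned to minimize the control error.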

Keywords: ant lion optimization, control problem, fuzzy control, fuzzy system

Procedia PDF Downloads 392
708 Metabolome-based Profiling of African Baobab Fruit (Adansonia Digitata L.) Using a Multiplex Approach of MS and NMR Techniques in Relation to Its Biological Activity

Authors: Marwa T. Badawy, Alaa F. Bakr, Nesrine Hegazi, Mohamed A. Farag, Ahmed Abdellatif

Abstract:

Diabetes mellitus (DM) is a chronic disease affecting a large population worldwide. Africa is rich in native medicinal plants with myriad health benefits, though they are less explored for the development of specific drug therapies, as in diabetes. This study aims to determine the in vivo antidiabetic potential of the well-reported and traditionally used fruits of baobab (Adansonia digitata L.) using an STZ-induced diabetic model. The in vitro cytotoxic and antioxidant properties were examined using an MTT assay on L-929 fibroblast cells and DPPH antioxidant assays, respectively. The extract showed minimal cytotoxicity, with an IC50 value of 105.7 µg/mL. Histopathological and immunohistochemical investigations showed the hepatoprotective and renoprotective effects of A. digitata fruit extract, implying its protective effects against diabetes complications. These findings were further supported by biochemical assays, which showed that i.p. injection of a low dose (150 mg/kg) of A. digitata twice a week lowered fasting blood glucose levels, the lipid profile, and hepatic and renal markers. For a comprehensive overview of the extract's metabolite composition, ultrahigh-performance liquid chromatography (UHPLC) coupled to high-resolution tandem mass spectrometry (HRMS/MS), in synchronization with molecular networks, led to the annotation of 77 metabolites, among which 50% are reported for the first time in A. digitata fruits.

Keywords: Adansonia digitata, diabetes mellitus, metabolomics, streptozotocin, Sprague-Dawley rats

Procedia PDF Downloads 155
707 Large-Scale Simulations of Turbulence Using Discontinuous Spectral Element Method

Authors: A. Peyvan, D. Li, J. Komperda, F. Mashayek

Abstract:

Turbulence can be observed in a variety of fluid motions in nature and industrial applications. Recent investment in high-speed aircraft and propulsion systems has revitalized fundamental research on turbulent flows. In these systems, capturing chaotic fluid structures with different length and time scales is accomplished through the Direct Numerical Simulation (DNS) approach, since it accurately simulates flows down to the smallest dissipative scales, i.e., Kolmogorov's scales. The discontinuous spectral element method (DSEM) is a high-order technique that uses spectral functions for approximating the solution. The DSEM code has been developed by our research group over the course of more than two decades. Recently, the code has been improved to run large cases on the order of billions of solution points. Running big simulations requires a considerable amount of RAM. Therefore, the DSEM code must be highly parallelized and able to start on multiple computational nodes on an HPC cluster with distributed memory. However, some pre-processing procedures, such as determining global element information, creating a global face list, and assigning global partitioning and element connection information of the domain for communication, must be done sequentially on a single processing core. A separate code has been written to perform the pre-processing procedures on a local machine. It stores the minimum amount of information required for the DSEM code to start in parallel, extracted from the mesh file, into text files (pre-files). It packs integer-type information in a stream binary format into pre-files that are portable between machines. The files are generated to ensure fast read performance on different file systems, such as Lustre and the General Parallel File System (GPFS). A new subroutine has been added to the DSEM code to read the startup files using parallel MPI I/O, for Lustre, in a way that each MPI rank acquires its information from the file in parallel. In the case of GPFS, on each computational node a single MPI rank reads data from the file, which is specifically generated for that computational node, and sends it to the other ranks on the node using point-to-point non-blocking MPI communication. This way, communication takes place locally on each node and signals do not cross the switches of the cluster. The read subroutine has been tested on Argonne National Laboratory's Mira (GPFS), the National Center for Supercomputing Applications' Blue Waters (Lustre), the San Diego Supercomputer Center's Comet (Lustre), and UIC's Extreme (Lustre). The tests showed that one file per node is suited for GPFS and that parallel MPI I/O is the best choice for the Lustre file system. The DSEM code relies on heavily optimized linear algebra operations, such as matrix-matrix and matrix-vector products, for calculation of the solution in every time step. For this, the code can make use of its own matrix math library, BLAS, Intel MKL, or ATLAS. This fact and the discontinuous nature of the method make the DSEM code run efficiently in parallel. The results of weak scaling tests performed on Blue Waters showed a scalable and efficient performance of the code in parallel computing.
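As a sketch of the pre-file idea (the layout below is a hypothetical illustration, not the actual DSEM pre-file format): fixing the byte order and integer width when packing makes the binary file readable on any machine, regardless of its native endianness.

```python
# Pack integer mesh information into a machine-portable binary "pre-file":
# an explicit little-endian 32-bit layout, so the reader never depends on
# the writer's native byte order.
import struct

def write_prefile(path, element_ids):
    """Write a count followed by 32-bit little-endian element ids."""
    with open(path, "wb") as f:
        f.write(struct.pack("<i", len(element_ids)))
        f.write(struct.pack(f"<{len(element_ids)}i", *element_ids))

def read_prefile(path):
    """Read back the ids written by write_prefile."""
    with open(path, "rb") as f:
        (n,) = struct.unpack("<i", f.read(4))
        return list(struct.unpack(f"<{n}i", f.read(4 * n)))

write_prefile("rank0.pre", [10, 11, 42])
print(read_prefile("rank0.pre"))  # [10, 11, 42]
```

In the one-file-per-node scheme described above, each node's reader rank would open only its own such file and then distribute the data with non-blocking point-to-point messages.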

Keywords: computational fluid dynamics, direct numerical simulation, spectral element, turbulent flow

Procedia PDF Downloads 129
706 Dental Ethics versus Malpractice, as Phenomenon with a Growing Trend

Authors: Saimir Heta, Kers Kapaj, Rialda Xhizdari, Ilma Robo

Abstract:

Dealing with emerging cases of dental malpractice, with justifications that stem from the clear rules of dental ethics, is a phenomenon with an increasing trend in today's dental practice. Dentists should clearly understand how far the limit of malpractice goes, with or without minimal or major consequences for the affected patient, and when harm can be justified as a complication of dental treatment under the rules of dental ethics in the dental office. Indeed, malpractice can occur in cases of lack of professionalism, but it can also come as a consequence of anatomical and physiological limitations in the implementation of the dental protocols predetermined and indicated for the patient in the treatment plan section of his personal card. This is a review study aiming at the latest findings published in the literature on this problem. Keywords were combined in such a way as to give the necessary scope for collecting the relevant information from publication networks in this field, always from the point of view of the dentist rather than that of the lawyer or jurist. The findings included in this article show that the diversity of approaches to the phenomenon depends on the legal basis of the different countries. There is a lack of, or only a small number of, articles that touch on this topic, and these articles present a limited amount of data on the subject. Conclusions: Dental malpractice should not be hidden under the guise of various dental complications justified by the strict rules of ethics for patients treated in the dental chair. Individual experiences of dental malpractice must be published so as to serve as a source of experience for future generations of dentists.

Keywords: dental ethics, malpractice, professional protocol, random deviation

Procedia PDF Downloads 90
705 Iterative Segmentation and Application of Hausdorff Dilation Distance in Defect Detection

Authors: S. Shankar Bharathi

Abstract:

Inspection of surface defects on metallic components has always been challenging due to their specular property. Occurrences of defects such as scratches, rust, and pitting are very common on metallic surfaces during the manufacturing process. These defects, if unchecked, can hamper the performance and reduce the lifetime of such components. Many of the conventional image processing algorithms for detecting surface defects generally involve segmentation techniques based on thresholding, edge detection, watershed segmentation, and textural segmentation. They later employ other suitable algorithms, based on morphology, region growing, shape analysis, or neural networks, for classification purposes. In this paper, the work has been focused only on detecting scratches. Global and other thresholding techniques were used to extract the defects, but they proved inaccurate in extracting the defects alone. However, this paper does not focus on a comparison of different segmentation techniques; rather, it describes a novel approach to segmentation combined with the Hausdorff dilation distance. The proposed algorithm is based on the distribution of intensity levels, that is, whether a certain gray level is concentrated or evenly distributed. The algorithm is based on the extraction of such concentrated pixels. Defective images showed a high concentration of some gray levels, whereas in non-defective images there seemed to be no concentration, with gray levels evenly distributed. This formed the basis for detecting the defects in the proposed algorithm. The Hausdorff dilation distance, based on mathematical morphology, was used to strengthen the segmentation of the defects.
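As a sketch of the distance underlying the approach (shown here in point-set form on pixel coordinates with hypothetical data; the paper's version is computed via morphological dilation of binary images), the Hausdorff distance measures how far two pixel sets are from each other in the worst case:

```python
# Symmetric Hausdorff distance between two sets of pixel coordinates.
import math

def directed_hausdorff(A, B):
    """Max over a in A of the distance from a to its nearest point in B."""
    return max(min(math.dist(a, b) for b in B) for a in A)

def hausdorff(A, B):
    return max(directed_hausdorff(A, B), directed_hausdorff(B, A))

scratch = [(0, 0), (1, 1), (2, 2)]     # hypothetical segmented defect pixels
reference = [(0, 0), (1, 1), (2, 3)]   # hypothetical reference pixels
print(hausdorff(scratch, reference))   # 1.0: the sets differ by one pixel
```

The dilation form used in the paper obtains the same quantity by growing one set with successive dilations until it covers the other.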

Keywords: metallic surface, scratches, segmentation, hausdorff dilation distance, machine vision

Procedia PDF Downloads 420
704 An Integrated Framework for Seismic Risk Mitigation Decision Making

Authors: Mojtaba Sadeghi, Farshid Baniassadi, Hamed Kashani

Abstract:

One of the challenging issues faced by seismic retrofitting consultants and employers is quick decision-making on the demolition or retrofitting of a structure, at the current time or in the future. The existing models proposed by researchers have each covered only one of the aspects of cost, execution method, and structural vulnerability. Given the effect of each factor on the final decision, it is crucial to devise a new comprehensive model capable of covering all the factors simultaneously. This study attempted to provide an integrated framework that can be utilized to select the most appropriate earthquake risk mitigation solution for buildings. This framework can overcome the limitations of current models by taking into account several factors, such as cost, execution method, risk-taking, and structural failure. In the newly proposed model, the database and essential information about retrofitting projects are developed based on historical data on retrofit projects. In the next phase, an analysis is conducted to assess the vulnerability of the building under study. Then, an artificial neural network technique is employed to estimate the cost of retrofitting. While calculating the current price of the structure, an economic analysis is conducted to compare demolition versus retrofitting costs. At the next stage, the optimal method is identified. Finally, the implementation of the framework was demonstrated by collecting data concerning 155 previous projects.

Keywords: decision making, demolition, construction management, seismic retrofit

Procedia PDF Downloads 231
703 Enhancer: An Effective Transformer Architecture for Single Image Super Resolution

Authors: Pitigalage Chamath Chandira Peiris

Abstract:

A widely researched domain in the field of image processing in recent times has been single-image super-resolution, which tries to restore a high-resolution image from a single low-resolution image. Many single-image super-resolution efforts have been completed utilizing both traditional and deep learning methodologies, as well as a variety of other approaches. Deep learning-based super-resolution methods, in particular, have received significant interest. As of now, the most advanced image restoration approaches are based on convolutional neural networks; nevertheless, only a few efforts have been made using Transformers, which have demonstrated excellent performance on high-level vision tasks. The effectiveness of CNN-based algorithms in image super-resolution has been impressive. However, these methods cannot completely capture the non-local features of the data. Enhancer is a simple yet powerful Transformer-based approach for enhancing the resolution of images. In this study, a method for single-image super-resolution was developed that utilizes an efficient and effective transformer design. The proposed architecture makes use of a locally enhanced window transformer block to alleviate the enormous computational load associated with non-overlapping window-based self-attention. Additionally, it incorporates depth-wise convolution in the feed-forward network to enhance its ability to capture local context. The study is assessed by comparing the results obtained on popular datasets to those obtained by other techniques in the domain.
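As a sketch of the window partition step behind non-overlapping window-based self-attention (NumPy, with hypothetical sizes; not the Enhancer implementation), attention is computed within each small window of tokens instead of over the whole image, which is where the computational saving comes from:

```python
# Partition an (H, W, C) feature map into non-overlapping w-by-w windows.
import numpy as np

def window_partition(x, w):
    """Split an (H, W, C) feature map into (num_windows, w*w, C) tokens."""
    H, W, C = x.shape
    x = x.reshape(H // w, w, W // w, w, C)   # expose the window grid
    x = x.transpose(0, 2, 1, 3, 4)           # group each window's pixels
    return x.reshape(-1, w * w, C)           # one token sequence per window

feat = np.arange(8 * 8 * 3, dtype=np.float32).reshape(8, 8, 3)
windows = window_partition(feat, 4)
print(windows.shape)  # (4, 16, 3): four 4x4 windows of 16 tokens each
```

Self-attention over a full 8x8 map compares 64x64 token pairs; over four 4x4 windows it compares only 4 x (16x16) pairs, and the ratio improves quadratically with image size.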

Keywords: single image super resolution, computer vision, vision transformers, image restoration

Procedia PDF Downloads 100
702 A Picture is worth a Billion Bits: Real-Time Image Reconstruction from Dense Binary Pixels

Authors: Tal Remez, Or Litany, Alex Bronstein

Abstract:

The pursuit of smaller pixel sizes at ever-increasing resolutions in digital image sensors is mainly driven by the stringent price and form-factor requirements of sensors and optics in the cellular phone market. Recently, Eric Fossum proposed a novel concept of an image sensor with dense sub-diffraction-limit one-bit pixels (jots), which can be considered a digital emulation of silver halide photographic film. This idea has recently been embodied as the EPFL Gigavision camera. A major bottleneck in the design of such sensors is the image reconstruction process, which produces a continuous high dynamic range image from oversampled binary measurements. The extreme quantization of the Poisson statistics is incompatible with the assumptions of most standard image processing and enhancement frameworks. The recently proposed maximum-likelihood (ML) approach addresses this difficulty but suffers from image artifacts and has impractically high computational complexity. In this work, we study a variant of a sensor with binary threshold pixels and propose a reconstruction algorithm combining an ML data-fitting term with a sparse synthesis prior. We also show an efficient hardware-friendly real-time approximation of this inverse operator. Promising results are shown on synthetic data as well as on HDR data emulated using multiple exposures of a regular CMOS sensor.
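As a sketch of the ML data-fitting idea (simplified here to a single output pixel with unit-threshold jots; the paper's operator also adds a sparse prior and operates on whole images): under Poisson statistics with photon rate λ, a one-bit pixel reads 1 with probability 1 − e^(−λ), so from oversampled binary measurements with a fraction p of ones the ML rate estimate is λ̂ = −ln(1 − p).

```python
# Closed-form ML photon-rate estimate from oversampled one-bit measurements.
import math

def ml_rate(binary_samples):
    """ML rate estimate lambda-hat = -ln(1 - p) from threshold-1 jots."""
    p = sum(binary_samples) / len(binary_samples)
    p = min(p, 1.0 - 1e-12)  # guard against all-ones saturation
    return -math.log(1.0 - p)

# 16 hypothetical jot readings for one output pixel: half of them fired.
jots = [1, 0, 1, 1, 0, 0, 1, 0, 1, 0, 1, 0, 0, 1, 1, 0]
print(round(ml_rate(jots), 4))  # -ln(0.5) = 0.6931
```

The saturation guard hints at why the raw ML estimate is ill-behaved (it diverges as p approaches 1), which is one motivation for regularizing it with a sparse synthesis prior as the paper does.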

Keywords: binary pixels, maximum likelihood, neural networks, sparse coding

Procedia PDF Downloads 197
701 Vehicle Routing Problem Considering Alternative Roads under Triple Bottom Line Accounting

Authors: Onur Kaya, Ilknur Tukenmez

Abstract:

In this study, we consider vehicle routing problems on networks with alternative direct links between nodes, and we analyze a multi-objective problem considering financial, environmental, and social objectives in this context. In real life, there might exist several alternative direct roads between two nodes, and these roads might differ in their lengths and durations. For example, a road might be shorter than another but require a longer time due to traffic and speed limits. Similarly, some toll roads might be shorter or faster but require additional payment, leading to higher costs. We consider such alternative links in our problem and develop a mixed integer linear programming model that determines which alternative link to use between two nodes, in addition to determining the optimal routes for different vehicles, depending on the model objectives and constraints. We consider minimum-cost routing as the financial objective for the company, minimizing CO2 emissions and gas usage as the environmental objectives, and optimizing drivers' working conditions/working hours and minimizing the risk of accidents as the social objectives. With these objective functions, we aim to determine which routes and which alternative links should be used, in addition to the speed choices on each link. We discuss the results of the developed vehicle routing models and compare their results depending on the system parameters.
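As a sketch of the alternative-link choice in isolation (hypothetical network data and objective weights, not the paper's MILP), each candidate road between a pair of nodes can be scored by one weighted cost that combines the financial, environmental, and social objectives, keeping the cheapest alternative:

```python
# (length km, toll cost, CO2 kg, accident-risk score) per alternative road
# between the same two nodes -- all values are hypothetical.
links = {
    "highway":  (50.0, 8.0, 11.0, 0.2),
    "toll":     (42.0, 15.0, 9.0, 0.1),
    "backroad": (55.0, 0.0, 13.0, 0.6),
}

# Hypothetical weights: distance cost, toll (financial), CO2 (environmental),
# accident risk (social).
W = (0.1, 1.0, 0.5, 10.0)

def link_cost(attrs, w=W):
    """Scalarized multi-objective cost of one alternative link."""
    return sum(wi * ai for wi, ai in zip(w, attrs))

best = min(links, key=lambda name: link_cost(links[name]))
print(best)  # "backroad": lowest combined cost under these weights
```

In the full MILP, such per-link costs appear in the objective while binary variables select exactly one alternative per traversed node pair, jointly with the route and speed decisions.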

Keywords: vehicle routing, alternative links between nodes, mixed integer linear programming, triple bottom line accounting

Procedia PDF Downloads 400
700 Seaworthiness and Liability Risks Involving Technology and Cybersecurity in Transport and Logistics

Authors: Eugene Wong, Felix Chan, Linsey Chen, Joey Cheung

Abstract:

The widespread use of technologies and cyber/digital means for complex maritime operations has led to a sharp rise in global cyber-attacks. These attacks have generated an increasing number of liability disputes, insurance claims, and legal proceedings. An array of antiquated case law, regulations, international conventions, and obsolete contractual clauses drafted in the pre-technology era has become grossly inadequate for addressing the contemporary challenges. This paper offers a critique of the ambiguity of cybersecurity liabilities under the obligation of seaworthiness entailed in the Hague-Visby Rules, which apply either by law in a large number of jurisdictions or by express incorporation into shipping documents. This paper also evaluates the legal and technological criteria for assessing whether a vessel is properly equipped with the latest offshore technologies for navigation and cargo delivery operations. Examples include computer applications, networks and servers, enterprise systems, global positioning systems, and data centers. A critical analysis of carriers' obligations to exercise due diligence in preventing or mitigating cyber-attacks is also conducted in this paper. It is hoped that the present study will offer original and crucial insights to policymakers, regulators, carriers, cargo interests, and insurance underwriters closely involved in dispute prevention and resolution arising from cybersecurity liabilities.

Keywords: seaworthiness, cybersecurity, liabilities, risks, maritime, transport

Procedia PDF Downloads 130
699 Geopolitical Implications and the Role of LinkedIn in the Russo-Ukrainian War: A Comprehensive Analysis of Social Media in Crisis Situations

Authors: Amber Brittain-Hale

Abstract:

This research investigates the evolving role of social media in crisis situations by employing discourse analysis methodology, homing in on the Russo-Ukrainian War and particularly Ukraine's use of LinkedIn. The study posits that social media platforms, such as LinkedIn, play a crucial role in shaping communication, disseminating information, and influencing geopolitical strategies during conflicts. Focusing on Ukraine's official state account on LinkedIn and analyzing its posts and interactions, the research aims to unveil discourse dynamics in high-stakes scenarios and provide valuable insights for leaders navigating complex global challenges. A comprehensive analysis of the data will contribute to a deeper understanding of the tactics adopted by political leaders in managing communication, the bidirectional nature of discourse provided by online social networks, and the rapid advancement of technology that has led to the growing significance of social media platforms in crisis situations. Through this approach, the geopolitical factors that influenced the country's social media strategy during the Russo-Ukrainian War will be illuminated, offering a broader perspective on the role of social media in such challenging times. Ultimately, the study seeks to uncover lessons that can be drawn from Ukraine's LinkedIn approach, informing future strategies for utilizing social media during crises and advancing the understanding of how social media can be harnessed to address intricate global issues.

Keywords: Russo-Ukrainian War, social media, crisis, discourse analysis

Procedia PDF Downloads 108
698 Structural Health Monitoring of Offshore Structures Using Wireless Sensor Networking under Operational and Environmental Variability

Authors: Srinivasan Chandrasekaran, Thailammai Chithambaram, Shihas A. Khader

Abstract:

Early-stage damage detection in offshore structures requires continuous structural health monitoring, and for large structures the positioning of sensors also plays an important role in efficient damage detection. Determining the dynamic behavior of offshore structures requires dense deployment of sensors. Wired Structural Health Monitoring (SHM) systems are highly expensive and need considerable installation space. Wireless sensor networks can enhance an SHM system through the deployment of a scalable sensor network that consumes less space. This paper presents the results of a wireless sensor network based structural health monitoring method applied to a scaled experimental model of an offshore structure subjected to wave loading. The method determines the serviceability of the offshore structure under various environmental loads. Wired and wireless sensors were installed in the model, and the response of the scaled BLSRP model under wave loading was recorded. The wireless system discussed in this study is a Raspberry Pi board with an ARMv6 processor, programmed to transmit the data acquired by the sensors to the server using a Wi-Fi adapter; the data are then hosted on a webpage. The data acquired from the wireless and wired SHM systems were compared, and the design of the wireless system was verified.
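The wired/wireless comparison at the end of the abstract can be quantified with a simple agreement metric. The following sketch (our illustration, not the paper's own validation procedure) computes the normalized RMS difference between the two response records:

```python
import math

def rms_error(wired, wireless):
    """Normalized RMS difference between wired and wireless response
    records sampled at the same instants: 0 means perfect agreement,
    and larger values mean the wireless record deviates more from the
    wired reference."""
    assert len(wired) == len(wireless)
    num = sum((a - b) ** 2 for a, b in zip(wired, wireless))
    den = sum(a ** 2 for a in wired)
    return math.sqrt(num / den)
```

A small threshold on this metric is one way to decide that the cheaper wireless deployment reproduces the wired measurements closely enough for serviceability assessment.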

Keywords: condition assessment, damage detection, structural health monitoring, structural response, wireless sensor network

Procedia PDF Downloads 271
697 Freedom of Speech and Involvement in Hatred Speech on Social Media Networks

Authors: Sara Chinnasamy, Michelle Gun, M. Adnan Hashim

Abstract:

The Federal Constitution guarantees Malaysians the right to free speech and expression, yet hatred speech can be commonly found on social media platforms such as Facebook, Twitter, and Instagram. In the Malaysian social media sphere, most hatred speech involves religion, race and politics. Recent cases of racial attacks on social media have created social tension among Malaysians. Many Malaysians always argue about their right to freedom of speech. However, there are laws that limit their expression to the public and protect social media users from becoming victims of hate speech. This paper aims to explore the attitude and involvement of Malaysian netizens regarding freedom of speech and hatred speech on social media. It also examines the relationship between involvement in hatred speech among Malaysian netizens and attitudes towards freedom of speech. For most Malaysians, practicing total freedom of speech in the open is unthinkable. As a result, the best channel to articulate their feelings and opinions liberally is the internet. With the advent of the internet medium, more and more Malaysians are conveying their viewpoints through the various internet channels, although the sensitivity of the audience is seldom taken into account. Consequently, this situation has led to pockets of social disharmony among the citizens. Although this unhealthy activity is denounced by the authorities, netizens are generally of the view that they have the right to write anything they want. Using the quantitative method, a survey was conducted among Malaysians aged between 18 and 50 years who are active social media users. Results from the survey reveal that despite a weak relationship between hatred speech involvement on social media and attitude towards freedom of speech, the association is still significant. As such, it can be safely presumed that hatred speech on social media occurs due to the freedom of speech that exists by way of social media channels.

Keywords: freedom of speech, hatred speech, social media, Malaysia, netizens

Procedia PDF Downloads 446
696 Developing Performance Model for Road Side Elements Receiving Periodic Maintenance

Authors: Ayman M. Othman, Hassan Y. Ahmed, Tallat A. Ali

Abstract:

Inadequate maintenance programs and the limited funds allocated to highway networks in developing countries have led to fast deterioration of road side elements. Therefore, this research focuses on developing a performance model for the periodic maintenance of road side elements. Road side elements that receive periodic maintenance include earthen shoulders, road signs and traffic markings. Using the level-of-service concept, the developed model can determine the optimal periodic maintenance intervals for those elements based on a selected level of service compatible with the available periodic maintenance budget. Data related to the time periods for the progressive deterioration stages of the chosen elements were collected. Ten maintenance experts in Aswan, Sohag and Assiut cities were interviewed for that purpose. The time in months corresponding to 10%, 25%, 40%, 50%, 75%, 90% and 100% deterioration of each road side element was estimated based on the experts' opinions. Least-squares regression analysis has shown that a power function represents the best fit for earthen shoulder edge drop-off and road sign damage over time. It was also evident that the progressive dirtiness of road signs could be represented by a quadratic function, while a linear function could represent the paint degradation of both traffic markings and road signs. Actual measurements of earthen shoulder edge drop-off agree considerably with the developed model.
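A power function y = a·t^b can be fitted to deterioration-versus-time data by ordinary least squares in log-log space, since ln y = ln a + b·ln t is linear. A minimal sketch on synthetic data (the data points are made up; the paper's fits come from the expert survey):

```python
import math

def fit_power(t, y):
    """Least-squares fit of y = a * t**b via linear regression on
    (ln t, ln y), as used for deterioration curves such as shoulder
    edge drop-off versus time. Requires strictly positive t and y."""
    lx = [math.log(v) for v in t]
    ly = [math.log(v) for v in y]
    n = len(t)
    mx, my = sum(lx) / n, sum(ly) / n
    b = sum((x - mx) * (v - my) for x, v in zip(lx, ly)) \
        / sum((x - mx) ** 2 for x in lx)
    a = math.exp(my - b * mx)
    return a, b

# Synthetic deterioration data following y = 2 * t**0.5 exactly
t = [1, 4, 9, 16]
y = [2 * math.sqrt(v) for v in t]
a, b = fit_power(t, y)
```

The same routine fits a linear or quadratic model after swapping the transform, which is how the three element types in the study would each get their own curve.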

Keywords: deterioration, level of service, periodic maintenance, performance model, road side element

Procedia PDF Downloads 565
695 Resiliency in Fostering: A Qualitative Study of Highly Experienced Foster Parents

Authors: Ande Nesmith

Abstract:

There is an ongoing shortage of foster parents worldwide to take on a growing population of children in need of out-of-home care. Currently, resources are primarily aimed at recruitment rather than retention. Retention rates are extraordinarily low, especially in the first two years of fostering. Qualitative interviews with 19 foster parents averaging 20 years of service provided insight into the challenges they faced and how they overcame them. Thematic analysis of interview transcripts identified sources of stress and resiliency. Key stressors included lack of support and responsiveness from the children’s social workers, false maltreatment allegations, and secondary trauma from children’s destructive behaviors and emotional dysregulation. Resilient parents connected with other foster parents for support, engaged in creative problem-solving, recognized that positive feedback from children usually arrives years later, and through training, understood the neurobiological impact of trauma on child behavior. Recommendations include coordinating communication between the foster parent licensing agency social workers and the children’s social workers, creating foster parent support networks and mentoring, and continuous training on trauma including effective parenting strategies. Research is needed to determine whether these resilience indicators in fact lead to long-term retention. Policies should include a mechanism to develop a cohesive line of communication and connection between foster parents and the children’s social workers as well as their respective agencies.

Keywords: foster care stability, foster parent burnout, foster parent resiliency, foster parent retention, trauma-informed fostering

Procedia PDF Downloads 344
694 Task Validity in Neuroimaging Studies: Perspectives from Applied Linguistics

Authors: L. Freeborn

Abstract:

Recent years have seen an increasing number of neuroimaging studies related to language learning as imaging techniques such as fMRI and EEG have become more widely accessible to researchers. By using a variety of structural and functional neuroimaging techniques, these studies have already made considerable progress in terms of our understanding of neural networks and processing related to first and second language acquisition. However, the methodological designs employed in neuroimaging studies to test language learning have been questioned by applied linguists working within the field of second language acquisition (SLA). One of the major criticisms is that tasks designed to measure language learning gains rarely have a communicative function, and seldom assess learners’ ability to use the language in authentic situations. This brings the validity of many neuroimaging tasks into question. The fundamental reason why people learn a language is to communicate, and it is well-known that both first and second language proficiency are developed through meaningful social interaction. With this in mind, the SLA field is in agreement that second language acquisition and proficiency should be measured through learners’ ability to communicate in authentic real-life situations. Whilst authenticity is not always possible to achieve in a classroom environment, the importance of task authenticity should be reflected in the design of language assessments, teaching materials, and curricula. Tasks that bear little relation to how language is used in real-life situations can be considered to lack construct validity. This paper first describes the typical tasks used in neuroimaging studies to measure language gains and proficiency, then analyses to what extent these tasks can validly assess these constructs.

Keywords: neuroimaging studies, research design, second language acquisition, task validity

Procedia PDF Downloads 131
693 Irrigation Potential Assessment for Eastern Ganga Canal, India Using Geographic Information System

Authors: Deepak Khare, Radha Krishan, Bhaskar Nikam

Abstract:

The present study deals with the results of Ortho-rectified Cartosat-1 PAN (2.5 m resolution) satellite data analysis for the extraction of canal networks under the Eastern Ganga Canal (EGC) command. Based on the information derived from the remote sensing data in terms of the number of canals, their physical status and their hydraulic connectivity from the source, the irrigation potential (IP) created in the command was assessed by comparison with the planned/design canal-wise irrigation potentials. All the geospatial information generated in the study is organized in a geodatabase. The EGC project irrigates the command through one main canal, five branch canals, 36 distributaries and 186 minors. The study was conducted with the main objectives of inventorying and mapping the irrigation infrastructure using a geographic information system (GIS), remote sensing and field data. Likewise, the assessment of the irrigation potential created using the mapped infrastructure was performed as of March 2017. Results revealed that the canals were not only incomplete but also had gaps and were hydraulically disconnected in each branch canal and in the minors of the EGC. A total of 16622.3 ha of command area remained unserved by canal water solely due to the presence of gaps in the EGC project. The sum of all the gaps present in the minor canals was 11.92 km, while in the distributaries it was 2.63 km. This is a favourable scenario, as the balance IP can be achieved by working on the gaps present in the minor canals. Filling the gaps in the minor canals can bring most of the area under irrigation, especially the tail-reach commands.

Keywords: canal command, GIS, hydraulic connectivity, irrigation potential

Procedia PDF Downloads 139
692 Deep Learning Approach for Chronic Kidney Disease Complications

Authors: Mario Isaza-Ruget, Claudia C. Colmenares-Mejia, Nancy Yomayusa, Camilo A. González, Andres Cely, Jossie Murcia

Abstract:

Quantification of the risks associated with the development of complications from chronic kidney disease (CKD) through accurate survival models can help with patient management. A retrospective cohort that included patients diagnosed with CKD in a primary care program and followed up between 2013 and 2018 was analyzed. Time-dependent and static covariates associated with demographic, clinical, and laboratory factors were included. Deep Learning (DL) survival analyses were developed for three CKD outcomes: CKD stage progression, a >25% decrease in Estimated Glomerular Filtration Rate (eGFR), and Renal Replacement Therapy (RRT). Models were evaluated and compared with Random Survival Forest (RSF) based on the concordance index (C-index) metric. 2,143 patients were included. Two models were developed for each outcome; the Deep Neural Network (DNN) model reported C-index=0.9867 for CKD stage progression, C-index=0.9905 for reduction in eGFR, and C-index=0.9867 for RRT. Regarding the RSF model, C-index=0.6650 was reached for CKD stage progression, C-index=0.6759 for decreased eGFR, and C-index=0.8926 for RRT. DNN models applied in a survival analysis context with consideration of longitudinal covariates at the start of follow-up can predict renal stage progression, a significant decrease in eGFR, and RRT. The success of these survival models lies in the appropriate definition of survival times and the analysis of covariates, especially those that vary over time.
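The C-index used to compare the DNN and RSF models counts concordant comparable pairs: a pair is comparable when the subject with the shorter follow-up time had an event, and concordant when that subject also received the higher predicted risk. A minimal sketch of Harrell's estimator:

```python
def concordance_index(times, events, risks):
    """Harrell's C-index. A pair (i, j) is comparable when subject i
    had an event and a shorter time than j; it counts as concordant
    when i also has the higher predicted risk, and ties in risk
    contribute half."""
    num = den = 0.0
    n = len(times)
    for i in range(n):
        for j in range(n):
            if events[i] and times[i] < times[j]:
                den += 1
                if risks[i] > risks[j]:
                    num += 1
                elif risks[i] == risks[j]:
                    num += 0.5
    return num / den
```

A value of 1.0 means perfect risk ordering and 0.5 means random ordering, which is the scale on which the reported DNN scores (≈0.99) and RSF scores (≈0.67 to 0.89) should be read.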

Keywords: artificial intelligence, chronic kidney disease, deep neural networks, survival analysis

Procedia PDF Downloads 129
691 Variance-Aware Routing and Authentication Scheme for Harvesting Data in Cloud-Centric Wireless Sensor Networks

Authors: Olakanmi Oladayo Olufemi, Bamifewe Olusegun James, Badmus Yaya Opeyemi, Adegoke Kayode

Abstract:

The wireless sensor network (WSN) has made a significant contribution to the emergence of various intelligent services and cloud-based applications. Most of the time, these data are stored on a cloud platform for efficient management and sharing among different services or users. However, the sensitivity of the data makes it prone to various confidentiality and performance-related attacks during and after harvesting. Various security schemes have been developed to ensure the integrity and confidentiality of WSN data. However, their specificity towards particular attacks, together with the resource constraints and heterogeneity of WSNs, makes most of these schemes imperfect. In this paper, we propose a secure variance-aware routing and authentication scheme with two-tier verification to collect, share, and manage WSN data. The scheme is capable of classifying a WSN into different subnets, detecting any attempt at a wormhole or black hole attack during harvesting, and enforcing access control on the harvested data stored in the cloud. The results of the analysis showed that the proposed scheme has more security functionalities than other related schemes, solves most WSN and cloud security issues, prevents wormhole and black hole attacks, identifies attackers during data harvesting, and enforces access control on the harvested data stored in the cloud at low computational, storage, and communication overheads.
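One simple variance-aware heuristic (our illustration, not the paper's exact detector) flags links whose delay deviates from the subnet mean by more than k standard deviations, since wormhole tunnels and black hole drops typically distort per-link delay statistics:

```python
import statistics

def suspicious_links(rtts: dict, k: float = 2.0):
    """Flag links whose round-trip time deviates from the subnet mean
    by more than k population standard deviations -- a variance-based
    heuristic for wormhole-like anomalies. `rtts` maps link id to a
    measured RTT in ms (all values here are hypothetical)."""
    mu = statistics.mean(rtts.values())
    sd = statistics.pstdev(rtts.values())
    if sd == 0:
        return []
    return [link for link, t in rtts.items() if abs(t - mu) > k * sd]
```

In practice the threshold k would be tuned per subnet, since a single extreme outlier inflates the standard deviation and can mask itself at high k.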

Keywords: data block, heterogeneous IoT network, data harvesting, wormhole attack, black hole attack, access control

Procedia PDF Downloads 68
690 Distributed System Computing Resource Scheduling Algorithm Based on Deep Reinforcement Learning

Authors: Yitao Lei, Xingxiang Zhai, Burra Venkata Durga Kumar

Abstract:

As the quantity and complexity of computing in large-scale software systems increase, distributed system computing becomes increasingly important. A distributed system realizes high-performance computing through collaboration between different computing resources. Without efficient resource scheduling, distributed computing may cause resource waste and high costs. However, resource scheduling is usually an NP-hard problem, so no general solution can be found, although optimization algorithms exist, such as genetic algorithms and ant colony optimization. The large scale of distributed systems makes these traditional optimization algorithms challenging to apply, so heuristic and machine learning algorithms are usually used in this situation to ease the computing load. As a result, we review traditional resource scheduling optimization algorithms and introduce a deep reinforcement learning method that utilizes the perceptual ability of neural networks and the decision-making ability of reinforcement learning. Using this machine learning method, we try to find the important factors that influence the performance of distributed system computing and help the distributed system perform efficient computing resource scheduling. This paper surveys the application of deep reinforcement learning to distributed system computing resource scheduling, proposes a deep reinforcement learning method that uses a recurrent neural network to optimize resource scheduling, and discusses challenges and improvement directions for DRL-based resource scheduling algorithms.
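The idea of learning a scheduling policy from rewards can be sketched with tabular Q-learning on a toy two-resource load-balancing task. This is a stand-in for the RNN-based DRL scheduler the paper proposes; the state, reward, and all hyperparameters here are illustrative:

```python
import random

def train_scheduler(episodes=500, alpha=0.1, gamma=0.9, seed=0):
    """Tabular Q-learning on a toy scheduling task: the state is the
    load gap between two resources, the action is which resource gets
    the next unit job, and the reward penalizes imbalance."""
    rng = random.Random(seed)
    Q = {}  # state -> [value of assigning to resource 0, to resource 1]
    for _ in range(episodes):
        loads = [0, 0]
        for _ in range(10):  # ten unit jobs per episode
            state = loads[0] - loads[1]
            q = Q.setdefault(state, [0.0, 0.0])
            # epsilon-greedy action selection
            a = rng.randrange(2) if rng.random() < 0.2 else q.index(max(q))
            loads[a] += 1
            reward = -abs(loads[0] - loads[1])  # balanced loads are good
            nxt = Q.setdefault(loads[0] - loads[1], [0.0, 0.0])
            q[a] += alpha * (reward + gamma * max(nxt) - q[a])
    return Q

Q = train_scheduler()
```

After training, the greedy policy assigns the next job to the less loaded resource, which is exactly the behavior the reward encodes; a real system replaces the tiny state with task and resource features and the table with a (recurrent) network.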

Keywords: resource scheduling, deep reinforcement learning, distributed system, artificial intelligence

Procedia PDF Downloads 102
689 Towards Green(er) Cities: The Role of Spatial Planning in Realising the Green Agenda

Authors: Elizelle Juaneé Cilliers

Abstract:

The green hype is becoming stronger within various disciplines, modern practices and academic thinking, enforced by concepts such as eco-health, eco-tourism, eco-cities, and eco-engineering. There is currently also an expanded scientific understanding regarding the value and benefits relating to green infrastructure, for both communities and their host cities, linked to broader sustainability and resilience thinking. The integration and implementation of green infrastructure as part of spatial planning approaches and municipal planning, are, however, more complex, especially in South Africa, inflated by limitations of budgets and human resources, development pressures, inequities in terms of green space availability and political legacies of the past. The prevailing approach to spatial planning is further contributing to complexity, linked to misguided perceptions of the function and value of green infrastructure. As such, green spaces are often considered a luxury, and green infrastructure a costly alternative, resulting in green networks being susceptible to land-use changes and under-prioritized in local authority decision-making. Spatial planning, in this sense, may well be a valuable tool to realise the green agenda, encapsulating various initiatives of sustainability as provided by a range of disciplines. This paper aims to clarify the importance and value of green infrastructure planning as a component of spatial planning approaches, in order to inform and encourage local authorities to embed sustainability thinking into city planning and decision-making approaches. It reflects on the decisive role of land-use management to guide the green agenda and refers to some recent planning initiatives. Lastly, it calls for trans-disciplinary planning approaches to build a case towards green(er) cities.

Keywords: green infrastructure, spatial planning, transdisciplinary, integrative

Procedia PDF Downloads 245
688 Linguistic Insights Improve Semantic Technology in Medical Research and Patient Self-Management Contexts

Authors: William Michael Short

Abstract:

‘Semantic Web’ technologies such as the Unified Medical Language System Metathesaurus, SNOMED-CT, and MeSH have been touted as transformational for the way users access online medical and health information, enabling both the automated analysis of natural-language data and the integration of heterogeneous health-related resources distributed across the Internet through the use of standardized terminologies that capture concepts and relationships between concepts that are expressed differently across datasets. However, the approaches that have so far characterized ‘semantic bioinformatics’ have not yet fulfilled the promise of the Semantic Web for medical and health information retrieval applications. This paper argues, from the perspective of cognitive linguistics and cognitive anthropology, that four features of human meaning-making must be taken into account before the potential of semantic technologies can be realized for this domain. First, many semantic technologies operate exclusively at the level of the word. However, texts convey meanings in ways beyond lexical semantics. For example, transitivity patterns (distributions of active or passive voice) and modality patterns (configurations of modal constituents like may, might, could, would, should) convey experiential and epistemic meanings that are not captured by single words. Language users also naturally associate stretches of text with discrete meanings, so that whole sentences can be ascribed senses similar to the senses of words (so-called ‘discourse topics’). Second, natural language processing systems tend to operate according to the principle of ‘one token, one tag’. For instance, occurrences of the word sound must be disambiguated for part of speech: in context, is sound a noun or a verb or an adjective? In syntactic analysis, deterministic annotation methods may be acceptable. But because natural language utterances are typically characterized by polyvalency and ambiguities of all kinds (including intentional ambiguities), such methods leave the meanings of texts highly impoverished. Third, ontologies tend to be disconnected from everyday language use and so struggle in cases where single concepts are captured through complex lexicalizations that involve profile shifts or other embodied representations. More problematically, concept graphs tend to capture ‘expert’ technical models rather than ‘folk’ models of knowledge and so may not match users’ common-sense intuitions about the organization of concepts in prototypical structures rather than Aristotelian categories. Fourth, and finally, most ontologies do not recognize the pervasively figurative character of human language. However, since the time of Galen the widespread use of metaphor in the linguistic usage of both medical professionals and lay persons has been recognized. In particular, metaphor is a well-documented linguistic tool for communicating experiences of pain. Because semantic medical knowledge-bases are designed to help capture variations within technical vocabularies – rather than the kinds of conventionalized figurative semantics that practitioners as well as patients actually utilize in clinical description and diagnosis – they fail to capture this dimension of linguistic usage. The failure of semantic technologies in these respects degrades the efficiency and efficacy not only of medical research, where information retrieval inefficiencies can lead to direct financial costs to organizations, but also of care provision, especially in contexts of patients’ self-management of complex medical conditions.

Keywords: ambiguity, bioinformatics, language, meaning, metaphor, ontology, semantic web, semantics

Procedia PDF Downloads 126
687 Enhancing Email Security: A Multi-Layered Defense Strategy Approach and an AI-Powered Model for Identifying and Mitigating Phishing Attacks

Authors: Anastasios Papathanasiou, George Liontos, Athanasios Katsouras, Vasiliki Liagkou, Euripides Glavas

Abstract:

Email remains a crucial communication tool due to its efficiency, accessibility and cost-effectiveness, enabling rapid information exchange across global networks. However, the global adoption of email has also made it a prime target for cyber threats, including phishing, malware and Business Email Compromise (BEC) attacks, which exploit its integral role in personal and professional realms to perpetrate fraud and data breaches. To combat these threats, this research advocates a multi-layered defense strategy incorporating advanced technological tools such as anti-spam and anti-malware software, machine learning algorithms and authentication protocols. Moreover, we developed an artificial intelligence model specifically designed to analyze email headers and assess their security status. This AI-driven model examines various components of email headers, such as 'From' addresses, 'Received' paths and the integrity of SPF, DKIM and DMARC records. Upon analysis, it generates comprehensive reports that indicate whether an email is likely to be malicious or benign. This capability empowers users to identify potentially dangerous emails promptly, enhancing their ability to avoid phishing attacks, malware infections and other cyber threats.
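The header fields the model inspects can be extracted with a standard MIME parser. The sketch below is a rule-based stand-in for the AI model itself, applied to an entirely fabricated example header:

```python
from email import message_from_string

# Fabricated example message with a failing DKIM check
RAW = """\
From: ceo@example.com
Received: from mail.example.com (mail.example.com [203.0.113.5])
Authentication-Results: mx.example.net; spf=pass; dkim=fail; dmarc=pass
Subject: Urgent wire transfer

Please process immediately.
"""

def header_report(raw: str) -> dict:
    """Extract the fields inspected in header analysis -- sender,
    Received path, and SPF/DKIM/DMARC results -- and flag any failed
    check as suspicious."""
    msg = message_from_string(raw)
    auth = msg.get("Authentication-Results", "")
    checks = {k: (f"{k}=pass" in auth) for k in ("spf", "dkim", "dmarc")}
    return {
        "from": msg["From"],
        "received_hops": len(msg.get_all("Received") or []),
        "checks": checks,
        "suspicious": not all(checks.values()),
    }

report = header_report(RAW)
```

A learned model would replace the hard all-checks-pass rule with a score over these same features, but the extraction step is identical.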

Keywords: email security, artificial intelligence, header analysis, threat detection, phishing, DMARC, DKIM, SPF, AI model

Procedia PDF Downloads 43
686 Deployment of Beyond 4G Wireless Communication Networks with Carrier Aggregation

Authors: Bahram Khan, Anderson Rocha Ramos, Rui R. Paulo, Fernando J. Velez

Abstract:

With the growing demand for a new blend of applications, users' dependency on the internet is increasing day by day. Mobile internet users are paying more attention to their own experience, especially in terms of communication reliability, high data rates and service stability on the move. This increase in demand is causing saturation of the existing radio frequency bands. To address these challenges, researchers are investigating the best approaches; Carrier Aggregation (CA) is one of the newest innovations, which seems to fulfill the demands of the future spectrum, and CA is also one of the most important features of Long Term Evolution-Advanced (LTE-Advanced). To meet the International Mobile Telecommunication-Advanced (IMT-Advanced) requirement of a 1 Gb/s peak data rate, the CA scheme was introduced by 3GPP to sustain high data rates over an aggregated bandwidth of up to 100 MHz. Technical issues such as the aggregation structure, its implementation, deployment scenarios, control signalling techniques, and challenges for the CA technique in LTE-Advanced, with consideration of backward compatibility, are highlighted in this paper. A performance evaluation in macro-cellular scenarios through a simulation approach is also presented, which shows the benefits of applying CA and low-complexity multi-band schedulers to service quality and system capacity enhancement, and concludes that the enhanced multi-band scheduler is less complex than the general multi-band scheduler and performs better for cell radii longer than 1800 m (with a PLR threshold of 2%).
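The Rel-10 aggregation limits mentioned above (up to five component carriers, 100 MHz total, each CC at most 20 MHz wide) can be captured in a few lines; a minimal sketch:

```python
def aggregated_bandwidth(carriers_mhz):
    """Total aggregated bandwidth for a list of component-carrier
    widths in MHz, enforcing the LTE-Advanced Rel-10 limits of at
    most five CCs and 100 MHz in total."""
    if len(carriers_mhz) > 5:
        raise ValueError("LTE-Advanced allows at most 5 component carriers")
    total = sum(carriers_mhz)
    if total > 100:
        raise ValueError("aggregated bandwidth may not exceed 100 MHz")
    return total

# Five 20 MHz component carriers reach the 100 MHz maximum
bw = aggregated_bandwidth([20, 20, 20, 20, 20])
```

Because each legacy LTE carrier is at most 20 MHz wide, the five-CC configuration shown is exactly how the 100 MHz figure in the abstract is reached while keeping backward compatibility per carrier.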

Keywords: component carrier, carrier aggregation, LTE-advanced, scheduling

Procedia PDF Downloads 193