Search results for: Russian intelligence
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 1709

539 The Development of Monk’s Food Bowl Production on Occupational Health Safety and Environment at Work for the Strength of Rattanakosin Local Wisdom

Authors: Thammarak Srimarut, Witthaya Mekhum

Abstract:

This study analysed and developed a model for monk’s food bowl production with regard to occupational health, safety, and the work environment, in order to strengthen Rattanakosin local wisdom in the Banbart Community. The blowpipe welding required to produce the bowl was very dangerous, with a risk level of 93.59%. After a new sitting posture was adopted, the work risk fell to 48.41%, a moderate level. In detail, it was found that: 1) the traditional sitting posture carried a work risk of 88.89%, while the new sitting posture reduced it to 58.86%; 2) regarding environmental pollution, workers using the traditional sitting posture were exposed to polluted welding fumes at a level of 61.11%, versus 40.47% with the new posture; and 3) regarding accident risk, workers using the traditional sitting posture were exposed to welding accidents at a level of 94.44%, versus 62.54% with the new posture.

Keywords: occupational health safety, environment at work, Monk’s food bowl, machine intelligence

Procedia PDF Downloads 421
538 Features of Composites Application in Shipbuilding

Authors: Valerii Levshakov, Olga Fedorova

Abstract:

Ship structures made from composites have specific features: simultaneous shaping of material and structure, large size, complicated outlines, and tapered thickness. These features give the leading role to technology that integrates test results from materials science, design, and structural analysis. The main procedures of composite shipbuilding are contact molding, vacuum molding, and winding. Currently, the most widely used composite shipbuilding technology is the manufacture of structures from fiberglass and multilayer hybrid composites by vacuum molding. Compared with contact molding, this technology yields products with improved strength properties, shortens production time, reduces weight, and provides better environmental conditions in the production area. Mechanized winding is applied to manufacture parts shaped as bodies of revolution, such as sections of ship, oil, and other pipelines, hulls of deep-submergence vehicles, bottles, reservoirs, and other structures. This procedure processes reinforcing fiberglass, carbon, and polyaramide fibers. Polyaramide fiber has a tensile strength of 5000 MPa and an elastic modulus of about 130 GPa; its rigidity is comparable with that of fiberglass, but its weight is 30% lower. This makes it possible to manufacture various structures, including ones combining fiberglass and organic composites. Organic composites are widely used to manufacture parts subject to size and weight limitations, although the high price of polyaramide fiber restricts their use. A promising direction for winding technology is the manufacture of carbon fiber shafts and couplings for ships. JSC ‘Shipbuilding & Shiprepair Technology Center’ (JSC SSTC) has developed a technology for dielectric uncouplers for cryogenic lines cooled by gaseous or liquid cryogenic agents (helium, nitrogen, etc.) for a temperature range of 4.2-300 K and pressures up to 30 MPa; these are used to separate components of electrophysical equipment held at different electrical potentials. The dielectric uncouplers were developed, manufactured, and tested in accordance with the International Thermonuclear Experimental Reactor (ITER) technical specification. Spiral uncouplers withstand an operating voltage of 30 kV, and direct-flow uncouplers 4 kV. Using a spiral channel instead of a rectilinear one increases the breakdown potential and reduces uncoupler size. Ninety-five uncouplers were successfully manufactured and tested. At present, Russian manufacturers of ship composite structures have begun to adopt automated prepreg laminating, a technology that enables the manufacture of structures with improved operational characteristics.

Keywords: fiberglass, infusion, polymeric composites, winding

Procedia PDF Downloads 218
537 Differentiated Surgical Treatment of Patients With Nontraumatic Intracerebral Hematomas

Authors: Mansur Agzamov, Valery Bersnev, Natalia Ivanova, Istam Agzamov, Timur Khayrullaev, Yulduz Agzamova

Abstract:

Objectives: Treatment of hypertensive intracerebral hematoma (ICH) is controversial, and the advantage of one surgical method over another has not been established. Recent reports suggest a favorable effect of minimally invasive surgery. We conducted a small comparative study of different surgical methods. Methods: We analyzed the results of surgical treatment of 176 patients with intracerebral hematomas aged 41 to 78 years; 113 (64.2%) were men and 63 (35.8%) were women. Level of consciousness: conscious, 18; lethargy, 63; stupor, 55; moderate coma, 40. All patients underwent computed tomography (CT) of the brain on admission and during follow-up. The ICH was located in the putamen in 87 cases, the thalamus in 19, the mixed area in 50, and the lobar area in 20. Ninety-seven patients had an intraventricular hemorrhage component. The baseline ICH volume was measured according to a bedside method for measuring hematoma volume on CT. Depending on the intervention, the patients were divided into three groups. Group 1 (90 patients) underwent open craniotomy. Level of consciousness: conscious, 11; lethargy, 33; stupor, 18; moderate coma, 18. The hemorrhage was located in the putamen in 51, the thalamus in 3, the mixed area in 25, and the lobar area in 11. Group 2 (22 patients) underwent a smaller craniotomy with endoscopic-assisted evacuation. Level of consciousness: conscious, 4; lethargy, 9; stupor, 5; moderate coma, 4. The hemorrhage was located in the putamen in 5, the thalamus in 15, and the mixed area in 2. Group 3 (64 patients) underwent minimally invasive removal of the hematoma using an original device (patent of the Russian Federation No. 65382), a funnel cannula introduced into the hematoma cavity after special marking. Level of consciousness: conscious, 3; lethargy, 21; stupor, 22; moderate coma, 18. The hemorrhage was located in the putamen in 31, the mixed area in 23, the thalamus in 1, and the lobar area in 9. Treatment results were evaluated with the Glasgow Outcome Scale. Results: The results of surgical treatment in the three groups depended on the level of consciousness and on the volume and localization of the hematoma. In group 1, good recovery was observed in 8 cases (8.9%), moderate disability in 22 (24.4%), severe disability in 17 (18.9%), and death in 43 (47.8%). In group 2, good recovery was observed in 7 cases (31.8%), moderate disability in 7 (31.8%), severe disability in 5 (22.7%), and death in 7 (31.8%). In group 3, good recovery was observed in 9 cases (14.1%), moderate disability in 17 (26.5%), severe disability in 19 (29.7%), and death in 19 (29.7%). Conclusions: The funnel cannula method allowed open craniotomy to be avoided in the majority of patients with putaminal hematomas. The minimally invasive technique reduced postoperative mortality and improved treatment outcomes in these patients.
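
The bedside method of measuring ICH volume on CT referenced above is most commonly the ABC/2 formula; a minimal sketch of that calculation (function name hypothetical, not taken from the paper):

```python
def abc2_volume(a_cm, b_cm, c_cm):
    """Estimate hematoma volume in mL with the bedside ABC/2 formula.

    a_cm: greatest hemorrhage diameter on the axial CT slice (cm)
    b_cm: diameter perpendicular to A on the same slice (cm)
    c_cm: vertical extent, i.e. slice count times slice thickness (cm)
    """
    return (a_cm * b_cm * c_cm) / 2.0
```

For example, a 4 x 3 cm hemorrhage spanning 3 cm of slices estimates to 18 mL.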

Keywords: nontraumatic intracerebral hematoma, minimally invasive surgical technique, funnel cannula, differentiated surgical treatment

Procedia PDF Downloads 63
536 Shaping Lexical Concept of 'Mage' through Image Schemas in Dragon Age 'Origins'

Authors: Dean Raiyasmi, Elvi Citraresmana, Sutiono Mahdi

Abstract:

Language shapes the human mind and its concepts of things. With today’s technology, even an AI (artificial intelligence) can, through image schemas, conceptualize things in ways that reflect its creator’s negative or positive attitudes. This is reflected in Dragon Age: Origins, one of the best-selling games in the world in 2012. The AI, in the form of NPCs (non-playable characters) in the game, reflects the game creators’ negative or positive attitudes toward the lexical concept of ‘mage’. Through image schemas, shaping the lexical concept of ‘mage’ proved possible and revealed the creators’ negativity or positivity toward mages. This research analyses the cognitive-semantic process by which image schemas shape the concept of ‘mage’, describing the kinds of image schemas that exist in the Dragon Age: Origins game. It also aims to identify which image schemas shape the concept of ‘mage’ itself. The methodology is qualitative: participative observation in five stages, together with documentation, was employed. The results show that four image schemas exist in the game and that these image schemas shape the lexical concept of ‘mage’.

Keywords: cognitive semantic, image-schema, conceptual metaphor, video game

Procedia PDF Downloads 417
535 The Design of Intelligent Passenger Organization System for Metro Stations Based on Anylogic

Authors: Cheng Zeng, Xia Luo

Abstract:

Passenger organization has always been an essential part of metro operation and management in China. Facing massive passenger flows, stations need to improve their degree of intelligence and automation through an appropriate integrated system. Based on the existing integrated supervisory control system (ISCS) and the simulation software AnyLogic, this paper designs an intelligent passenger organization system (IPOS) for metro stations. Its primary functions include passenger information acquisition, data processing and computing, visualization management, decision recommendations, and decision response based on interlocking equipment. For this purpose, the logical structure and the intelligent algorithms employed are devised in detail. In addition, this research presents the structure diagram of the information acquisition and application module, the application of AnyLogic, and the functional process of the case library. Based on secondary development of AnyLogic and existing technologies such as video recognition, the IPOS is expected to improve the response speed and handling capacity of metro stations in the face of sudden surges in passenger flow.

Keywords: anylogic software, decision-making support system, intellectualization, ISCS, passenger organization

Procedia PDF Downloads 152
534 A Comparative Analysis of Grade Weighted Average and Comprehensive Examination Result of Non Board Passers and Board Passers

Authors: Rob Gesley Capistrano, Jasper James Isaac, Rose Mae Moralda, Therese Anne Peleo, Danica Rillo, Maria Virginia Santillian

Abstract:

One of the indicators of intelligence among individuals is academic background, specifically the Grade Weighted Average and the result of the Comprehensive Examination. The general objective of this study is to determine whether there is a significant difference between the General Weighted Average and the Comprehensive Examination results of Psychometrician board passers and non-board passers. The respondents were board passers and non-board passers, selected through purposive sampling. An independent-samples t-test was used to compare the General Weighted Average and Comprehensive Examination results of the two groups. The study concluded that there is no significant difference in General Weighted Average between board passers and non-board passers, although the averages showed minimal variation. The Comprehensive Examination results, in contrast, revealed a significant difference: in an examination that tests an individual’s overall knowledge, those who are more proficient are likely to score higher. The comprehensive examination result had an impact on passing performance in the board examination.

Keywords: board passers, comprehensive examination result, grade weighted average, non board passers

Procedia PDF Downloads 160
533 Restructuration of the Concept of Empire in the Social Consciousness of Modern Americans

Authors: Maxim Kravchenko

Abstract:

The paper looks into the structure and contents of the concept of empire in the social consciousness of modern Americans. To construct a model of this socially and politically relevant concept, we conducted an experiment with respondents born and living in the USA. Empire is usually seen as a historical notion describing such entities as the British Empire, the Russian Empire, the Ottoman Empire, and others, and the democratic regime adopted by most countries worldwide seems incompatible with imperial status. Yet there are countries that tend to dominate the contemporary world and, though not routinely referred to as empires, are in many respects reminiscent of historical empires. Thus, the central hypothesis of the study is that the concept of empire is cultivated in some states through the mass media, though it undergoes a certain transformation to meet the expectations of a democratic society. The transformation implies that certain components historically embedded in the structure of the concept are drawn to the margins of its hierarchical structure, whereas other components become central. This process can be referred to as restructuration of the concept of empire. To verify this hypothesis, we conducted a study in two stages. First, we examined the definitions of empire featured in dictionaries; the dominant conceptual components are importance, territory/lands, recognition, independence, authority/power, and supreme/absolute. However, an analysis of 100 randomly chosen articles from American newspapers revealed that authors rarely use the word 'empire' in its basic meaning (7%). More often, 'empire' is used when speaking about countries that no longer exist or about corporations (like Apple or Google). At the second stage of the study, we conducted an associative experiment with citizens of the USA aged 19 to 45. The purpose of the experiment was to find the dominant components of the concept of empire and to construct a model of the transformed concept. Respondents were asked to give the first association that crossed their mind on reading stimulus phrases such as "strong military" and "strong economy". The list of stimuli features various words and phrases associated with empire, including words representing the dominant components of the concept. The associations provided by the respondents were then classified into thematic clusters. For instance, the associations to the stimulus "strong military" fell into three groups: 1) a country with strong military forces (North Korea, the USA, Russia, China); 2) negative impressions of a strong military (war, anarchy, conflict); 3) positive impressions of a strong military (peace, safety, responsibility). The experimental findings suggest that the concept of empire is currently undergoing a transformation that brings about a number of changes: predominance of positively assessed components; the emergence of two poles in the structure of the concept, "hero" vs. "enemy"; and marginalization of negatively assessed components.

Keywords: associative experiment, conceptual components, empire, restructuration of the concept

Procedia PDF Downloads 292
532 An Analysis of Uncoupled Designs in Chicken Egg

Authors: Pratap Sriram Sundar, Chandan Chowdhury, Sagar Kamarthi

Abstract:

Nature has perfected her designs over 3.5 billion years of evolution. Research fields such as biomimicry, biomimetics, bionics, bio-inspired computing, and nature-inspired design have explored nature-made artifacts and systems to understand nature’s mechanisms and intelligence. Learning from nature, researchers have generated sustainable designs and innovations in a variety of fields such as energy, architecture, agriculture, transportation, communication, and medicine. Axiomatic design offers a method to judge whether a design is good. This paper analyzes the design aspects of one of nature’s amazing objects: the chicken egg. The functional requirements (FRs) of the components of the egg are tabulated and mapped onto nature-chosen design parameters (DPs). The independence axiom of the axiomatic design methodology is applied to analyze couplings and to evaluate whether the egg’s design is good (i.e., uncoupled) or bad (i.e., coupled). The analysis revealed that the egg’s design is a good, uncoupled design. This approach can be applied to any of nature’s artifacts to judge whether its design is good or bad, which makes the methodology valuable for biomimicry studies. It can also be useful in teaching design considerations in biology and bio-inspired innovation.
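
The independence-axiom check described above can be sketched mechanically: given an FR-DP design matrix, a diagonal matrix means an uncoupled design, a triangular one a decoupled design, and anything else a coupled design. A minimal illustration (function and representation are ours, not the paper's):

```python
def classify_design(matrix):
    """Classify a square FR-DP design matrix per the independence axiom.

    matrix[i][j] is truthy when functional requirement FR_i depends on
    design parameter DP_j. Returns 'uncoupled' for a diagonal matrix,
    'decoupled' for a triangular one, and 'coupled' otherwise.
    """
    n = len(matrix)
    off_diagonal = [(i, j) for i in range(n) for j in range(n)
                    if i != j and matrix[i][j]]
    if not off_diagonal:
        return "uncoupled"
    if all(i > j for i, j in off_diagonal) or all(i < j for i, j in off_diagonal):
        return "decoupled"
    return "coupled"
```

For example, a 2x2 identity matrix classifies as uncoupled, matching the paper's verdict on the egg's FR-DP mapping.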

Keywords: uncoupled design, axiomatic design, nature design, design evaluation

Procedia PDF Downloads 153
531 The Solid-Phase Sensor Systems for Fluorescent and SERS-Recognition of Neurotransmitters for Their Visualization and Determination in Biomaterials

Authors: Irina Veselova, Maria Makedonskaya, Olga Eremina, Alexandr Sidorov, Eugene Goodilin, Tatyana Shekhovtsova

Abstract:

Catecholamines such as dopamine, norepinephrine, and epinephrine are the principal neurotransmitters in the sympathetic nervous system. Catecholamines and their metabolites are considered important markers of socially significant diseases such as atherosclerosis, diabetes, coronary heart disease, carcinogenesis, and Alzheimer's and Parkinson's diseases. Currently, neurotransmitters can be studied via electrochemical and chromatographic techniques that allow their characterization and quantification, although these techniques provide only crude spatial information. The difficulty of catecholamine determination in biological materials is also associated with their low normal concentrations (~1 nM), which may drop by a further order of magnitude in some disorders. In addition, in blood they are rapidly oxidized by monoamine oxidases from thrombocytes; for this reason, the determination of neurotransmitter metabolism indicators in an organism should be very rapid (15-30 min), especially in critical states. Unfortunately, modern instrumental analysis does not offer a complete solution to this problem: despite its high sensitivity and selectivity, HPLC-MS cannot provide sufficiently rapid analysis, while enzymatic biosensors and immunoassays for these analytes lack sufficient sensitivity and reproducibility. Fluorescent and SERS sensors remain a compelling technology for the general problem of selective neurotransmitter detection. In recent years, a number of catecholamine sensors have been reported, including RNA aptamers, fluorescent ribonucleopeptide (RNP) complexes, and boronic acid based synthetic receptors operating in a turn-off mode. In this work, we present fluorescent and SERS turn-on sensor systems based on bio- or chemo-recognizing nanostructured films {chitosan/collagen-Tb/Eu/Cu-nanoparticles-indicator reagents} that provide selective recognition, visualization, and sensing of the above-mentioned catecholamines at nanomolar concentrations in biomaterials (cell cultures, tissues, etc.). We have (1) developed optically transparent porous films and gels of chitosan/collagen; (2) functionalized the surface with recognizing molecules by impregnation and immobilization of the components of the indicator systems (biorecognizing and auxiliary reagents); and (3) performed computer simulations for theoretical prediction and interpretation of some properties of the developed materials and of the analytical signals obtained in biomaterials. We are grateful for the financial support of this research from the Russian Foundation for Basic Research (grants no. 15-03-05064 a and 15-29-01330 ofi_m).

Keywords: biomaterials, fluorescent and SERS-recognition, neurotransmitters, solid-phase turn-on sensor system

Procedia PDF Downloads 383
530 A Two-Stage Airport Ground Movement Speed Profile Design Methodology Using Particle Swarm Optimization

Authors: Zhang Tianci, Ding Meng, Zuo Hongfu, Zeng Lina, Sun Zejun

Abstract:

Automation of airport operations can greatly improve ground movement efficiency. In this paper, we study the speed profile design problem for advanced airport ground movement control and guidance. The problem is constrained by the surface four-dimensional trajectory generated in taxi planning. A decomposed, two-stage approach is presented to solve this problem efficiently. In the first stage, speeds are allocated at control points in a way that ensures smooth speed profiles can be found later. In the second stage, detailed speed profiles for each taxi interval are generated according to the allocated control point speeds, with the objective of minimizing overall fuel consumption. We present a swarm intelligence based algorithm for the first-stage problem and an enumeration method for the second-stage problem, which involves only a small set of discrete variables. Experimental results demonstrate that the presented methodology performs well on real-world speed profile design problems.
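
The first-stage swarm intelligence algorithm is a particle swarm optimizer. A generic global-best PSO sketch is shown below; it is not the authors' implementation, and the parameter values and objective are illustrative placeholders for the fuel-consumption model:

```python
import random

def pso(objective, dim, n_particles=20, iters=100, bounds=(0.0, 1.0)):
    """Minimal global-best particle swarm optimizer (minimization).

    Each particle keeps a position (a candidate control-point speed
    allocation here), a velocity, and its personal best; the swarm
    shares a global best that attracts all particles.
    """
    lo, hi = bounds
    w, c1, c2 = 0.7, 1.5, 1.5  # inertia, cognitive, and social weights
    pos = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_f = [objective(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_f[i])
    gbest, gbest_f = pbest[g][:], pbest_f[g]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                vel[i][d] = (w * vel[i][d]
                             + c1 * random.random() * (pbest[i][d] - pos[i][d])
                             + c2 * random.random() * (gbest[d] - pos[i][d]))
                pos[i][d] = min(hi, max(lo, pos[i][d] + vel[i][d]))
            f = objective(pos[i])
            if f < pbest_f[i]:
                pbest[i], pbest_f[i] = pos[i][:], f
                if f < gbest_f:
                    gbest, gbest_f = pos[i][:], f
    return gbest, gbest_f
```

In the paper's setting, `objective` would score an allocation of control-point speeds by the fuel consumption of the smooth profiles it permits.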

Keywords: airport ground movement, fuel consumption, particle swarm optimization, smoothness, speed profile design

Procedia PDF Downloads 561
529 Modern Proteomics and the Application of Machine Learning Analyses in Proteomic Studies of Chronic Kidney Disease of Unknown Etiology

Authors: Dulanjali Ranasinghe, Isuru Supasan, Kaushalya Premachandra, Ranjan Dissanayake, Ajith Rajapaksha, Eustace Fernando

Abstract:

Proteomics studies of organisms are considered significantly more information-rich than their genomic counterparts because the proteome represents the expressed state of all proteins of an organism at a given time. In modern top-down and bottom-up proteomics workflows, the primary analysis methods are gel-based methods such as two-dimensional (2D) electrophoresis, and mass spectrometry based methods. Machine learning (ML) and artificial intelligence (AI) have been used increasingly in modern biological data analyses. In particular, the fields of genomics, DNA sequencing, and bioinformatics have seen a growing trend in the usage of ML and AI techniques in recent years. The use of these techniques in proteomics studies is only now beginning to materialise. Although there is a wealth of information in the scientific literature pertaining to proteomics workflows, no comprehensive review addresses the various aspects of the combined use of proteomics and machine learning. The objective of this review is to provide a comprehensive outlook on the application of machine learning to known proteomics workflows in order to extract more meaningful information, which could be useful in a plethora of applications such as medicine, agriculture, and biotechnology.

Keywords: proteomics, machine learning, gel-based proteomics, mass spectrometry

Procedia PDF Downloads 134
528 Investigating Best Strategies Towards Creating Alternative Assessment in Literature

Authors: Sandhya Rao Mehta

Abstract:

As ChatGPT and other forms of artificial intelligence (AI) become part of our regular academic world, the consequences are gradually being discussed. Central to this discourse is the question of how much value an essay submitted by a student has if it was in fact produced by some form of AI. A larger question is whether writing should be taught as an academic skill at all. In literature classrooms, this has major consequences, as the traditional paper is still the single most preferred form of assessment. This study suggests that it is imperative to investigate alternative forms of assessment in literature, not only because the existing forms can be written by AI but also because students are increasingly skeptical of the purpose of such work. The extent to which an essay actually helps students professionally is a question academia has not yet answered. This paper suggests that real-world tasks such as creating podcasts, video tutorials, and websites are a far better way to evaluate students' critical thinking and application of ideas, and that they also develop digital skills important to students' future careers. Using the example of a course in literature, this study examines the possibilities and challenges of digital projects as a way of confronting the complexities of student evaluation in the future. The study is based in a specific university English as a Foreign Language (EFL) context.

Keywords: assessment, literature, digital humanities, chatgpt

Procedia PDF Downloads 66
527 Low Probability of Intercept (LPI) Signal Detection and Analysis Using Choi-Williams Distribution

Authors: V. S. S. Kumar, V. Ramya

Abstract:

In modern electronic warfare, the signal environment is changing at a rapid pace with the introduction of Low Probability of Intercept (LPI) radars. On the modern battlefield, radar systems face serious threats from passive intercept receivers supporting Electronic Attack (EA) and from Anti-Radiation Missiles (ARMs). To perform the necessary target detection and tracking while hiding from enemy attack, radar systems should be LPI. LPI radars use a variety of complex signal modulation schemes together with pulse compression, aided by advances in radar signal processing, so that the radar detects and tracks targets while remaining hidden from attacks such as EA, thus posing a major challenge to ES/ELINT receivers. An increasing number of LPI radars are being introduced into modern platforms and weapon systems, creating a requirement for armed forces to develop new techniques, strategies, and equipment to counter them. This paper presents the various modulation techniques used to generate LPI signals and the development of time-frequency algorithms to analyse those signals.

Keywords: anti-radiation missiles, cross terms, electronic attack, electronic intelligence, electronic warfare, intercept receiver, low probability of intercept

Procedia PDF Downloads 437
526 A Method for False Alarm Recognition Based on Multi-Classification Support Vector Machine

Authors: Weiwei Cui, Dejian Lin, Leigang Zhang, Yao Wang, Zheng Sun, Lianfeng Li

Abstract:

Built-in test (BIT) is an important technology in the testability field and is widely used in state monitoring and fault diagnosis. With the increasing performance and complexity of modern equipment, the scope of BIT has grown, leading to the problem of false alarms. False alarms make health assessment unstable and reduce the effectiveness of BIT. Conventional false alarm suppression methods such as repeated testing and majority voting cannot meet the requirements of a complicated system, so intelligent algorithms such as artificial neural networks (ANNs) have been widely studied and used. However, false alarms occur at a very low frequency and yield only small samples, while ANN-based methods require large training sets. To recognize false alarms, we propose a method based on a multi-classification support vector machine (SVM). First, we divide the state of a system into three classes: healthy, false-alarm, and faulty. Then we use multi-classification with a '1 vs 1' policy to train a classifier and recognize the state of the system. Finally, a fault injection system is used as an example to verify the effectiveness of the proposed method in comparison with an ANN. The results show that the method is reasonable and effective.
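
The '1 vs 1' policy trains one binary classifier per pair of classes and combines their decisions by majority voting. A minimal sketch of the voting step, with toy threshold classifiers standing in for the three trained pairwise SVMs (all names and thresholds invented for illustration):

```python
def ovo_predict(sample, pairwise_clfs):
    """Aggregate '1 vs 1' pairwise decisions by majority voting.

    pairwise_clfs maps a class pair (a, b) to a binary classifier that
    returns either a or b for the sample. Ties are broken arbitrarily
    in this sketch.
    """
    votes = {}
    for pair, clf in pairwise_clfs.items():
        winner = clf(sample)
        votes[winner] = votes.get(winner, 0) + 1
    return max(votes, key=votes.get)


# Toy stand-ins for the three trained pairwise SVMs over the paper's
# healthy / false-alarm / faulty states (thresholds are illustrative):
clfs = {
    ("healthy", "false-alarm"): lambda x: "healthy" if x < 3 else "false-alarm",
    ("healthy", "faulty"): lambda x: "healthy" if x < 5 else "faulty",
    ("false-alarm", "faulty"): lambda x: "false-alarm" if x < 5 else "faulty",
}
```

With these stand-ins, a low test statistic votes healthy, a middling one false-alarm, and a high one faulty.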

Keywords: false alarm, fault diagnosis, SVM, k-means, BIT

Procedia PDF Downloads 136
525 Framework for Integrating Big Data and Thick Data: Understanding Customers Better

Authors: Nikita Valluri, Vatcharaporn Esichaikul

Abstract:

With the popularity of data-driven decision making on the rise, this study provides an alternative outlook on the decision-making process. Combining quantitative and qualitative methods rooted in the social sciences, an integrated framework is presented that delivers a more robust and efficient approach to data-driven decision-making with respect not only to Big Data but also to 'Thick Data', a form of qualitative data. In support of this, an example from the retail sector illustrates the framework in action, yielding insights and leveraging business intelligence. An interpretive approach is used to glean insights from both the quantitative and the qualitative findings. Using traditional point-of-sale data as well as an understanding of customer psychographics and preferences, data mining techniques are applied alongside qualitative methods (such as grounded theory and ethnomethodology). The final goal of this study is to establish the framework as the basis for a holistic solution encompassing both the Big and Thick aspects of any business need. The proposed framework enhances the traditional data-driven decision-making approach, which depends mainly on quantitative data.

Keywords: big data, customer behavior, customer experience, data mining, qualitative methods, quantitative methods, thick data

Procedia PDF Downloads 138
524 Artificial Intelligent Tax Simulator to Minimize Tax Liability for Multinational Corporations

Authors: Sean Goltz, Michael Mayo

Abstract:

The purpose of this research is to use the Global-Regulation.com database of the world's laws, focusing on tax treaties between countries, to create an AI-driven tax simulator that runs an AI agent through potential tax scenarios across countries. The agent's goal is to identify the scenario that results in the minimum tax liability under the tax treaties between countries. The results will be visualized as a three-dimensional matrix in an online web application. Multinational corporations run their business through multiple countries, and these countries in turn have tax treaties with many other countries to regulate the taxation of income transferred between them. As a result, planning the best tax scenario across multiple countries and numerous tax treaties is almost impossible by hand. This research proposes to use the Global-Regulation.com database of world laws in English (machine translated via the Google and Microsoft APIs) to build a simulator that encodes the information in the tax treaties. Once it is ready, an AI agent will be sent through the simulator to identify the scenario resulting in the minimum tax liability. Identifying the best tax scenario across countries may save multinational corporations, like Google, billions of dollars annually. Given the nature of the raw data and the numerical domain of taxation, this is promising ground for employing artificial intelligence toward a practical and beneficial purpose.
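
At its core, the scenario search described above picks the chain of countries whose treaty withholding rates leave the most income intact. A brute-force sketch of that search (all country names and rates below are invented for illustration; real treaty data would come from the simulator):

```python
from itertools import permutations

def cheapest_route(source, dest, intermediaries, rate):
    """Find the transfer chain with the lowest total withholding tax.

    rate(a, b) returns the hypothetical treaty withholding rate (as a
    fraction) on income transferred from country a to country b. All
    orderings of any subset of intermediaries are enumerated, so this
    only scales to small country sets.
    """
    best_chain, best_kept = None, -1.0
    for k in range(len(intermediaries) + 1):
        for mids in permutations(intermediaries, k):
            chain = (source,) + mids + (dest,)
            kept = 1.0  # fraction of income surviving the whole chain
            for a, b in zip(chain, chain[1:]):
                kept *= 1.0 - rate(a, b)
            if kept > best_kept:
                best_chain, best_kept = chain, kept
    return best_chain, 1.0 - best_kept  # chain and its effective tax rate
```

For instance, if the direct A-to-D rate is 30% but routing through B costs 5% on each leg, the search returns the A-B-D chain with an effective rate of 9.75%.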

Keywords: taxation, law, multinational, corporation

Procedia PDF Downloads 177
523 Designing a Patient Monitoring System Using Cloud and Semantic Web Technologies

Authors: Chryssa Thermolia, Ekaterini S. Bei, Stelios Sotiriadis, Kostas Stravoskoufos, Euripides G. M. Petrakis

Abstract:

Moving into a new era of healthcare, new tools and devices are being developed to extend and improve health services, such as remote patient monitoring and risk prevention. In this context, the Internet of Things (IoT) and Cloud Computing present great advantages by providing remote and efficient services, as well as cooperation between patients, clinicians, researchers and other health professionals. This paper focuses on patients suffering from bipolar disorder, a brain disorder that belongs to a group of conditions called affective disorders, characterized by extreme mood swings. We exploit the advantages of Semantic Web and Cloud technologies to develop a patient monitoring system that supports clinicians. Based on intelligent filtering of evidence-based knowledge and individual-specific information, we aim to provide treatment notifications and recommend function tests at appropriate times, or to issue alerts for serious mood changes and a patient's non-response to treatment. We propose an architecture, as the back-end part of a cloud platform for IoT, intertwining intelligent devices with patients' daily routine and clinicians' support.
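In a much-reduced form, the alerting layer the abstract describes could be a rule pass over a patient's daily mood scores. The score scale, thresholds and alert names below are invented for illustration, not clinical values:

```python
def mood_alerts(daily_scores, swing_threshold=4, flat_days=14):
    """Scan a patient's daily mood scores (here an invented -5 depressed
    .. +5 manic scale) and flag serious swings and possible non-response.
    A toy stand-in for the paper's rule layer; all thresholds are invented."""
    alerts = []
    # Serious mood swing: a large day-to-day jump in either direction.
    for i in range(1, len(daily_scores)):
        if abs(daily_scores[i] - daily_scores[i - 1]) >= swing_threshold:
            alerts.append(("serious_mood_swing", i))
    # Possible non-response: scores stuck at a depressive level for flat_days.
    run = 0
    for i, s in enumerate(daily_scores):
        run = run + 1 if s <= -3 else 0
        if run == flat_days:
            alerts.append(("possible_non_response", i))
    return alerts
```

In the proposed architecture such rules would instead be expressed over semantic representations of the patient record, so that clinical evidence and individual data can be filtered together.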

Keywords: bipolar disorder, intelligent systems, patient monitoring, semantic web technologies, healthcare

Procedia PDF Downloads 487
522 Convolutional Neural Network and LSTM Applied to Abnormal Behaviour Detection from Highway Footage

Authors: Rafael Marinho de Andrade, Elcio Hideti Shiguemori, Rafael Duarte Coelho dos Santos

Abstract:

Relying on computer vision, much can be done to make the world safer and to optimize resource management, especially of time and attention as manageable resources, given that the modern world abounds in cameras, from those inside our pockets to those above our heads as we cross the streets. Automated solutions based on computer vision techniques to detect, react to, or even prevent relevant events such as robbery, car crashes and traffic jams can therefore be implemented for the sake of both logistical and surveillance improvements. In this paper, we present an approach for detecting vehicles' abnormal behaviors from highway footage, in which vectorial data on each vehicle's displacement are extracted directly from surveillance camera footage through object detection and tracking with a deep convolutional neural network, then fed into a long short-term memory neural network for behavior classification. The results show that the behavior classifications are consistent, and the same principles may be applied to other trackable objects and scenarios as well.
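The pipeline's first stage, turning tracked positions into displacement vectors, can be sketched as follows. The classifier here is a toy heuristic standing in for the paper's trained LSTM, and its labels and thresholds are invented:

```python
import math

def displacement_vectors(track):
    """Frame-to-frame displacement vectors from a tracked vehicle's (x, y)
    positions, the kind of sequence the paper feeds to its LSTM."""
    return [(x2 - x1, y2 - y1) for (x1, y1), (x2, y2) in zip(track, track[1:])]

def classify(track, lane_direction=(1.0, 0.0), min_speed=0.5):
    """Toy heuristic in place of the trained LSTM (labels invented):
    flag vehicles stopped on the highway or moving against lane_direction."""
    vecs = displacement_vectors(track)
    if not vecs:
        return "unknown"
    avg = (sum(v[0] for v in vecs) / len(vecs),
           sum(v[1] for v in vecs) / len(vecs))
    speed = math.hypot(*avg)
    if speed < min_speed:
        return "stopped"
    dot = avg[0] * lane_direction[0] + avg[1] * lane_direction[1]
    return "wrong_way" if dot < 0 else "normal"
```

The real system replaces the heuristic with a learned sequence model, but the input representation is the same: per-vehicle displacement sequences rather than raw pixels.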

Keywords: artificial intelligence, behavior detection, computer vision, convolutional neural networks, LSTM, highway footage

Procedia PDF Downloads 141
521 Linking Excellence in Biomedical Knowledge and Computational Intelligence Research for Personalized Management of Cardiovascular Diseases within Personal Health Care

Authors: T. Rocha, P. Carvalho, S. Paredes, J. Henriques, A. Bianchi, V. Traver, A. Martinez

Abstract:

The main goal of the LiNK project is to join competences in intelligent processing in order to create a research ecosystem addressing two central scientific and technical challenges for personal health care (PHC) deployment: i) how to merge clinical evidence knowledge into computational decision support systems for PHC management, and ii) how to achieve personalized services, i.e., solutions adapted to specific user needs and characteristics. The final goal of one of the work packages (WP2), designated Sustainable Linking and Synergies for Excellence, is the definition, implementation and coordination of the activities necessary to create and strengthen durable links between the LiNK partners. This work focuses on the strategy followed to define the Research Tracks (RTs), which will support a set of actions to be pursued over the course of the LiNK project. These include common research activities, knowledge transfer among the researchers of the consortium, and co-advisement of PhD students and post-docs. Moreover, the RTs will establish the basis for the definition of concepts and their evolution into project proposals.

Keywords: LiNK Twin European Project, personal health care, cardiovascular diseases, research tracks

Procedia PDF Downloads 203
520 Energy Efficient Clustering with Adaptive Particle Swarm Optimization

Authors: Kumar Shashvat, Arshpreet Kaur, Rajesh Kumar, Raman Chadha

Abstract:

Wireless sensor networks have the principal characteristic of restricted energy, with the limitation that the energy of the nodes cannot be replenished. To increase network lifetime under this constraint, the WSN route for data transmission is chosen so that energy use along the selected route is minimal. Such an energy-efficient network needs a sound infrastructure, because the infrastructure directly affects network lifespan. Clustering is a technique in which nodes are grouped into disjoint, non-overlapping sets, with data collected at the cluster head. In this paper, an Adaptive-PSO algorithm is proposed which forms energy-aware clusters by minimizing the cost of locating the cluster head. The main concern is the suitability of the swarms, addressed by adjusting the learning parameters of PSO. Particle Swarm Optimization converges quickly at the beginning of the search, but over time it stabilizes and may become trapped in local optima. In the suggested network model, swarms are given spider-inspired intelligence, which makes them capable of avoiding premature convergence and also helps them escape from local optima. Comparative analysis with traditional PSO shows that the new algorithm considerably enhances performance when multi-dimensional functions are considered.
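A minimal, dependency-free sketch of the plain PSO baseline (without the proposed spider-inspired adaptation) applied to cluster-head placement might look like this. Using total squared node-to-head distance as the energy cost is an assumption for illustration, not the paper's cost function:

```python
import random

def pso_cluster_head(nodes, iters=300, swarm=20, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Plain PSO placing one cluster head so that total squared
    node-to-head distance, a rough radio-energy proxy, is minimized.
    The true optimum of this cost is the node centroid, which makes
    the result easy to check."""
    rng = random.Random(seed)
    cost = lambda p: sum((p[0] - x) ** 2 + (p[1] - y) ** 2 for x, y in nodes)
    pos = [[rng.uniform(0, 100), rng.uniform(0, 100)] for _ in range(swarm)]
    vel = [[0.0, 0.0] for _ in range(swarm)]
    pbest = [p[:] for p in pos]           # each particle's best position
    gbest = min(pbest, key=cost)[:]       # swarm's best position
    for _ in range(iters):
        for i in range(swarm):
            for d in range(2):
                r1, r2 = rng.random(), rng.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            if cost(pos[i]) < cost(pbest[i]):
                pbest[i] = pos[i][:]
                if cost(pbest[i]) < cost(gbest):
                    gbest = pbest[i][:]
    return gbest
```

The adaptive variant in the paper would additionally tune w, c1 and c2 during the run to delay the premature convergence this fixed-parameter version is prone to.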

Keywords: particle swarm optimization, adaptive PSO, comparison between PSO and A-PSO, energy efficient clustering

Procedia PDF Downloads 226
519 Survival Struggle: To Be a Female Competitor in Survivor

Authors: Gülbuğ Erol, Gamze Beyge, Hakan Ekemen

Abstract:

In Turkey, national TV channels broadcast a wide range of programs to attract viewers. Since 2000, competition programs in particular have been directed towards entertainment and have gained large audiences; today, entire channels are devoted to entertainment broadcasting. Apart from the news, the broadcasts of the TV 8 channel aim to entertain and to meet the expectations of its Turkish audience. Survivor, one of TV 8's programs, draws attention with the ratings it receives and the broad target audience it addresses. Survivor is also one of the most exciting and ambitious competition formats on the Turkish television scene. It is a format in which women and men test the limits of their strength, making their names by winning games of intelligence and endurance. The contestants of the program, which has been running since March 22, 2005, appear on a platform where they must present their struggle for various awards. In Survivor, where competition is at stake, courage and strength are framed in gendered terms. In this study, a critical discourse analysis was conducted of the challenges faced by female competitors, who reach the final stage behind their male counterparts. Secondly, the changes in the format's adaptation to Turkey, from its beginning to the present day, are debated in a critical context.

Keywords: television, meaning, discourse, contest program

Procedia PDF Downloads 207
518 Arc Plasma Application for Solid Waste Processing

Authors: Vladimir Messerle, Alfred Mosse, Alexandr Ustimenko, Oleg Lavrichshev

Abstract:

A hygiene and sanitary study of typical medical-biological waste produced in Kazakhstan, Russia, Belarus and other countries shows that its risk to the environment is much higher than that of most chemical wastes. For example, the toxicity of solid waste (SW) containing cytotoxic drugs and antibiotics is comparable to that of radioactive waste of high and medium activity levels. This report presents the results of a thermodynamic analysis of thermal SW processing and experiments at the plasma unit developed for SW processing. Thermodynamic calculations showed that the maximum yield of synthesis gas in plasma gasification of SW in air and steam media is achieved at a temperature of 1600 K. Air plasma gasification of SW can produce a high-calorific synthesis gas with a combustible concentration of 82.4% (CO – 31.7%, H2 – 50.7%), and steam plasma gasification a concentration of 94.5% (CO – 33.6%, H2 – 60.9%). The specific heat of combustion of the synthesis gas produced by air gasification amounts to 14267 kJ/kg, and by steam gasification to 19414 kJ/kg. At the optimal temperature (1600 K), the specific power consumption is 1.92 kWh/kg for air gasification of SW and 2.44 kWh/kg for steam gasification. The experimental study was carried out in a plasma reactor, a batch (periodic-action) device. An arc plasma torch of 70 kW electric power is used for SW processing. The SW feed rate was 30 kg/h and the flow of plasma-forming air 12 kg/h. Under the air plasma flame, the mass-averaged temperature in the chamber reaches 1800 K. Gaseous products are taken out of the reactor into the flue-gas cooling unit, while the condensed products accumulate in the slag-formation zone. The cooled gaseous products enter the gas purification unit and are then supplied to the analyzer via a gas sampling system. The ventilation system maintains a negative pressure in the reactor of up to 10 mm of water column.
The condensed products of SW processing are removed from the reactor after shutdown. From the experiments on SW plasma gasification, the reactor operating conditions were determined, the exhaust gas was analyzed, and the residual carbon content in the slag was measured. Gas analysis showed the following composition at the exit of the gas purification unit (vol.%): CO – 26.5, H2 – 44.6, N2 – 28.9. The total concentration of the syngas was 71.1%, which agrees well with the thermodynamic calculations; the discrepancy between experiment and calculation in the yield of the target syngas did not exceed 16%. The specific power consumption for SW gasification in the plasma reactor, according to the experiments, amounted to 2.25 kWh/kg of working substance. No harmful impurities were found in either the gas or the condensed products of SW plasma gasification. Comparison of experimental results and calculations showed good agreement. Acknowledgement: This work was supported by the Ministry of Education and Science of the Republic of Kazakhstan and the Ministry of Education and Science of the Russian Federation (Agreement on grant No. 14.607.21.0118, project RFMEF160715X0118).
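The quoted specific heats of combustion can be roughly reproduced from the reported gas compositions. The calculation below assumes the balance of each mixture is inert N2 and uses standard molar heats of combustion for CO and H2 (assumptions of this sketch; the paper does not state how its figures were computed):

```python
# Molar lower heating values (kJ/mol) and molar masses (g/mol).
LHV = {"CO": 283.0, "H2": 242.0}
M = {"CO": 28.0, "H2": 2.0, "N2": 28.0}

def syngas_lhv(co, h2):
    """Specific heat of combustion (kJ/kg) of a CO/H2 syngas, treating the
    balance of the volume as inert N2 (an assumption; the paper does not
    state the remainder's composition)."""
    n2 = 1.0 - co - h2
    energy = co * LHV["CO"] + h2 * LHV["H2"]                      # kJ per mol of mixture
    mass = (co * M["CO"] + h2 * M["H2"] + n2 * M["N2"]) / 1000.0  # kg per mol
    return energy / mass

air = syngas_lhv(0.317, 0.507)    # close to the 14267 kJ/kg quoted
steam = syngas_lhv(0.336, 0.609)  # within a few percent of the 19414 kJ/kg quoted
```

The air-gasification figure matches the paper to well under 1%, the steam figure to about 3%, consistent with the remainder of the steam-gasified mixture not being pure N2.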

Keywords: coal, efficiency, ignition, numerical modeling, plasma-fuel system, plasma generator

Procedia PDF Downloads 236
517 Upgrade of Value Chains and the Effect on Resilience of Russia’s Coal Industry and Receiving Regions on the Path of Energy Transition

Authors: Sergey Nikitenko, Vladimir Klishin, Yury Malakhov, Elena Goosen

Abstract:

Transition to renewable energy sources (solar, wind, bioenergy, etc.) and the launch of alternative energy generation have weakened the role of coal as a source of energy. The Paris Agreement and the assumption of obligations by many nations to reduce CO₂ emissions in an orderly way, by means of technological modernization and climate change adaptation, have reduced coal demand further. This paper aims to assess the current resilience of the coal industry to stress and to define prospects for optimizing coal production using high technologies, pursuant to global challenges and the requirements of the energy transition. Our research is based on the resilience concept adapted to the coal industry. It is proposed to divide the coal sector into segments depending on the prevailing value chains (VC). Four representative models of VC are identified in the coal sector. The most promising lines of upgrading VC in the coal industry include: • elongation of VC owing to the introduction of clean technologies of coal conversion and utilization; • creation of parallel VC by means of waste management; • branching of VC (conversion of a company's VC into a production network). The effectiveness of the upgrade is governed in many ways by the applicability of advanced coal processing technologies, the usability of waste, the expandability of production, entrance to non-rival markets and the localization of new segments of VC in receiving regions. It is also important that upgrading VC by forming agile, high-tech, inter-industry production networks within operating surface and underground mines can reduce the social, economic and ecological risks associated with the closure of coal mines. One promising route of VC upgrade is the application of methanotrophic bacteria to produce protein for use as feedstuff in fish, poultry and cattle breeding, or in the production of ferments, lipids, sterols, antioxidants, pigments and polysaccharides. Closed mines can use recovered methane as a clean energy source.
There exist methods of methane utilization from uncontrollable sources, including preliminary treatment and recovery of methane from the air-and-methane mixture, or decomposition of methane into hydrogen and acetylene. The separated hydrogen is used in hydrogen fuel cells to generate power that feeds the methane utilization process and supplies external consumers. Despite the recent paradigm of carbon-free energy generation, it is possible to preserve the coal mining industry using a differentiated approach to the upgrade of value chains, based on flexible technologies and with regard to the specificity of mining companies.

Keywords: resilience, resilience concept, resilience indicator, resilience in the Russian coal industry, value chains

Procedia PDF Downloads 87
516 The Impact of Artificial Intelligence on Textiles Technology

Authors: Ramy Kamel Fekrey Gadelrab

Abstract:

Textile sensors have gained a lot of interest in recent years, as they are instrumental in monitoring physiological and environmental changes for better diagnosis, which is useful in various fields such as medical textiles, sports textiles, protective textiles, agro-textiles and geo-textiles. Moreover, with the development of flexible textile-based wearable sensors, the functionality of smart clothing is augmented for an improved user experience in technical textiles. In this context, conductive textiles using new composites and nanomaterials are being developed with their compatibility with textile manufacturing processes in mind. This review aims to provide a comprehensive and detailed overview of contemporary advancements in textile-based wearable physical sensors used in the fields of medicine, security, surveillance and protection, from a global perspective. The methodology is to analyse various examples of the integration of wearable textile-based sensors into clothing for daily use, keeping in mind the technological advances in the area. Comparing various case studies, we come across challenges for textile sensors in terms of stability, comfort of movement, and reliable sensing components that enable accurate measurements, despite progress in wearable engineering. Addressing such concerns is critical for the future success of wearable sensors.

Keywords: nanoparticles, enzymes, immobilization, textiles, conductive yarn, e-textiles, smart textiles, thermal analysis, flexible textile-based wearable sensors, contemporary advancements, conductive textiles, body conformal design

Procedia PDF Downloads 23
515 Content Monetization as a Mark of Media Economy Quality

Authors: Bela Lebedeva

Abstract:

The characteristics of the Web as a channel of information dissemination (accessibility and openness, interactivity and multimedia news) are widening and reach the audience quickly, positively affecting the perception of content but blurring the understanding of journalistic work. As a result, audiences and advertisers continue migrating to the Internet. Moreover, online targeting makes it possible to monetize not only the audience (as is customary in traditional media) but also the content and traffic, and to do so more accurately. While users identify themselves with the qualitative characteristics of the new market, its actors take shape. A conflict of interests lies at the base of the economics of their relations; the problem of a traffic tax is one example. Meanwhile, content monetization also actualizes the fiscal interest of the state. The balance of supply and demand is often violated due to political risks, particularly under state capitalism, populism and authoritarian methods of governing social institutions such as the media. A unique example of access to journalistic material limited by content monetization is the television channel Dozhd' (Rain) in the Russian web space. Its liberal-minded audience has a better possibility for discussion; however, the channel could have been much more successful under conditions of unlimited free speech. To avoid state pressure and censorship, its management decided to preserve at least its online performance, monetizing all of the content for the core audience. The study methodology was primarily based on the analysis of journalistic content and on qualitative and quantitative analysis of the audience. Reconstructing the main events and relationships of market actors over the last six years, the researcher has reached some conclusions. First, under content monetization, the capitalization of content quality will always strive towards the qualitative characteristics of the user, thereby identifying him.
Vice versa, the user's demand generates high-quality journalism. The second conclusion follows from the first. The growth of technology, information noise, new political challenges, economic volatility and a changing cultural paradigm all shape the paid-content model for the individual user. This model defines the user as a beneficiary of specific knowledge and indicates a constant balance of supply and demand, other conditions being equal. As a result, a new economic quality of information is created, and this feature is an indicator of the market as a self-regulating system. Monetized quality information is less popular than that of public broadcasting services, but its audience is able to make decisions. It is these very users who sustain the niche sectors that have the greatest potential for technological development, including new ways of monetizing content. The third point of the study allows it to be developed within the discourse of media-space liberalization: this cultural phenomenon may open opportunities for developing the architecture of social and economic relations both locally and regionally.

Keywords: content monetization, state capitalism, media liberalization, media economy, information quality

Procedia PDF Downloads 226
514 Development of Fuzzy Logic Control Ontology for E-Learning

Authors: Muhammad Sollehhuddin A. Jalil, Mohd Ibrahim Shapiai, Rubiyah Yusof

Abstract:

Nowadays, ontologies are common in many areas, such as artificial intelligence, bioinformatics, e-commerce, education and many more. Ontology is one of the focus areas in the field of Information Retrieval. The purpose of an ontology is to describe a conceptual representation of concepts and their relationships within a particular domain; in other words, an ontology provides a common vocabulary for anyone who needs to share information in the domain. There are ontology domains in various fields, covering both engineering and non-engineering knowledge, yet only a few ontologies are available for engineering knowledge. Fuzzy logic as engineering knowledge is still not available as an ontology domain, although in general fuzzy logic requires step-by-step guidelines and instructions for lab experiments. In this study, we present a domain ontology for Fuzzy Logic Control (FLC) knowledge. We construct a Table of Contents (ToC) with the middle-out strategy of the Uschold and King method to develop the FLC ontology. The proposed framework is developed using Protégé as the ontology tool, and Protégé's ontology reasoner, known as the Pellet reasoner, is then used to validate it. The presented framework offers better performance in terms of consistency and classification parameters. In general, this ontology can provide a platform for anyone who needs to understand FLC knowledge.
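The FLC knowledge the ontology captures (membership functions, a rule base, defuzzification) can be illustrated by a minimal controller. The temperature and fan-speed domain, the rules and all numeric parameters below are invented for illustration:

```python
def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def flc_fan_speed(temp):
    """Minimal fuzzy controller of the step-by-step kind the ontology
    describes. Rule base (invented): IF temp is COLD THEN speed LOW;
    IF temp is HOT THEN speed HIGH. Output by weighted average of
    singleton consequents, a common lightweight defuzzification."""
    cold = tri(temp, 0, 10, 25)
    hot = tri(temp, 15, 30, 40)
    low_speed, high_speed = 20.0, 90.0  # singleton outputs, % of max speed
    if cold + hot == 0.0:
        return None  # input outside every membership function's support
    return (cold * low_speed + hot * high_speed) / (cold + hot)
```

Each named element here (linguistic variable, membership function, rule, defuzzification method) corresponds to a concept the ontology would formalize.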

Keywords: engineering knowledge, fuzzy logic control ontology, ontology development, table of content

Procedia PDF Downloads 277
513 [Keynote Talk]: Swiss Scientific Society for Developing Countries: A Concept of Relationship

Authors: Jawad Alzeer

Abstract:

Cultural setups vary from country to country and nation to nation, but the ability to adapt successfully to a new cultural setup may pave the way toward the development of cultural intelligence. Overcoming differences may require building up our personality with the ability to learn, exchange thoughts and hold a constructive dream. Adaptation processes can be accelerated if we effectively utilize our cultural diversity. This can be done through a unified body or society: people with common goals can work collectively to satisfy their values. Narrowing the gap between developed and developing countries is of prime importance, and many international organizations are trying to resolve these issues by rational and peaceful means. Failing to understand the cultural differences, mentalities, strengths and weaknesses of developed and developing countries has led to the collapse of many partnerships. The establishment of a neutral body, informed by developed countries' intellectuality and developing countries' personality, may offer better understanding and reasonable solutions, suggestions and advice that assist in narrowing gaps and strengthening the relationship between developed and developing countries. The key issues, goals and potential concepts associated with initiating a Swiss scientific society for developing countries, as a model to facilitate the integration of highly skilled scientists, are discussed.

Keywords: cultural diversity, developing countries, integration, Switzerland

Procedia PDF Downloads 788
512 Risk Mitigation of Data Causality Analysis Requirements AI Act

Authors: Raphaël Weuts, Mykyta Petik, Anton Vedder

Abstract:

Artificial Intelligence has the potential to create, and already creates, enormous value in healthcare. Prescriptive systems might make the use of healthcare capacity more efficient. However, such systems might entail interpretations that exclude the effect of confounders, and that brings risks with it. Those risks might be mitigated by regulation that prevents systems entailing them from coming to market. One modality of regulation is legislation, and the European AI Act is an example of a regulatory instrument that might mitigate these risks. To assess the risk mitigation potential of the AI Act for those risks, this research focuses on a case study of a hypothetical application of medical device software that entails the aforementioned risks. The AI Act refers to the harmonised norms of already existing legislation, here the European Medical Device Regulation. The issue at hand is a causal link between a confounder and the value the algorithm optimises for by proxy. The research identifies where the AI Act already looks at confounders (e.g., feedback loops in systems that continue to learn after being placed on the market). It also identifies where the current proposal by parliament leaves legal uncertainty on the necessity to check for confounders that do not influence the input of the system, when the system does not continue to learn after being placed on the market. The authors propose an amendment to article 15 of the AI Act that would require high-risk systems to be developed in such a way as to mitigate the risks from those aforementioned confounders.
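The confounding risk at issue can be illustrated with a toy simulation (all numbers invented): a latent severity variable drives both who receives an intervention and the outcome, so a system optimising on the raw association would read a genuinely helpful intervention as harmful:

```python
import random

def simulate(n=10_000, seed=1):
    """Toy illustration of the confounding risk the abstract describes.
    Severity z (the confounder) drives both treatment assignment t and
    outcome y; the intervention's true effect on y is +2 by construction."""
    rng = random.Random(seed)
    rows = []
    for _ in range(n):
        z = rng.random()                              # latent severity
        t = 1 if z > 0.6 else 0                       # sicker patients get treated
        y = 10 - 8 * z + 2 * t + rng.gauss(0, 0.5)    # true effect of t: +2
        rows.append((z, t, y))
    return rows

def naive_gap(rows):
    """Mean outcome difference, treated minus untreated: the confounded
    quantity a proxy-optimising system would see."""
    treated = [y for _, t, y in rows if t == 1]
    control = [y for _, t, y in rows if t == 0]
    return sum(treated) / len(treated) - sum(control) / len(control)
```

Although the intervention improves the outcome by +2 for every patient, the naive treated-minus-control gap comes out negative, which is exactly the kind of misreading the proposed article 15 amendment targets.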

Keywords: AI Act, healthcare, confounders, risks

Procedia PDF Downloads 239
511 Determining Fire Resistance of Wooden Construction Elements through Experimental Studies and Artificial Neural Network

Authors: Sakir Tasdemir, Mustafa Altin, Gamze Fahriye Pehlivan, Sadiye Didem Boztepe Erkis, Ismail Saritas, Selma Tasdemir

Abstract:

Artificial intelligence applications are commonly used in many fields of industry, in parallel with developments in computer technology. In this study, a fire room was prepared to test the resistance of wooden construction elements, and experiments on polished materials were carried out with this setup. Using the experimental data obtained from the fire room, an artificial neural network (ANN) was modeled to evaluate the final cross-sections of the wooden samples remaining after the fire. The developed system takes the initial sample weight (ws, g), preliminary cross-section (pcs, mm²), fire time (ft, min) and fire temperature (t, °C) as input parameters and the final cross-section (fcs, mm²) as the output parameter. When the ANN results and the experimental data are compared through statistical analysis, the two groups are found to be coherent, with no significant difference between them. As a result, an ANN can be safely used to determine the cross-sections of wooden materials after a fire, avoiding many practical disadvantages.
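A much-reduced, dependency-free stand-in for the paper's ANN is sketched below: a single linear neuron trained by batch gradient descent on synthetic data shaped like the study's inputs. The real fire-room data are not in the abstract, so both the data-generating relation and the network are invented for illustration:

```python
import random

def make_data(n=200, seed=0):
    """Synthetic stand-in for the fire-room measurements: the final
    cross-section shrinks with fire time. The relation is invented."""
    rng = random.Random(seed)
    data = []
    for _ in range(n):
        pcs = rng.uniform(1000, 5000)   # preliminary cross-section, mm2
        ft = rng.uniform(10, 60)        # fire time, minutes
        fcs = 0.9 * pcs - 5.0 * ft      # invented ground-truth relation
        data.append((pcs, ft, fcs))
    return data

def fit(data, lr=0.5, epochs=3000):
    """One linear neuron trained by batch gradient descent on normalized
    inputs; a much-reduced stand-in for the paper's multi-input ANN."""
    w1 = w2 = b = 0.0
    n = len(data)
    for _ in range(epochs):
        g1 = g2 = gb = 0.0
        for pcs, ft, fcs in data:
            x1, x2, y = pcs / 5000.0, ft / 60.0, fcs / 5000.0
            err = (w1 * x1 + w2 * x2 + b) - y
            g1 += err * x1; g2 += err * x2; gb += err
        w1 -= lr * g1 / n; w2 -= lr * g2 / n; b -= lr * gb / n
    predict = lambda pcs, ft: (w1 * pcs / 5000.0 + w2 * ft / 60.0 + b) * 5000.0
    return predict
```

The study's actual model adds hidden layers and two further inputs (sample weight and fire temperature), but the training loop has the same shape.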

Keywords: artificial neural network, final cross-section, fire retardant polishes, fire safety, wood resistance

Procedia PDF Downloads 364
510 The Impact of Artificial Intelligence on Spare Parts Technology

Authors: Amir Andria Gad Shehata

Abstract:

Minimizing inventory cost, optimizing inventory quantities and increasing system operational availability are the main motivations for enhancing the forecasting of spare parts demand at a major power utility company in Medina. This paper reports on an effort to optimize order quantities of spare parts by improving the method of forecasting demand. The study focuses on equipment that has frequent spare parts purchase orders with uncertain demand. The demand follows a lumpy pattern, which makes conventional forecasting methods less effective. A comparison was made by benchmarking various forecasting methods against experts' criteria to select the most suitable method for the case study, and three actual data sets were used to make the forecasts. Two neural network (NN) approaches were utilized and compared, namely long short-term memory (LSTM) and the multilayer perceptron (MLP). The results showed, as expected, that the NN models gave better results than the traditional (judgmental) forecasting method. In addition, the LSTM model had higher predictive accuracy than the MLP model.
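For context, the classical baseline for exactly this kind of lumpy, intermittent demand is Croston's method, which separately smooths demand sizes and inter-demand intervals. It is not among the methods benchmarked in the abstract; the sketch below is added only as a point of comparison:

```python
def croston(series, alpha=0.1):
    """Croston's method for intermittent demand: exponentially smooth the
    nonzero demand sizes (z) and the intervals between them (p), and
    forecast the per-period demand rate as z / p."""
    z = p = None   # smoothed demand size and inter-demand interval
    q = 1          # periods elapsed since the last nonzero demand
    for d in series:
        if d > 0:
            if z is None:          # first demand initializes the estimates
                z, p = float(d), float(q)
            else:
                z += alpha * (d - z)
                p += alpha * (q - p)
            q = 1
        else:
            q += 1
    return None if z is None else z / p
```

A series of six units every third month, for example, yields a forecast of two units per period; neural approaches like the paper's LSTM aim to beat this kind of baseline when the lumpiness has learnable structure.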

Keywords: spare part, spare part inventory, inventory model, optimization, maintenance, neural network, LSTM, MLP, forecasting demand, inventory management

Procedia PDF Downloads 35