Search results for: aesthetic computing
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 1338

228 Embedded System of Signal Processing on FPGA: Underwater Application Architecture

Authors: Abdelkader Elhanaoui, Mhamed Hadji, Rachid Skouri, Said Agounad

Abstract:

The purpose of this paper is to study the phenomenon of acoustic scattering using a new method. Signal processing (the Fast Fourier Transform (FFT), the inverse Fast Fourier Transform (iFFT), and Bessel functions) is widely applied to obtain information with high precision. Signal processing is most commonly implemented on general-purpose processors, but general-purpose processors are not efficient for signal processing. Our interest therefore focused on the use of FPGAs (Field-Programmable Gate Arrays) to minimize the computational complexity of the single-processor architecture, accelerate the computation on the FPGA, and meet real-time and energy-efficiency requirements. We implemented the acoustic backscattered signal processing model on the Altera DE1-SoC board and compared it to the Odroid XU4. The computing latencies of the Odroid XU4 and the FPGA are 60 seconds and 3 seconds, respectively. The detailed SoC FPGA-based system computes acoustic spectra up to 20 times faster than the Odroid XU4 implementation. The FPGA-based implementation of the processing algorithms achieves an absolute error of about 10⁻³. This study underlines the increasing importance of embedded systems in underwater acoustics, especially in non-destructive testing, where it is possible to obtain information related to the detection and characterization of submerged shells. We thus achieved good experimental results in terms of real-time performance and energy efficiency.
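
The FFT/iFFT stage of such a pipeline can be prototyped in software before mapping it to FPGA fabric. The sketch below is a plain radix-2 FFT round-trip in Python, not the authors' hardware implementation, and the sample window is hypothetical:

```python
import cmath

def fft(x):
    """Radix-2 decimation-in-time FFT; len(x) must be a power of two."""
    n = len(x)
    if n == 1:
        return list(x)
    even = fft(x[0::2])
    odd = fft(x[1::2])
    # Twiddle factors e^{-2πik/n} applied to the odd half.
    twiddle = [cmath.exp(-2j * cmath.pi * k / n) * odd[k] for k in range(n // 2)]
    return ([even[k] + twiddle[k] for k in range(n // 2)]
            + [even[k] - twiddle[k] for k in range(n // 2)])

def ifft(X):
    """Inverse FFT via conjugation: ifft(X) = conj(fft(conj(X))) / n."""
    n = len(X)
    return [v.conjugate() / n for v in fft([v.conjugate() for v in X])]

# Round-trip a toy backscattered-signal window (hypothetical samples).
signal = [complex(v) for v in (0.0, 1.0, 0.5, -0.5, -1.0, -0.5, 0.0, 0.5)]
spectrum = fft(signal)
recovered = ifft(spectrum)
error = max(abs(a - b) for a, b in zip(signal, recovered))
```

An FPGA implementation would replace this recursion with a pipelined butterfly network, but the numerical behavior to verify against is the same.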

Keywords: DE1 FPGA, acoustic scattering, form function, signal processing, non-destructive testing

Procedia PDF Downloads 53
227 Bioethanol Production from Wild Sorghum (Sorghum arundinacieum) and Spear Grass (Heteropogon contortus)

Authors: Adeyinka Adesanya, Isaac Bamgboye

Abstract:

There is a growing need to develop processes that produce renewable fuels and chemicals, owing to the economic, political, and environmental concerns associated with fossil fuels. Lignocellulosic biomass is an excellent renewable feedstock because it is both abundant and inexpensive. This project aims at producing bioethanol from lignocellulosic plants (Sorghum arundinacieum and Heteropogon contortus) by biochemical means, computing the energy audit of the process, and determining the fuel properties of the produced ethanol. Acid pretreatment (0.5% H2SO4 solution) and enzymatic hydrolysis (using malted barley as the enzyme source) were employed. The ethanol yield of wild sorghum was found to be 20%, while that of spear grass was 15%. The fuel properties of the bioethanol from wild sorghum were 1.227 centipoise for viscosity, 1.10 g/cm³ for density, 0.90 for specific gravity, and 78 °C for boiling point; the cloud point was found to be below -30 °C. Those of spear grass were 1.206 centipoise for viscosity, 0.93 g/cm³ for density, 1.08 for specific gravity, and 78 °C for boiling point; the cloud point was likewise below -30 °C. The energy audit shows that about 64% of the total energy was used during pretreatment, while product recovery, which was done manually, demanded about 31%. Enzymatic hydrolysis, fermentation, and distillation accounted for 1.95%, 1.49%, and 1.04% of the total energy input, respectively. The alcoholometric strength of the bioethanol from wild sorghum was found to be 47%, and that from spear grass was 72%. The energy efficiency of bioethanol production for both grasses was 3.85%.
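
An energy audit of this kind can be sanity-checked in a few lines. The percentages below are the ones reported in the abstract, and the check simply confirms they account for (nearly) the whole energy input and that pretreatment dominates:

```python
# Energy-audit shares reported in the abstract (% of total input energy).
shares = {
    "pretreatment": 64.0,
    "product recovery": 31.0,
    "enzymatic hydrolysis": 1.95,
    "fermentation": 1.49,
    "distillation": 1.04,
}
total = sum(shares.values())          # should be close to 100%
dominant = max(shares, key=shares.get)
```

The residual below 100% reflects rounding in the reported figures.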

Keywords: lignocellulosic biomass, wild sorghum, spear grass, biochemical conversion

Procedia PDF Downloads 210
226 Digital Manufacturing: Evolution and a Process Oriented Approach to Align with Business Strategy

Authors: Abhimanyu Pati, Prabir K. Bandyopadhyay

Abstract:

The paper intends to highlight the significance of a Digital Manufacturing (DM) strategy in supporting and achieving the business strategy and goals of a manufacturing organization. Towards this end, DM initiatives are given a process perspective, without undermining their technological significance, so as to link their benefits directly to the fulfilment of customer needs and expectations in a responsive and cost-effective manner. A digital process model is proposed to categorize digitally enabled organizational processes into synergistic groups that adopt and use digital tools with similar characteristics and functionalities. This opens future opportunities for researchers and developers to create a unified technology environment for the integration and orchestration of processes. Secondly, an effort is made to apply the "what" and "how" features of the Quality Function Deployment (QFD) framework to establish the relationship between customers' needs, both external and internal, and the features of the various digital processes that support the achievement of these expectations. The paper finally concludes that in the present highly competitive environment, business organizations cannot sustain themselves unless they understand the significance of digital strategy and integrate it with their business strategy through a clearly defined implementation roadmap. A process-oriented approach to DM strategy will help business executives and leaders appreciate its value propositions and its direct link to an organization's competitiveness.

Keywords: knowledge management, cloud computing, knowledge management approaches, cloud-based knowledge management

Procedia PDF Downloads 291
225 Cultural Identity of Mainland Chinese, Hongkonger and Taiwanese: A Glimpse from Hollywood Film Title Translation

Authors: Ling Yu Debbie Tsoi

Abstract:

After China overtook the USA as the top Hollywood film market in 2018, Hollywood studios have been adapting taste, preference, casting, and even film title translation to resonate with Chinese audiences. Due to this huge foreign demand, Hollywood film directors are paying closer attention to the translation of their products, as film titles are entry gates to a film and serve advertising, informative, and aesthetic functions. Beyond directors and studios, comments on the quality of film title translation also appear on online clip-viewing platforms, online media, and magazines. In particular, netizens in mainland China, Hong Kong, and Taiwan seem to defend the film titles of their own region while despising those of the other two. In view of the endless debates and the lack of systematic analysis of film title translation in Greater China, the study aims at investigating the translation of Hollywood film titles (from English to Chinese) across Greater China based on Venuti's (1991; 1995; 1998; 2001) concepts of domestication and foreignization. To offer a comparison over time, a mini-corpus was built comprising the top 70 most popular Hollywood film titles in 1987–1988, 1997–1998, 2007–2008, and 2017–2018 in Greater China. Altogether, 560 source texts and 1,680 target texts from mainland China, Hong Kong, and Taiwan were compared against each other. The three regions are found to have distinctive styles and patterns of translation. For instance, a sizable number of film titles are foreignized in mainland China through literal translation and transliteration, whereas Hong Kong and Taiwan prefer domestication. Hong Kong tends to adopt a more vulgar style, using colloquial Cantonese slang and even swear words and associating characters with negative connotations. Also, English has been used as a form of domestication in Hong Kong from 1987 to 2018, a strategy never found in mainland China or Taiwan.
On the contrary, Taiwanese target texts tend to adopt a cute, child-like style, using repetitive words and positive connotations; even when English is used, it is used as foreignization. As film titles are cultural products of popular culture, it is suspected that Hongkongers seek to develop a cultural identity distinct from mainland China's through vulgarization and negativity. Hongkongers also identify themselves as international cosmopolitans, leading to their identification with English. It is likewise suspected that, owing to Japan's former colonial rule, Taiwan has adopted a popular culture similar to Japan's, with cute and childlike expressions.

Keywords: cultural identification, ethnic identification, Greater China, film title translation

Procedia PDF Downloads 123
224 Computerized Analysis of Phonological Structure of 10,400 Brazilian Sign Language Signs

Authors: Wanessa G. Oliveira, Fernando C. Capovilla

Abstract:

Capovilla and Raphael's Libras Dictionary documents a corpus of 4,200 Brazilian Sign Language (Libras) signs. Duduchi and Capovilla's software SignTracking permits users to retrieve signs even without knowing the corresponding gloss and to discover the meaning of all 4,200 signs simply by clicking on graphic menus of sign characteristics (phonemes). Duduchi and Capovilla discovered that the ease with which any given sign can be retrieved is an inverse function of the average popularity of its component phonemes. Thus, signs composed of rare (distinct) phonemes are easier to retrieve than those composed of common phonemes. SignTracking offers a means of computing the average popularity of the phonemes that make up each of the 4,200 signs, providing a precise measure of the ease with which signs can be retrieved and sign meanings discovered. Duduchi and Capovilla's logarithmic model proved valid: the ease with which any given sign can be retrieved is an inverse function of the arithmetic mean of the logarithm of the popularity of each component phoneme. Capovilla, Raphael and Mauricio's New Libras Dictionary documents a corpus of 10,400 Libras signs. The present analysis revealed Libras' phonological "DNA" by mapping the incidence of 501 sign phonemes resulting from the layered distribution of five parameters: 163 handshape phonemes (CherEmes-ManusIculi); 34 finger shape phonemes (DactilEmes-DigitumIculi); 55 hand placement phonemes (ArtrotoToposEmes-ArticulatiLocusIculi); 173 movement dimension phonemes (CinesEmes-MotusIculi) pertaining to direction, frequency, and type; and 76 facial expression phonemes (MascarEmes-PersonalIculi).
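
The logarithmic retrieval model can be sketched directly: retrieval ease is modeled as an inverse function of the mean log popularity of a sign's component phonemes. The phoneme frequency counts below are hypothetical:

```python
import math

def mean_log_popularity(phoneme_counts):
    """Arithmetic mean of log(popularity) over a sign's component phonemes.

    Lower values mean rarer phonemes, which the model predicts are
    easier to retrieve through SignTracking's graphic menus.
    """
    return sum(math.log(c) for c in phoneme_counts) / len(phoneme_counts)

# Hypothetical corpus frequencies of the phonemes composing two signs.
sign_rare = [3, 5, 2]            # built from rare (distinct) phonemes
sign_common = [900, 1200, 700]   # built from very common phonemes
```

Under the model, `sign_rare` scores lower and is therefore the easier of the two to retrieve.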

Keywords: Brazilian sign language, lexical retrieval, libras sign, sign phonology

Procedia PDF Downloads 312
223 Strategies for Public Space Utilization

Authors: Ben Levenger

Abstract:

Social life revolves around a central meeting place or gathering space. It is where the community integrates, learns social skills, and ultimately becomes part of the community. Following this premise, public spaces are among the most important spaces that downtowns offer, providing locations for people to be seen and heard and, most importantly, to integrate seamlessly into the downtown as part of the community. To facilitate this, these local spaces must be envisioned and designed to meet the changing needs of a downtown, offering a space and purpose for everyone. This paper will dive deep into analyzing, designing, and implementing public space design for small plazas and gathering spaces. These spaces often require a detailed level of study, followed by broad strokes of design implementation that allow for adaptability. This paper will highlight how to assess needs, define the types of spaces needed, outline a program for the spaces, detail design elements that meet the needs, assess the new space, and plan for change. This study will provide participants with the necessary framework for conducting a grass-roots-level assessment of public space and programming, including short-term and long-term improvements. Participants will also receive assessment tools, sheets, and visual representation diagrams. Urbanism for the sake of urbanism is an exercise in aesthetic beauty; an economic improvement or benefit must be attained to further solidify the purpose of these efforts and justify the infrastructure or construction costs. To ground this work in quantitative terms, we will take a deep dive into case studies highlighting economic impacts.
These case studies will highlight the financial impact on an area, measuring the following metrics: rental rates (per square meter), tax revenue generation (sales and property), foot traffic generation, increased property valuations, currency expenditure by tenure, clustered development improvements, and the cost/valuation benefits of increased housing density. The economic impact results will be broken down by community size in three tiers: under 10,000 in population, 10,001 to 75,000 in population, and over 75,000 in population. Through this classification, participants can gauge the impact in communities similar to those in which they work or for which they are responsible. Finally, a detailed analysis of specific urbanism enhancements, such as plazas, on-street dining, and pedestrian malls, will be discussed, with metrics documenting the economic impact of each enhancement to aid in prioritizing improvements for each community. All materials, documents, and information will be available to participants via Google Drive; they are welcome to download the data and use it for their own purposes.

Keywords: downtown, economic development, planning, strategic

Procedia PDF Downloads 50
222 How Holton’s Thematic Analysis Can Help to Understand Why Fred Hoyle Never Accepted Big Bang Cosmology

Authors: Joao Barbosa

Abstract:

After an intense dispute between big bang cosmology and its great rival, steady-state cosmology, some important experimental observations of the 1960s, such as the determination of the helium abundance in the universe and the discovery of the cosmic background radiation, were decisive for the progressive and wide acceptance of big bang cosmology and the inevitable abandonment of steady-state cosmology. But despite solid theoretical support and those solid experimental observations favorable to big bang cosmology, Fred Hoyle, one of the proponents of the steady-state theory and the main opponent of the idea of the big bang (which, paradoxically, he himself named), never gave up and continued to fight for the idea of a stationary (or quasi-stationary) universe until the end of his life, even after decades of widespread consensus around big bang cosmology. We can try to understand this persistent attitude of Hoyle's by applying Holton's thematic analysis to cosmology. Holton recognizes in scientific activity a dimension that, even when unconscious or unacknowledged, is nevertheless very important in the work of scientists, in implicit articulation with the experimental and theoretical dimensions of science. This is the thematic dimension, constituted by themata: concepts, methodologies, and hypotheses of a metaphysical, aesthetic, logical, or epistemological nature, associated both with the cultural context and with the individual psychology of scientists. In practice, themata are expressed through personal preferences and choices that guide the individual and collective work of scientists. Thematic analysis shows that big bang cosmology is mainly based on a set of themata consisting of evolution, finitude, life cycle, and change; steady-state cosmology is based on the opposite themata: steady state, infinity, continuous existence, and constancy.
The passionate controversy between these cosmological views is part of an old cosmological opposition: the thematic opposition between an evolutionary view of the world (associated with Heraclitus) and a stationary view (associated with Parmenides). Personal preferences seem to have been important in this (thematic) controversy, and the thematic analysis developed here shows that Hoyle is a very illustrative example of a lifelong personal commitment to certain themata, in this case the themata opposite to those of big bang cosmology. His struggle against the big bang idea was strongly based on philosophical and even religious reasons, which, in a certain sense and from a Holtonian perspective, is related to thematic preferences. In this personal and persistent struggle, Hoyle always refused to accept the way some experimental observations were considered decisive in favor of the big bang idea, arguing that the success of the idea rested on sociological and cultural prejudices. This attitude of Hoyle's is a personal thematic attitude, in which the acceptance or rejection of what is presented as proof or scientific fact is conditioned by themata: what is a proof or a scientific fact for one scientist is something yet to be established for another scientist who defends different or even opposite themata.

Keywords: cosmology, experimental observations, Fred Hoyle, interpretation, lifelong personal commitment, themata

Procedia PDF Downloads 138
221 A Gene Selection Algorithm for Microarray Cancer Classification Using an Improved Particle Swarm Optimization

Authors: Arfan Ali Nagra, Tariq Shahzad, Meshal Alharbi, Khalid Masood Khan, Muhammad Mugees Asif, Taher M. Ghazal, Khmaies Ouahada

Abstract:

Gene selection is an essential step in the classification of microarray cancer data. Gene expression cancer data (DNA microarrays) facilitate computing the robust and concurrent expression of various genes. Particle swarm optimization (PSO) requires simple operators and fewer parameters for tuning a gene selection model. Selecting prognostic genes with small redundancy is a great challenge for researchers, as PSO-based selection methods have several complications. In this research, a new variant of PSO, self-inertia weight adaptive PSO (SIW-APSO), is proposed. In the proposed algorithm, SIW-APSO-ELM is explored to achieve high gene selection prediction accuracy. The new algorithm balances the exploration capabilities of the improved inertia weight adaptive particle swarm optimization with exploitation. SIW-APSO is used to search for the solution and is updated through an evolutionary process in which each particle iteratively improves its velocity and position. An extreme learning machine (ELM) was designed for the selection procedure. The proposed method was applied to identify a number of genes in the cancer datasets. The classification stage employs ELM, k-centroid nearest neighbor (KCNN), and support vector machine (SVM) classifiers to attain high forecast accuracy compared with state-of-the-art methods on microarray cancer datasets, which shows the effectiveness of the proposed method.
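
The core of any inertia-weight-adaptive PSO can be sketched on a toy objective. The adaptation rule and all constants below are illustrative stand-ins, not the authors' SIW-APSO-ELM; the idea shown is only that each particle carries its own inertia weight, damped when it improves and raised when it stagnates:

```python
import random

random.seed(0)

def sphere(x):
    """Toy objective standing in for classification error."""
    return sum(v * v for v in x)

DIM, SWARM, ITERS = 2, 20, 60
C1 = C2 = 2.0

pos = [[random.uniform(-5, 5) for _ in range(DIM)] for _ in range(SWARM)]
vel = [[0.0] * DIM for _ in range(SWARM)]
w = [0.9] * SWARM                       # one inertia weight per particle
pbest = [p[:] for p in pos]
pbest_val = [sphere(p) for p in pos]
init_val = min(pbest_val)
gbest = pbest[pbest_val.index(init_val)][:]

for _ in range(ITERS):
    for i in range(SWARM):
        for d in range(DIM):
            r1, r2 = random.random(), random.random()
            vel[i][d] = (w[i] * vel[i][d]
                         + C1 * r1 * (pbest[i][d] - pos[i][d])
                         + C2 * r2 * (gbest[d] - pos[i][d]))
            pos[i][d] += vel[i][d]
        val = sphere(pos[i])
        if val < pbest_val[i]:
            pbest_val[i], pbest[i] = val, pos[i][:]
            w[i] = max(0.4, w[i] * 0.98)   # improving: damp inertia (exploit)
        else:
            w[i] = min(0.9, w[i] * 1.02)   # stagnating: raise inertia (explore)
    gbest = pbest[pbest_val.index(min(pbest_val))][:]
```

In the paper's setting the objective would be a wrapped classifier's error over a candidate gene subset rather than this analytic function.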

Keywords: microarray cancer, improved PSO, ELM, SVM, evolutionary algorithms

Procedia PDF Downloads 58
220 Transforming Healthcare with Immersive Visualization: An Analysis of Virtual and Holographic Health Information Platforms

Authors: Hossein Miri, Zhou YongQi, Chan Bormei-Suy

Abstract:

The development of advanced technologies and innovative solutions has opened up exciting new possibilities for revolutionizing healthcare systems. One such emerging concept is the use of virtual and holographic health information platforms that aim to provide interactive and personalized medical information to users. This paper provides a review of notable virtual and holographic health information platforms. It begins by highlighting the need for information visualization and 3D representation in healthcare. It then proceeds to provide background knowledge on information visualization and historical developments in 3D visualization technology. Additional domain knowledge concerning holography, holographic computing, and mixed reality is then introduced, followed by some of their common applications and use cases. After setting the scene and defining the context, the need for and importance of virtual and holographic visualization in medicine are discussed. Subsequently, some of the current research areas and applications of digital holography and holographic technology are explored, alongside the importance and role of virtual and holographic visualization in genetics and genomics. An analysis of the key principles and concepts underlying virtual and holographic health information systems is presented, and their potential implications for healthcare are pointed out. The paper concludes by examining the most notable existing mixed-reality applications and systems that help doctors visualize diagnostic and genetic data and assist in patient education and communication. This paper is intended to be a valuable resource for researchers, developers, and healthcare professionals who are interested in the use of virtual and holographic technologies to improve healthcare.

Keywords: virtual, holographic, health information platform, personalized interactive medical information

Procedia PDF Downloads 55
219 Development of an Integrated Route Information Management Software

Authors: Oluibukun G. Ajayi, Joseph O. Odumosu, Oladimeji T. Babafemi, Azeez Z. Opeyemi, Asaleye O. Samuel

Abstract:

The need for complete automation of every surveying procedure, and especially of its engineering applications, cannot be overemphasized, given the many demerits of the conventional manual (analogue) approach. This paper presents the details of the development of a Route Information Management (RIM) software package. The software, codenamed 'AutoROUTE', was coded in the Microsoft Visual Studio Visual Basic package and offers complete automation of the computational procedures and plan production involved in route surveying. It was tested using route survey data (longitudinal profile and cross sections) of a 2.7 km road stretching from Dama to Lunko village in Minna, Niger State, acquired with a Hi-Target DGPS receiver. The developed software (AutoROUTE) is capable of computing the various simple curve parameters, horizontal curves, and vertical curves, and it can also plot the road alignment, longitudinal profile, and cross-sections, with the capability to store these in the SQL database incorporated into Microsoft Visual Basic. The plans plotted with AutoROUTE were compared with plans produced with the conventional AutoCAD Civil 3D software; AutoROUTE proved to be more user-friendly and more precise because it plots to three decimal places, whereas AutoCAD plots to two. It was also found that AutoROUTE is faster in plotting and that the stages involved are less cumbersome compared with AutoCAD Civil 3D.
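
The simple-curve parameters such software computes are not listed in the abstract; the standard textbook elements of a horizontal circular curve can be sketched as follows, with an assumed radius and deflection angle:

```python
import math

def simple_curve(radius, deflection_deg):
    """Elements of a horizontal simple curve of radius R and deflection angle Δ."""
    delta = math.radians(deflection_deg)
    return {
        "tangent_length": radius * math.tan(delta / 2),               # T = R·tan(Δ/2)
        "curve_length": radius * delta,                               # L = R·Δ (Δ in radians)
        "long_chord": 2 * radius * math.sin(delta / 2),               # C = 2R·sin(Δ/2)
        "external_distance": radius * (1 / math.cos(delta / 2) - 1),  # E = R(sec(Δ/2) − 1)
        "mid_ordinate": radius * (1 - math.cos(delta / 2)),           # M = R(1 − cos(Δ/2))
    }

# Hypothetical design values: R = 300 m, Δ = 40°.
elements = simple_curve(radius=300.0, deflection_deg=40.0)
# Rounded to three decimal places, the precision the abstract reports for AutoROUTE.
rounded = {k: round(v, 3) for k, v in elements.items()}
```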

Keywords: automated systems, cross sections, curves, engineering construction, longitudinal profile, route surveying

Procedia PDF Downloads 108
218 Automatic Staging and Subtype Determination for Non-Small Cell Lung Carcinoma Using PET Image Texture Analysis

Authors: Seyhan Karaçavuş, Bülent Yılmaz, Ömer Kayaaltı, Semra İçer, Arzu Taşdemir, Oğuzhan Ayyıldız, Kübra Eset, Eser Kaya

Abstract:

In this study, our goal was to perform tumor staging and subtype determination automatically using different texture analysis approaches for a very common cancer type, non-small cell lung carcinoma (NSCLC). In particular, we introduced a texture analysis approach, Laws' texture filters, to be used in this context for the first time. The 18F-FDG PET images of 42 patients with NSCLC were evaluated. The number of patients for each tumor stage (I-II, III, or IV) was 14. The patients had ~45% adenocarcinoma (ADC) and ~55% squamous cell carcinoma (SqCC). The MATLAB technical computing language was employed to extract 51 features using first-order statistics (FOS), the gray-level co-occurrence matrix (GLCM), the gray-level run-length matrix (GLRLM), and Laws' texture filters. The feature selection method employed was sequential forward selection (SFS). The selected textural features were used for automatic classification by k-nearest neighbors (k-NN) and support vector machines (SVM). In the automatic classification of tumor stage, the accuracy was approximately 59.5% with the k-NN classifier (k=3) and 69% with SVM (one-versus-one paradigm), using 5 features. In the automatic classification of tumor subtype, the accuracy was around 92.7% with one-versus-one SVM. Texture analysis of FDG-PET images might be used, in addition to metabolic parameters, as an objective tool to assess tumor histopathological characteristics and to automatically classify tumor stage and subtype.
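
Of the feature families listed, the GLCM is the easiest to illustrate. A minimal pure-Python version on a toy 4-level patch follows; this is a sketch of the general technique, not the authors' MATLAB pipeline, and the patch values are hypothetical:

```python
def glcm(image, dx=1, dy=0, levels=4):
    """Normalized gray-level co-occurrence matrix for pixel offset (dx, dy)."""
    rows, cols = len(image), len(image[0])
    counts = [[0.0] * levels for _ in range(levels)]
    total = 0
    for r in range(rows):
        for c in range(cols):
            r2, c2 = r + dy, c + dx
            if 0 <= r2 < rows and 0 <= c2 < cols:
                counts[image[r][c]][image[r2][c2]] += 1
                total += 1
    return [[v / total for v in row] for row in counts]

def contrast(P):
    """Sum of P(i,j)·(i−j)²: high for abrupt gray-level transitions."""
    n = len(P)
    return sum(P[i][j] * (i - j) ** 2 for i in range(n) for j in range(n))

def energy(P):
    """Sum of P(i,j)²: high for uniform textures."""
    return sum(v * v for row in P for v in row)

# Toy patch: intensities quantized to 4 gray levels (hypothetical values).
patch = [
    [0, 0, 1, 1],
    [0, 0, 1, 1],
    [2, 2, 3, 3],
    [2, 2, 3, 3],
]
P = glcm(patch)
feat = {"contrast": contrast(P), "energy": energy(P)}
```

A full pipeline would compute such features over the segmented tumor volume and feed the SFS-selected subset to the classifiers.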

Keywords: cancer stage, cancer cell type, non-small cell lung carcinoma, PET, texture analysis

Procedia PDF Downloads 301
217 Continuous FAQ Updating for Service Incident Ticket Resolution

Authors: Kohtaroh Miyamoto

Abstract:

As enterprise computing becomes more and more complex, the costs and technical challenges of IT system maintenance and support are increasing rapidly. One popular approach to managing IT system maintenance is to prepare and use an FAQ (Frequently Asked Questions) system to manage and reuse systems knowledge. Such an FAQ system can help reduce the resolution time for each service incident ticket. However, there is a major problem where over time the knowledge in such FAQs tends to become outdated. Much of the knowledge captured in the FAQ requires periodic updates in response to new insights or new trends in the problems addressed in order to maintain its usefulness for problem resolution. These updates require a systematic approach to define the exact portion of the FAQ and its content. Therefore, we are working on a novel method to hierarchically structure the FAQ and automate the updates of its structure and content. We use structured information and the unstructured text information with the timelines of the information in the service incident tickets. We cluster the tickets by structured category information, by keywords, and by keyword modifiers for the unstructured text information. We also calculate an urgency score based on trends, resolution times, and priorities. We carefully studied the tickets of one of our projects over a 2.5-year time period. After the first 6 months, we started to create FAQs and confirmed they improved the resolution times. We continued observing over the next 2 years to assess the ongoing effectiveness of our method for the automatic FAQ updates. We improved the ratio of tickets covered by the FAQ from 32.3% to 68.9% during this time. Also, the average time reduction of ticket resolution was between 31.6% and 43.9%. Subjective analysis showed more than 75% reported that the FAQ system was useful in reducing ticket resolution times.
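
The abstract does not specify how the urgency score combines trends, resolution times, and priorities; one plausible weighted-sum sketch, in which every field name, weight, and scaling is an assumption for illustration only:

```python
def urgency(trend, avg_resolution_hours, priority,
            max_hours=72.0, weights=(0.4, 0.3, 0.3)):
    """Combine a ticket cluster's volume trend, average resolution time,
    and priority into a 0..1 urgency score (all scaling choices illustrative)."""
    w_trend, w_time, w_prio = weights
    trend_term = max(0.0, min(1.0, trend))                 # rising volume → urgent
    time_term = min(1.0, avg_resolution_hours / max_hours)  # slow to resolve → urgent
    prio_term = {"low": 0.2, "medium": 0.5, "high": 1.0}[priority]
    return w_trend * trend_term + w_time * time_term + w_prio * prio_term

# A hot cluster (growing, slow, high priority) vs. a quiet one.
hot = urgency(trend=0.8, avg_resolution_hours=48, priority="high")
cold = urgency(trend=0.0, avg_resolution_hours=6, priority="low")
```

A score like this could rank FAQ sections for update, with the highest-urgency clusters revised first.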

Keywords: FAQ system, resolution time, service incident tickets, IT system maintenance

Procedia PDF Downloads 311
216 DLtrace: Toward Understanding and Testing Deep Learning Information Flow in Deep Learning-Based Android Apps

Authors: Jie Zhang, Qianyu Guo, Tieyi Zhang, Zhiyong Feng, Xiaohong Li

Abstract:

With the widespread popularity of mobile devices and the development of artificial intelligence (AI), deep learning (DL) has been extensively applied in Android apps. Compared with traditional Android apps (traditional apps), deep learning-based Android apps (DL-based apps) need to use more third-party application programming interfaces (APIs) to complete complex DL inference tasks. However, existing methods (e.g., FlowDroid) for detecting sensitive information leakage in Android apps cannot be directly used on DL-based apps, as they have difficulty detecting third-party APIs. To solve this problem, we design DLtrace, a new static information flow analysis tool that can effectively recognize third-party APIs. With our proposed trace and detection algorithms, DLtrace can also efficiently detect privacy leaks caused by sensitive APIs in DL-based apps. Moreover, using DLtrace, we summarize the non-sequential characteristics of DL inference tasks in DL-based apps and the specific functionalities that DL models provide for such apps. We propose two formal definitions to deal with the common polymorphism and anonymous inner-class problems in Android static analyzers. We conducted an empirical assessment with DLtrace on 208 popular DL-based apps in the wild and found that 26.0% of the apps suffered from sensitive information leakage. Furthermore, DLtrace has more robust performance than FlowDroid in detecting and identifying third-party APIs. The experimental results demonstrate that DLtrace extends FlowDroid in understanding DL-based apps and detecting security issues therein.

Keywords: mobile computing, deep learning apps, sensitive information, static analysis

Procedia PDF Downloads 133
215 A New Multi-Target, Multi-Agent Search and Rescue Path Planning Approach

Authors: Jean Berger, Nassirou Lo, Martin Noel

Abstract:

Perfectly suited to natural or man-made emergency and disaster management situations such as floods, earthquakes, tornadoes, or tsunamis, multi-target search path planning for a team of rescue agents is known to be computationally hard, and most techniques developed so far fall short of successfully estimating the optimality gap. A novel mixed-integer linear programming (MIP) formulation is proposed to optimally solve the multi-target, multi-agent discrete search and rescue (SAR) path planning problem. Aimed at maximizing the cumulative probability of successful target detection, it captures anticipated feedback information associated with possible observation outcomes resulting from projected path execution, while modeling agent discrete actions over all possible moving directions. The problem modeling further takes advantage of a network representation to encompass decision variables, expedite compact constraint specification, and lead to substantial problem-solving speed-up. The proposed MIP approach uses the CPLEX optimization machinery, efficiently computing near-optimal solutions for practical-size problems while giving a robust upper bound obtained from Lagrangean relaxation of the integrality constraints. Should a target be positively detected during plan execution, a new problem instance would simply be reformulated from the current state and solved over the next decision cycle. A computational experiment shows the feasibility and the value of the proposed approach.
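
The cumulative-detection objective such a formulation maximizes can be illustrated without any MIP machinery by brute-force enumeration on a toy grid. The grid, detection probabilities, and move set below are hypothetical, and real instances need the MIP precisely because this enumeration explodes combinatorially:

```python
import itertools

# Hypothetical probability that a target is present and detected when an
# agent visits each cell of a 2x2 toy grid.
p = {(0, 0): 0.05, (0, 1): 0.10, (1, 0): 0.30, (1, 1): 0.20}
MOVES = [(0, 1), (0, -1), (1, 0), (-1, 0)]

def success_prob(path):
    """P(at least one detection) = 1 − Π(1 − p_cell) over distinct visited cells."""
    miss = 1.0
    for cell in set(path):
        miss *= 1.0 - p[cell]
    return 1.0 - miss

def best_path(start, steps):
    """Enumerate every move sequence of the given length; keep the best path."""
    best = (success_prob((start,)), (start,))
    for moves in itertools.product(MOVES, repeat=steps):
        path, cur = [start], start
        for dr, dc in moves:
            cur = (cur[0] + dr, cur[1] + dc)
            if cur not in p:          # walked off the toy grid: discard
                break
            path.append(cur)
        else:
            best = max(best, (success_prob(path), tuple(path)))
    return best

prob, path = best_path((0, 0), steps=3)  # 3 steps suffice to sweep all 4 cells
```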

Keywords: search path planning, search and rescue, multi-agent, mixed-integer linear programming, optimization

Procedia PDF Downloads 347
214 DWDM Network Implementation in the Honduran Telecommunications Company "Hondutel"

Authors: Tannia Vindel, Carlos Mejia, Damaris Araujo, Carlos Velasquez, Darlin Trejo

Abstract:

DWDM (Dense Wavelength Division Multiplexing) is in constant growth around the world, driven by consumer demand. This growth creates the need for a system that can expand the communications of an entire nation and improve the computing capabilities of its society in accordance with its customs and geographical location. The Honduran Telecommunications Company (HONDUTEL) provides internet services and data transport over PDH and SDH technology, which in the Republic of Honduras, C.A., represents a viable option for consumers in terms of purchase price and ease of acquisition. However, it lacks efficiency in terms of technological advancement, which is an obstacle that limits long-term socio-economic development in comparison with other countries in the region and prevents competition among the telecommunications companies engaged in this sector. For that reason, we propose introducing in our country a technology already established in Europe that enables broadband data transfer, namely DWDM. In this way, we will have a stable, high-quality service that will allow us to compete in a globalized world, replacing the legacy systems with one that provides better service and stands at the forefront. Once implemented, the DWDM network builds upon existing resources, such as the equipment already in use, and opens a new stage, giving a business image to the Republic of Honduras, C.A., as a nation by ensuring data transport and broadband internet. This benefits, in the first instance, existing customers and all the public and private institutions that require such services.

Keywords: demultiplexers, light detectors, multiplexers, optical amplifiers, optical fibers, PDH, SDH

Procedia PDF Downloads 230
213 Comparative Evaluation of Vanishing Interfacial Tension Approach for Minimum Miscibility Pressure Determination

Authors: Waqar Ahmad Butt, Gholamreza Vakili Nezhaad, Ali Soud Al Bemani, Yahya Al Wahaibi

Abstract:

Minimum miscibility pressure (MMP) plays a great role in determining the displacement efficiency of different gas injection processes. Experimental techniques for MMP determination include the industrially recommended slim tube, vanishing interfacial tension (VIT) and the rising bubble apparatus (RBA). This paper presents an MMP measurement study using the slim tube and VIT techniques for two crude oil samples (M and N), each in both live and stock tank oil form. VIT-measured MMP values for the 'M' and 'N' live crude oils were close to the slim-tube-determined values, with 6.4% and 5% deviation, respectively. For both samples in stock tank oil form, however, the VIT-measured MMP showed an unacceptably high deviation from the slim tube value. This larger difference appears to be related to the high fraction of stabilized heavy ends in the crude oil and the lack of multiple-contact miscibility. None of the nine crude oil and CO2 MMP correlations deployed could produce a reliable MMP close to the slim-tube-determined value. Since the VIT-determined MMP values for both live crude oils closely match the slim tube values, VIT is confirmed as a reliable, reproducible, rapid and inexpensive alternative for live crude oil MMP determination. VIT MMP determination for the stock tank oil case, in contrast, needs further investigation into the stabilization/destabilization mechanism of the oil's heavier ends and the development of multiple-contact miscibility.
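The VIT technique estimates MMP by measuring interfacial tension at several pressures and extrapolating the trend to zero IFT. A minimal sketch of that extrapolation is below; the pressures and IFT readings are entirely synthetic, invented for illustration, and are not data from this study.

```python
# Hypothetical VIT-style MMP estimate: fit a line to synthetic interfacial
# tension (IFT) measurements versus pressure, then extrapolate to IFT = 0.
# All numbers below are made up for illustration only.

def fit_line(xs, ys):
    """Ordinary least-squares slope and intercept for y = m*x + c."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    m = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) \
        / sum((x - mx) ** 2 for x in xs)
    return m, my - m * mx

pressures_mpa = [10.0, 15.0, 20.0, 25.0]   # synthetic test pressures
ift_mn_per_m = [6.0, 4.0, 2.0, 0.5]        # synthetic IFT readings (decreasing)

slope, intercept = fit_line(pressures_mpa, ift_mn_per_m)
mmp_estimate = -intercept / slope          # pressure where the fit crosses IFT = 0
print(round(mmp_estimate, 2))
```

In practice the IFT-pressure relationship is only approximately linear near miscibility, which is one reason VIT results can deviate from slim tube values for stabilized stock tank oils.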

Keywords: minimum miscibility pressure, interfacial tension, multiple contacts miscibility, heavier ends

Procedia PDF Downloads 250
212 A Survey of Field Programmable Gate Array-Based Convolutional Neural Network Accelerators

Authors: Wei Zhang

Abstract:

With the rapid development of deep learning, neural networks and deep learning algorithms play a significant role in various practical applications. Due to their high accuracy and good performance, Convolutional Neural Networks (CNNs) in particular have become a research hot spot in the past few years. However, networks have grown increasingly large to meet the demands of practical applications, which poses a significant challenge to constructing high-performance implementations of deep learning neural networks. Meanwhile, many of these application scenarios also place strict requirements on the performance and power consumption of hardware devices. It is therefore critical to choose a suitable computing platform for hardware acceleration of CNNs. This article surveys recent advances in Field Programmable Gate Array (FPGA)-based acceleration of CNNs. Various designs and implementations of FPGA-based accelerators across different devices and network models are reviewed, and alternatives such as Graphics Processing Units (GPUs), Application Specific Integrated Circuits (ASICs) and Digital Signal Processors (DSPs) are compared, with our own critical analysis and comments. Finally, we discuss these acceleration and optimization methods from different perspectives to explore the opportunities and challenges for future research on FPGA platforms, and offer a prospect for the future development of FPGA-based accelerators.
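A recurring technique in the accelerators this survey covers is replacing floating-point arithmetic with low-precision fixed-point (e.g. int8) multiply-accumulate units, which map efficiently onto FPGA DSP slices. The toy sketch below, not from any surveyed design, shows that arithmetic in plain Python: quantize to int8, then run a valid-mode 2-D convolution (cross-correlation, as CNNs use) with integer accumulation.

```python
# Toy illustration of int8 fixed-point CNN arithmetic, the kind FPGA CNN
# accelerators typically implement in hardware. Scales and data are invented.

def quantize(values, scale):
    """Map floats to int8 by dividing by a scale factor and rounding."""
    return [max(-128, min(127, round(v / scale))) for v in values]

def conv2d_int(image, kernel, ih, iw, kh, kw):
    """Valid-mode 2-D convolution (no kernel flip) over flattened int arrays."""
    oh, ow = ih - kh + 1, iw - kw + 1
    out = []
    for r in range(oh):
        for c in range(ow):
            acc = 0                               # wide integer accumulator
            for i in range(kh):
                for j in range(kw):
                    acc += image[(r + i) * iw + (c + j)] * kernel[i * kw + j]
            out.append(acc)
    return out

img = quantize([0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9], 0.01)  # 3x3 image
ker = quantize([1.0, 0.0, 0.0, -1.0], 0.125)                         # 2x2 kernel
result = conv2d_int(img, ker, 3, 3, 2, 2)                            # 2x2 output
print(result)
```

Multiplying each accumulator by the two scale factors (0.01 * 0.125) recovers the floating-point result, which is why the quantization scales must travel with the weights in a real accelerator.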

Keywords: deep learning, field programmable gate array, FPGA, hardware accelerator, convolutional neural networks, CNN

Procedia PDF Downloads 98
211 Digital Interventions for Older People Experiencing Homelessness (OPEH): A Systematic Scoping Review

Authors: Emily Adams, Eddie Donaghy, David Henderson, Lauren Ng, Caroline Sanders, Rowena Stewart, Maria Wolters, Stewart Mercer

Abstract:

Ongoing review abstract: Older People Experiencing Homelessness (OPEH) can show mental and physical indicators of aging 10-20 years earlier than the general population and experience premature mortality due to age-related chronic conditions. Emerging literature suggests digital interventions could positively impact the well-being of people experiencing homelessness (PEH). However, the increased reliance on digital delivery may also perpetuate digital inequalities for socially excluded groups, including PEH. The potential triple disadvantage of being older, homeless, and digitally excluded creates a uniquely problematic situation that warrants further research. This scoping review aims to investigate and synthesise the range and type of digital interventions available to OPEH and the organisations that support them. The following databases were searched on 28th July 2023: Medline, Scopus, International Bibliography of the Social Sciences (IBSS), Applied Social Sciences Index & Abstracts (ASSIA), Association for Computing Machinery Digital Library (ACMDL) and Policy Commons. A search strategy was developed in collaboration with an academic librarian. The presentation will include: an introduction to OPEH and digital exclusion; an overview of the results of the review (OPEH usage of digital platforms, current digital interventions available, and the role of support organisations); and current gaps in the evidence, future research, and recommendations for policy and practice.

Keywords: homeless, digital exclusion, aging, technology

Procedia PDF Downloads 48
210 Sattriya: Its Transformation as a Principal Medium of Preaching Vaishnava Religion to Performing Art

Authors: Smita Lahkar

Abstract:

Sattriya, the youngest of the eight principal classical Indian dance traditions, has undergone many changes and modifications to arrive at its present stage as a performing art form, extracting itself from age-old religious confinement. Although some of the other traditions have been revived in the recent past, Sattriya has been a living tradition since its inception in the 15th century by Srimanta Sankardeva, the great Vaishnavite saint, poet, playwright, lyricist, painter, singer and dancer of Assam, a north-eastern state of India. This living dance tradition from the Sattras, the Vaishnavite monasteries, has been practiced for over five hundred years by celibate male monks as a powerful medium for propagating the Vaishnava religious faith. Sankardeva realised the potential of the vocalised word, integrated with the visual image, as a powerful medium of expression and communication, and so used it to propagate his newly founded message of devotion among the people of his time. Earlier, Sattriya was performed by male monks alone in the monasteries (Sattras) as a part of daily rituals, and females were not even allowed to learn the art form. At present, however, Sattriya has come out from the Sattras to the proscenium stage, performed mostly by female dancers along with a few male dancers. The techniques of movement, the costumes, ornaments, music and style of performance have likewise undergone many changes and modifications. For example, earlier, and even today in the Sattra, the ‘Pataka’ hand gesture is depicted in conformity with the original religious context in which the dance form was created; today's stage performers, by contrast, prefer the instructions of the scripture ‘Srihastamuktavali’ and depict the ‘Pataka’ in a sophisticated manner, effecting decontextualisation to a certain extent. This adds aesthetic beauty to the dance as an art form while distancing it from its original context as a vehicle for propagating the Vaishnava religion.
The Sattriya dance today stands at the crossroads of past and future, tradition and modernity, devotion and display, spirituality and secularism. The traditional exponents, trained under the tutelage of Sattra maestros and imbibing the devotionally inspired rigour of the religion, try to retain the traditional nuances, while the young artists trained outside the monasteries are more interested in taking up the discipline purely from the perspective of the performing arts, bereft of the philosophy of religion or its sacred associations. Hence, this paper endeavours to establish the hypothesis that Sattriya, which originated for the propagation of the Vaishnava faith, has now entered the world of performing arts with highly aesthetic components, and that as a transformed art form it may be expected to carve a niche in the world dance arena. This will be done with the help of historical evidence, observations from the recorded past and interviews with experts.

Keywords: dance, performing art, religion, Sattriya

Procedia PDF Downloads 191
209 Existential Affordances and Psychopathology: A Gibsonian Analysis of Dissociative Identity Disorder

Authors: S. Alina Wang

Abstract:

A Gibsonian approach is used to understand the existential dimensions of the human ecological niche. This existential-Gibsonian framework is then applied to rethinking Hacking’s historical analysis of multiple personality disorder, culminating in a generalized account of psychiatric illness from an enactivist lens and concluding with reflections on the implications of this account for approaches to psychiatric treatment. J.J. Gibson’s theory of affordances (1979) centered on affordances of the sensorimotor variety, which guide basic behaviors relative to organisms’ vital needs and physiological capacities. Later theorists, notably Neisser (1988) and Rietveld (2014), expanded the theory of affordances to account for uniquely human activities relative to the emotional, intersubjective, cultural, and narrative aspects of the human ecological niche. This research shows that these affordances are structured by what Haugeland (1998) calls existential commitments, drawing on Heidegger’s notion of Dasein (1927) and Merleau-Ponty’s account of existential freedom (1945). These commitments organize the existential affordances that fill an individual’s environment and guide their thoughts, emotions, and behaviors. This system of a priori existential commitments and a posteriori affordances is called existential enactivism. For humans, affordances do not only elicit motor responses and appear as objects with instrumental significance; affordances also, and possibly primarily, determine so-called affective and cognitive activities and structure the wide range of kinds (e.g., instrumental, aesthetic, ethical) of significance that objects in the world can have. Existential enactivism is then applied to understanding the psychiatric phenomenon of multiple personality disorder (precursor of the current diagnosis of dissociative identity disorder).
Hacking’s (1998) insights into the history of this particular disorder, and his generalizations about the constructed nature of most psychiatric illness, are reinterpreted; enactivist approaches sensitive to existential phenomenology can provide a deeper understanding of these matters. Conceptualizing psychiatric illness strictly as a disorder in the head (whether parsed as a disorder of brain chemicals or of meaning-making capacities encoded in psychological modules) is incomplete. Rather, psychiatric illness must also be understood as a disorder in the world, or in the interconnected networks of existential affordances that regulate one’s emotional, intersubjective, and narrative capacities. All of this suggests that an adequate account of psychiatric illness must involve (1) the affordances that are the sources of existential hindrance, (2) the existential commitments structuring these affordances, and (3) the conditions of these existential commitments. Approaches to the treatment of psychiatric illness would be more effective by centering on the interruption of normalized behaviors corresponding to affordances identified as sources of hindrance, the development of new existential commitments, and the practice of new behaviors that erect affordances relative to these reformed commitments.

Keywords: affordance, enaction, phenomenology, psychiatry, psychopathology

Procedia PDF Downloads 110
208 Chaotic Electronic System with Lambda Diode

Authors: George Mahalu

Abstract:

The Chua diode has been configured over time in various ways, using electronic structures such as operational amplifiers or devices with gas or semiconductors. Among semiconductor devices, tunnel diodes (Esaki diodes) are most often considered, and more recently, transistorized configurations such as lambda diodes. The work proposed here models a lambda-diode-type configuration consisting of two junction field-effect transistors (JFETs). The original scheme is created in the MULTISIM electronic simulation environment and analyzed in order to identify the conditions for the appearance of the evolutionary unpredictability specific to nonlinear dynamic systems with chaotic behavior. The deterministic chaotic oscillator is autonomous, which places it in the class of Chua-type oscillators, the most significant difference being the presence of the nonlinear lambda-diode structure mentioned above. Chaotic behavior is identified both by means of strange-attractor-type trajectories visible during the simulation and by highlighting the hypersensitivity of the system to small variations of one of the input parameters. The results obtained through simulation, and the conclusions drawn, are useful for further research into implementing such electronic solutions in theoretical and practical applications: modern small-signal amplification structures, systems for encoding and decoding messages over modern communication channels, modern neural networks, and physical implementations aimed at practically usable solutions in quantum computing and quantum computers.
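The hypersensitivity test described above can be reproduced numerically. The sketch below is illustrative only: instead of the paper's lambda-diode circuit (simulated in MULTISIM), it integrates the classic dimensionless Chua oscillator with a piecewise-linear nonlinearity and standard double-scroll parameters, and shows that a one-part-in-a-million change to the initial condition produces macroscopically different trajectories.

```python
# Classic Chua-type oscillator (piecewise-linear nonlinearity, standard
# dimensionless parameters), integrated with RK4 to exhibit the hallmark of
# chaos: hypersensitivity to initial conditions. Illustrative stand-in for the
# paper's lambda-diode circuit, not its actual model.

def chua(state, alpha=15.6, beta=28.0, m0=-1.143, m1=-0.714):
    x, y, z = state
    fx = m1 * x + 0.5 * (m0 - m1) * (abs(x + 1) - abs(x - 1))  # PWL diode curve
    return (alpha * (y - x - fx), x - y + z, -beta * y)

def rk4_step(state, dt):
    k1 = chua(state)
    k2 = chua(tuple(s + 0.5 * dt * k for s, k in zip(state, k1)))
    k3 = chua(tuple(s + 0.5 * dt * k for s, k in zip(state, k2)))
    k4 = chua(tuple(s + dt * k for s, k in zip(state, k3)))
    return tuple(s + dt / 6 * (a + 2 * b + 2 * c + d)
                 for s, a, b, c, d in zip(state, k1, k2, k3, k4))

def run(state, steps=20000, dt=0.005):
    for _ in range(steps):
        state = rk4_step(state, dt)
    return state

a = run((0.7, 0.0, 0.0))
b = run((0.7 + 1e-6, 0.0, 0.0))        # perturb x0 by one part in a million
separation = max(abs(p - q) for p, q in zip(a, b))
print(separation)
```

The final separation is many orders of magnitude larger than the initial 1e-6 perturbation, while both trajectories remain bounded on the attractor, which is exactly the pair of signatures (sensitivity plus boundedness) the abstract uses to identify chaos.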

Keywords: chua, diode, memristor, chaos

Procedia PDF Downloads 62
207 Effects of Exhaust Gas Emitted by the Fleet on Public Health in the Region of Annaba (Algeria): Ecotoxicological Test on Durum Wheat (Triticum durum Desf.)

Authors: Aouissi Nora, Meksem Leila

Abstract:

This work focuses on the study of air pollution generated by the transport sector in the region of Annaba. Our study has two parts. The first is an epidemiological investigation in the area of Annaba, on the eastern Algerian coast, dealing with the growth of the vehicle fleet and its impact on public health. To get a more precise idea of the impact of road traffic on public health, we consulted the computing center of the National Social Insurance Fund, which provided the number of asthma and heart disease cases reported after medical examination during the period 2006-2010. The second part was devoted to the study of the toxicity of exhaust gases on some physical and biochemical parameters of durum wheat (Triticum durum Desf.). After germination and the three-leaf stage, the pots are placed in a box (volume 0.096 m3) with an inlet linked directly to the exhaust pipe of a truck and an outlet to prevent the plants from asphyxiating. The experiment involves 30 pots: 10 pots are exposed to exhaust smoke for 5 minutes, another 10 for 15 minutes, and the remaining 10 for 30 minutes. The epidemiological study shows that the levels of pollutants emitted by the fleet are responsible for the increase in respiratory and cardiovascular diseases. The biochemical analyses of the vegetation clearly show the toxicity of the pollutants in the exhaust gases, with an increase in total protein and proline and stimulation of a detoxification enzyme (catalase).

Keywords: air pollution, toxicity, epidemiology, biochemistry

Procedia PDF Downloads 309
206 Labour Productivity Measurement and Control Standards for Hotels

Authors: Kristine Joy Simpao

Abstract:

Improving labour productivity is one of the most engaging and challenging aspects of managing the hotel and restaurant business. Securing consistent productivity has become an increasingly pivotal responsibility of managers seeking to survive and sustain the business. Besides making the business profitable, they are compelled to make every resource productive and effective in achieving company goals while maximizing the value of the organization. This paper examines what productivity means to the service industry, and in particular to the hotel industry. This is underpinned by an investigation of the extent to which respondent hotels practice labour productivity management in the areas of materials management, human resource management and leadership management, and by computing labour productivity ratios using simple hotel productivity ratios, in order to find suitable measurement and control standards for hotels, with SBMA, Olongapo City as the locale of the study. The findings show that the hotels' labour productivity ratings are not perfect, with some practices rated far below expectations, particularly on strategic and operational decisions for improving the performance and productivity of human resources. There was no significant difference in ratings among respondent types in any area, indicating a shared perception of the weak implementation of some indicators of labour productivity practice. Furthermore, the computed labour productivity efficiency ratios show that the number of employees and labour productivity practices are inversely related. This study provides potential measurement and control standards for the enhancement of hotel labour productivity.
These standards should also contain labour productivity benchmarks customized for standard hotels in the Subic Bay Freeport Zone, to assist hotel owners in increasing labour productivity while meeting company goals and objectives effectively.
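The "simple ratios" the study mentions are single-factor productivity measures: output divided by labour input. A minimal sketch follows; the ratio names and figures are illustrative assumptions, not definitions or data from the paper.

```python
# Illustrative single-factor labour productivity ratios of the kind used in
# hotel benchmarking. Figures below are hypothetical.

def labour_productivity(output_value, labour_hours):
    """Basic single-factor ratio: output produced per hour of labour input."""
    return output_value / labour_hours

def revenue_per_labour_hour(room_revenue, labour_hours):
    """Room revenue per labour hour, a common hotel productivity measure."""
    return room_revenue / labour_hours

weekly_room_revenue = 84000.0    # hypothetical weekly room revenue
weekly_labour_hours = 1200.0     # hypothetical rostered hours
print(revenue_per_labour_hour(weekly_room_revenue, weekly_labour_hours))
```

Tracking such a ratio week over week, rather than in isolation, is what turns it into a control standard: a sustained fall signals either falling output or over-rostered labour.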

Keywords: labour productivity, hotel, measurement and control, standards, efficiency ratios, practices

Procedia PDF Downloads 292
205 Television Sports Exposure and Rape Myth Acceptance: The Mediating Role of Sexual Objectification of Women

Authors: Sofia Mariani, Irene Leo

Abstract:

The objective of the present study is to define the mediating role of attitudes that objectify and devalue women (hostile sexism, benevolent sexism, and sexual objectification of women) in the indirect correlation between exposure to televised sports and acceptance of rape myths. A second goal is to contribute to research on the topic by defining the role of mediators in exposure to different types of sports, following the traditional gender classification of sports. Data collection was carried out by means of an online questionnaire, measuring television sport exposure, sport type, hostile sexism, benevolent sexism, and sexual objectification of women. Data analysis was carried out using IBM SPSS software. The model used was created using Ordinary Least Squares (OLS) regression path analysis. The predictor variable in the model was television sports exposure, the outcome was rape myths acceptance, and the mediators were (1) hostile sexism, (2) benevolent sexism, and (3) sexual objectification of women. Correlation analyses were carried out dividing by sport type and controlling for the participants’ gender. As seen in existing literature, television sports exposure was found to be indirectly and positively related to rape myth acceptance through the mediating role of: (1) hostile sexism, (2) benevolent sexism, and (3) sexual objectification of women. The type of sport watched influenced the role of the mediators: hostile sexism was found to be the common mediator to all sports type, exposure to traditionally considered feminine or neutral sports showed the additional mediation effect of sexual objectification of women. In line with existing literature, controlling for gender showed that the only significant mediators were hostile sexism for male participants and benevolent sexism for female participants. 
Given the prevalence of men among the viewers of traditionally masculine sports, the correlation between television sports exposure and rape myth acceptance through the mediation of hostile sexism is likely due to the gender of the participants. However, this does not apply to the viewers of traditionally feminine and neutral sports, as this group is balanced in terms of gender and shows a unique mediation: the correlation between television sports exposure and rape myth acceptance is mediated by both hostile sexism and sexual objectification. Given that hostile sexism is defined as hostility towards women who oppose or fail to conform to traditional gender roles, these findings confirm that sport is perceived as a non-traditional activity for women. Additionally, these results imply that the portrayal of women in traditionally feminine and neutral sports - which are classified as such because of their aesthetic characteristics - may have a strong component of sexual objectification. The present research contributes to defining the association between sports exposure and rape myth acceptance through the mediating effects of sexist attitudes and sexual objectification of women. The results of this study have practical implications, such as supporting women's sports teams that ask for more practical and less revealing uniforms, more similar to those of their male colleagues and therefore less objectifying.
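The OLS path analysis described above estimates an indirect effect as the product of two regression coefficients: the a-path (predictor to mediator) times the b-path (mediator to outcome, controlling for the predictor). The sketch below shows those mechanics on a tiny hand-built dataset; every number is synthetic and invented purely to make the arithmetic checkable, not data from the study.

```python
# Mechanics of a simple mediation estimate (indirect effect = a * b) via OLS,
# on synthetic data constructed so the true paths are a = 2, b = 3, direct = 1.

def slope(y, x):
    """Simple OLS slope of y on x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    return num / sum((a - mx) ** 2 for a in x)

def solve3(A, rhs):
    """Gauss-Jordan elimination for a 3x3 linear system."""
    M = [row[:] + [v] for row, v in zip(A, rhs)]
    for i in range(3):
        p = max(range(i, 3), key=lambda r: abs(M[r][i]))
        M[i], M[p] = M[p], M[i]
        for r in range(3):
            if r != i:
                f = M[r][i] / M[i][i]
                M[r] = [a - f * c for a, c in zip(M[r], M[i])]
    return [M[i][3] / M[i][i] for i in range(3)]

def ols2(y, x1, x2):
    """OLS of y on an intercept plus two predictors, via the normal equations."""
    cols = [[1.0] * len(y), x1, x2]
    A = [[sum(a * b for a, b in zip(ci, cj)) for cj in cols] for ci in cols]
    rhs = [sum(c * v for c, v in zip(ci, y)) for ci in cols]
    return solve3(A, rhs)            # [intercept, coef_x1, coef_x2]

x = [float(i) for i in range(8)]                  # exposure (e.g. hours watched)
noise = [1, -1, -1, 1, 1, -1, -1, 1]              # zero-mean, uncorrelated with x
m = [2 * v + d for v, d in zip(x, noise)]         # mediator built with a-path = 2
y = [3 * mi + xi for mi, xi in zip(m, x)]         # outcome: b-path = 3, direct = 1

a_path = slope(m, x)                              # mediator regressed on exposure
_, direct, b_path = ols2(y, x, m)                 # outcome on exposure + mediator
indirect = a_path * b_path                        # mediated (indirect) effect
print(a_path, b_path, indirect)
```

In the study's model the same product is computed for each mediator (hostile sexism, benevolent sexism, sexual objectification), with significance assessed by bootstrap rather than read off directly as here.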

Keywords: television exposure, sport, rape myths, objectification, sexism

Procedia PDF Downloads 69
204 Application of 3-6 Years Old Children Basketball Appropriate Forms of Teaching Auxiliary Equipment in Early Childhood Basketball Game

Authors: Hai Zeng, Anqing Liu, Shuguang Dan, Ying Zhang, Yan Li, Zihang Zeng

Abstract:

When children are strong, the country is strong: the development of children's basketball is a strategic advantage. Common forms of basketball equipment have struggled to meet the needs of teaching the game to young children. Age-appropriate teaching aids for children aged 3-6 are a way to break through the bottlenecks of teaching basketball to children, a critical path to making teaching more enjoyable, and a necessary requirement for the development of early childhood basketball. Using literature review, questionnaires, focus group interviews and comparative analysis, this study examines 12 kinds of basketball teaching aids used domestically and abroad (cloud computing MINI basketball, adjustable MINI basketball hoop, MINI basketball court, paw-print shooting-assist ball, dribble goggles, dribbling machine, cartoon shooting machine, rebounding machine, contact mat, elastic band, ladder, fitness ball), investigating their contribution to enjoyment and to the improvement of young children's shooting, dribbling, offensive and defensive rebounding, and conversion techniques. The results show that appropriate forms of children's basketball teaching aids can effectively increase children's enjoyment of the game and improve specific techniques in a targeted way, with different types of aids enriching the content of the children's basketball game from different perspectives. It is recommended to produce aids in cartoon styles and environmentally friendly materials informed by children's colour psychology, to increase research efforts on children's basketball aids, and to encourage sports teachers to apply them.

Keywords: appropriate forms of children basketball, auxiliary equipment, appli, MINI basketball, 3-6 years old children, teaching

Procedia PDF Downloads 358
203 AER Model: An Integrated Artificial Society Modeling Method for Cloud Manufacturing Service Economic System

Authors: Deyu Zhou, Xiao Xue, Lizhen Cui

Abstract:

With the increasing collaboration among various services and the growing complexity of user demands, there are more and more factors affecting the stable development of the cloud manufacturing service economic system (CMSE). This poses new challenges to the evolution analysis of the CMSE. Many researchers have modeled and analyzed the evolution process of CMSE from the perspectives of individual learning and internal factors influencing the system, but without considering other important characteristics of the system's individuals (such as heterogeneity, bounded rationality, etc.) and the impact of external environmental factors. Therefore, this paper proposes an integrated artificial social model for the cloud manufacturing service economic system, which considers both the characteristics of the system's individuals and the internal and external influencing factors of the system. The model consists of three parts: the Agent model, environment model, and rules model (Agent-Environment-Rules, AER): (1) the Agent model considers important features of the individuals, such as heterogeneity and bounded rationality, based on the adaptive behavior mechanisms of perception, action, and decision-making; (2) the environment model describes the activity space of the individuals (real or virtual environment); (3) the rules model, as the driving force of system evolution, describes the mechanism of the entire system's operation and evolution. Finally, this paper verifies the effectiveness of the AER model through computational and experimental results.
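The three-part Agent-Environment-Rules decomposition can be made concrete with a toy simulation. The sketch below is an invented miniature, not the paper's model: boundedly rational agents perceive service quality with noise and pick the best they see (Agent), the environment holds service state (Environment), and a system-level rule rewards demanded services and decays idle ones (Rules).

```python
# Minimal agent-environment-rules (AER) toy illustrating the paper's
# decomposition; the service-selection behaviour is invented for this sketch.
import random

class Agent:
    """Boundedly rational consumer: perceives noisy quality, picks the best seen."""
    def __init__(self, rng):
        self.rng = rng

    def decide(self, qualities):
        perceived = [q + self.rng.gauss(0, 0.1) for q in qualities]  # noisy view
        return max(range(len(perceived)), key=lambda i: perceived[i])

class Environment:
    """Holds the state agents act on: the quality of each manufacturing service."""
    def __init__(self, qualities):
        self.qualities = qualities
        self.demand = [0] * len(qualities)

class Rules:
    """System-level dynamics: demanded services improve, idle ones decay."""
    @staticmethod
    def step(env):
        env.qualities = [q + 0.01 * d - 0.005
                         for q, d in zip(env.qualities, env.demand)]
        env.demand = [0] * len(env.qualities)

rng = random.Random(42)                       # seeded for reproducibility
env = Environment([0.5, 0.6, 0.4])
agents = [Agent(rng) for _ in range(50)]
for _ in range(20):                           # 20 rounds of perceive-decide-act
    for agent in agents:
        env.demand[agent.decide(env.qualities)] += 1
    Rules.step(env)
print(env.qualities)
```

Even this miniature reproduces the kind of emergent concentration (popular services pulling further ahead) that the full CMSE model is built to analyse, while keeping heterogeneity and bounded rationality visible as separate knobs.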

Keywords: cloud manufacturing service economic system (CMSE), AER model, artificial social modeling, integrated framework, computing experiment, agent-based modeling, social networks

Procedia PDF Downloads 54
202 Reconstruction of Visual Stimuli Using Stable Diffusion with Text Conditioning

Authors: ShyamKrishna Kirithivasan, Shreyas Battula, Aditi Soori, Richa Ramesh, Ramamoorthy Srinath

Abstract:

The human brain, among the most complex and mysterious aspects of the body, harbors vast potential for exploration. Unraveling these enigmas, especially within neural perception and cognition, is the realm of neural decoding. Harnessing advancements in generative AI, particularly in visual computing, this work seeks to elucidate how the brain comprehends the visual stimuli observed by humans. The paper endeavors to reconstruct human-perceived visual stimuli from Functional Magnetic Resonance Imaging (fMRI) data, which is processed through pre-trained deep-learning models to recreate the stimuli. A new architecture named LatentNeuroNet is introduced, with the aim of achieving high semantic fidelity in stimulus reconstruction. The approach employs a Latent Diffusion Model (LDM), Stable Diffusion v1.5, emphasizing semantic accuracy and generating superior quality outputs. This addresses the limitations of prior methods such as GANs, which are known for poor semantic performance and inherent instability. Text conditioning within the LDM's denoising process is handled by extracting text from the brain's ventral visual cortex region. This extracted text is processed through a Bootstrapping Language-Image Pre-training (BLIP) encoder before being injected into the denoising process. In conclusion, an architecture is developed that successfully reconstructs the perceived visual stimuli, and the research provides evidence for identifying the regions of the brain most influential in cognition and perception.
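The data flow described above (fMRI, to decoded text, to BLIP embedding, to conditioned denoising) can be sketched structurally with stubs. Every function below is a hypothetical placeholder invented for illustration; the real system runs actual fMRI recordings through BLIP and Stable Diffusion v1.5, none of which is reproduced here.

```python
# Structural sketch only: the pipeline stages described in the abstract, with
# stub implementations. All names and return values here are hypothetical.

def extract_text_from_ventral_cortex(fmri_features):
    """Stub for the decoder mapping ventral visual cortex activity to text."""
    return "a photograph of " + fmri_features["dominant_category"]

def blip_encode(text):
    """Stub for the BLIP text encoder producing a conditioning embedding."""
    return [float(ord(c)) / 255.0 for c in text]   # placeholder embedding

def ldm_denoise(embedding, steps=50):
    """Stub for the LDM's text-conditioned denoising loop."""
    return {"image": "reconstruction",
            "conditioning_dim": len(embedding),
            "steps": steps}

fmri = {"dominant_category": "dog"}                # placeholder decoded features
caption = extract_text_from_ventral_cortex(fmri)
reconstruction = ldm_denoise(blip_encode(caption))
print(caption, reconstruction["steps"])
```

The point of the sketch is the ordering: conditioning text is derived from brain activity before it ever reaches the generative model, which is what lets reconstruction quality serve as evidence about where semantic content lives in the cortex.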

Keywords: BLIP, fMRI, latent diffusion model, neural perception

Procedia PDF Downloads 47
201 Design and Implementation of Control System in Underwater Glider of Ganeshblue

Authors: Imam Taufiqurrahman, Anugrah Adiwilaga, Egi Hidayat, Bambang Riyanto Trilaksono

Abstract:

The autonomous underwater glider is one of the newer classes of underwater vehicles, and one of the autonomous underwater vehicles being developed in Indonesia. Gliding is achieved by controlling the buoyancy and attitude of the vehicle using actuators inside it. The gliding mechanism is expected to reduce the vehicle's energy consumption and so increase its cruising range while performing missions. The control system of the vehicle consists of three parts: pitch attitude control, the buoyancy engine controller and the yaw controller. The buoyancy and pitch controls are sequenced by a finite state machine, with pitch angle and diving depth as inputs, to obtain a gliding cycle, while yaw control is performed through the rudder for the guidance system. This research focuses on the design and implementation of the control system of an autonomous underwater glider based on PID with anti-windup. The control system is implemented on an ARM TS-7250-V2 device together with a mathematical model of the vehicle in MATLAB, using the hardware-in-the-loop simulation (HILS) method. The TS-7250-V2 was chosen because it complies with industry standards, has high computing capability, and consumes minimal power. The results show that the control system can form a gliding cycle in the HILS process with the desired depth and operating angle. Comparing half-control and full-control modes in the implementation, full-control mode tracks the reference more precisely, while half-control mode is considered more efficient in carrying out the mission.
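Anti-windup matters here because the glider's actuators (buoyancy engine, rudder) saturate, and a plain PID keeps integrating error it cannot act on. A minimal sketch of the idea follows: integrator clamping on a PID driving a toy first-order plant. The gains, limits, and plant are invented for illustration and are not the glider's.

```python
# PID with anti-windup (integrator clamping) on a toy first-order plant
# y' = -y + u. Gains, saturation limits, and plant are illustrative only.

def simulate(kp=2.0, ki=1.0, kd=0.1, u_min=-1.0, u_max=1.0,
             setpoint=1.0, dt=0.01, steps=3000):
    y, integral, prev_err = 0.0, 0.0, setpoint
    for _ in range(steps):
        err = setpoint - y
        u = kp * err + ki * integral + kd * (err - prev_err) / dt
        u_sat = max(u_min, min(u_max, u))   # actuator saturation
        if u == u_sat:                      # anti-windup: integrate only while
            integral += err * dt            # the actuator is NOT saturated
        prev_err = err
        y += dt * (-y + u_sat)              # first-order plant response
    return y

result = simulate()
print(result)                               # settles near the setpoint 1.0
```

Without the clamp, the integral charges up during the saturated approach phase and the output overshoots badly once the actuator comes off its limit; freezing it while saturated removes that windup.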

Keywords: control system, PID, underwater glider, marine robotics

Procedia PDF Downloads 349
200 Design of Robust and Intelligent Controller for Active Removal of Space Debris

Authors: Shabadini Sampath, Jinglang Feng

Abstract:

With its huge kinetic energy, space debris poses a major threat to astronauts’ space activities and to spacecraft in orbit if a collision happens. Active removal of space debris is required to avoid the frequent collisions that would otherwise occur; without it, the amount of space debris will increase uncontrollably, threatening the safety of the entire space system. The safe and reliable removal of large-scale space debris, however, has been a huge challenge to date. While capturing and deorbiting space debris, the space manipulator has to achieve high control precision, but uncertainties and unknown disturbances make coordinated control of the manipulator difficult. To address this challenge, this paper develops a robust and intelligent control algorithm that controls joint movement and restricts it to the sliding manifold by reducing uncertainties. A neural network adaptive sliding mode controller (NNASMC) is applied with the objective of finding a control law such that the joint motions of the space manipulator follow the given trajectory. Computed torque control (CTC), an effective motion control strategy, is used to compute the space manipulator arm torque needed to generate the required motion. Based on the Lyapunov stability theorem, the proposed intelligent controller NNASMC together with CTC guarantees the robustness and global asymptotic stability of the closed-loop control system. Finally, the controllers are modeled and simulated using MATLAB Simulink, and the results are presented to demonstrate the effectiveness of the proposed approach.
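The CTC part of the scheme can be sketched on a one-link arm. In the idealized case below the controller's model matches the plant exactly, so the gravity term cancels and the error dynamics become linear; the paper's NNASMC exists precisely to cover the case where they do not match. The pendulum parameters and gains are illustrative assumptions, not the space manipulator's dynamics.

```python
# Computed torque control (CTC) regulating a one-link arm: the feedback-
# linearising torque cancels gravity exactly (ideal-model assumption), leaving
# critically damped linear error dynamics. Model and gains are illustrative.
import math

def ctc_regulate(q_target, kp=25.0, kd=10.0, m=2.0, l=0.5, g=9.81,
                 dt=0.001, steps=5000):
    q, q_dot = 0.0, 0.0
    inertia = m * l * l                         # point-mass link inertia
    for _ in range(steps):
        e, e_dot = q_target - q, -q_dot         # regulation: q_dot target is 0
        gravity = m * g * l * math.sin(q)
        tau = inertia * (kp * e + kd * e_dot) + gravity   # CTC law
        q_ddot = (tau - gravity) / inertia      # plant with the same model
        q_dot += q_ddot * dt                    # semi-implicit Euler step
        q += q_dot * dt
    return q

final_q = ctc_regulate(math.pi / 4)
print(final_q)                                  # converges to the commanded angle
```

With kp = 25 and kd = 10 the error obeys e'' + 10 e' + 25 e = 0, a critically damped response (double pole at -5), so the joint settles to the target without oscillation; adding model error breaks the cancellation and motivates the adaptive sliding-mode layer.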

Keywords: GNC, active removal of space debris, AI controllers, MATLAB Simulink

Procedia PDF Downloads 99
199 The Cloud Systems Used in Education: Properties and Overview

Authors: Agah Tuğrul Korucu, Handan Atun

Abstract:

The diversity and usefulness of the information used in education have increased with the development of technology. Web technologies in particular have made enormous contributions to distance learning. Mobile systems, among the most widely used technologies in distance education, have made web technologies much easier to access. No longer bound by space and time, individuals have the opportunity to access information on the web. In addition, storing educational information and resources, and accessing them, is crucial for both students and teachers; because of this importance, web technologies that ease access to information and resources have been developed and disseminated. Dynamic web technologies, the new technologies that enable sharing and reuse of information, resources or applications via the Internet and turn websites into expandable platforms, are commonly known as Web 2.0 technologies. Cloud systems are one such dynamic web technology, defined by NIST as a model for providing on-demand access to the required information, independent of time and place, under appropriate circumstances. One of the most important advantages of cloud systems is that they meet users' requirements directly on the web, regardless of hardware and software and without installation. Hence, this study aims at the use of cloud services in education and investigates the services provided by cloud computing. The survey method was used as the research method. The findings reveal that cloud systems are used in education for resource sharing, collaborative work, assignment submission and feedback, and project development, and that cloud systems have significant advantages in facilitating teaching activities and the interaction between teacher, student and environment.

Keywords: cloud systems, cloud systems in education, online learning environment, integration of information technologies, e-learning, distance learning

Procedia PDF Downloads 322