Search results for: soft computing techniques
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 8429

8129 The Curvature of Bending Analysis and Motion of Soft Robotic Fingers by Full 3D Printing with MC-Cells Technique for Hand Rehabilitation

Authors: Chaiyawat Musikapan, Ratchatin Chancharoen, Saknan Bongsebandhu-Phubhakdi

Abstract:

In recent years, soft robotic fingers have been used to support patients recovering from neurological diseases that cause muscular disorders and neural damage, such as stroke and Parkinson's disease, as well as inflammatory conditions such as De Quervain's tenosynovitis and trigger finger. Hand function is essential for manipulating objects in activities of daily living (ADL). In this work, we propose a soft actuator model manufactured entirely by 3D printing from a single material, without any molding process. The model is designed with a multi cavitation cells (MC-Cells) technique. We then demonstrate the bending curvature, fluidic pressure, and force generated by the model for assisting finger flexion and hand grasping. The soft actuators are characterized mathematically using the chord length and arc length of the bent shape. An adaptive push-button switch machine was used to measure force in our experiments. We evaluated biomechanical efficiency through the range of motion (ROM) produced at the metacarpophalangeal (MCP), proximal interphalangeal (PIP), and distal interphalangeal (DIP) joints. Finally, the model exhibited fluidic pressure, force, and ROM suitable for assisting finger flexion and hand grasping.
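
As a purely illustrative aside (not the authors' code), the constant-curvature relation implied by characterizing bending with chord length and arc length can be sketched as follows; the arc and chord values below are hypothetical.

```python
# Illustrative sketch (not the authors' code): under a constant-curvature
# assumption, a bent finger of arc length s and measured chord length c
# satisfies c = 2R*sin(s/(2R)).  Solving for the radius R gives the
# curvature k = 1/R and the bending angle theta = s/R.
from math import sin, degrees, pi
from scipy.optimize import brentq

s = 80.0   # arc length of the actuator in mm (hypothetical value)
c = 65.0   # measured chord length in mm (hypothetical value)

def chord_residual(R):
    return 2.0 * R * sin(s / (2.0 * R)) - c

# For bending of less than a half circle, R lies between s/pi and infinity.
R = brentq(chord_residual, s / pi + 1e-9, 1e6)
curvature = 1.0 / R
theta = s / R
print(f"radius = {R:.1f} mm, curvature = {curvature:.4f} 1/mm, "
      f"bending angle = {degrees(theta):.1f} deg")
```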

Keywords: biomechanics efficiency, curvature bending, hand functional assistance, multi cavitation cells (MC-Cells), range of motion (ROM)

Procedia PDF Downloads 260
8128 Identification and Characterization of Enterobacter cloacae, New Soft Rot Causing Pathogen of Radish in India

Authors: B. S. Chandrashekar, M. K. Prasannakumar, P. Buela Parivallal, Sahana N. Banakar, Swathi S. Patil, H. B. Mahesh, D. Pramesh

Abstract:

Bacterial soft rot is one of the most frequently observed diseases in many plant species globally, resulting in considerable yield loss. Radish roots with dark water-soaked lesions, maceration of tissue, and a foul odour were collected in the Kolar region, India. Two isolates obtained from the rotted samples formed unpigmented, white, mucoid, convex colonies on nutrient agar medium. The isolated bacteria (RDH1 and RDH3) were gram-negative rods with biochemical characteristics similar to the type culture of Enterobacter cloacae ATCC 13047 and to Bergey's Manual of Determinative Bacteriology. The 16S rRNA gene was used to identify the Enterobacter species. Koch's postulates were fulfilled on carrot, potato, tomato, chilli, bell pepper, knolkhol, cauliflower, cabbage, and cucumber slices, and the pathogen was also pathogenic on radish, cauliflower, and cabbage seedlings grown in a glasshouse. After 36 hours, both isolates elicited a hypersensitive response on Nicotiana tabacum. Semi-quantitative analysis revealed that cell wall degrading enzymes (CWDEs) such as pectin lyase, polygalacturonase, and cellulase (p = 1.4e-09) contributed to pathogenicity, whereas both isolates produced biofilms (p = 4.3e-11) that aid host adhesion. This is the first report in India of radish soft rot caused by E. cloacae.

Keywords: soft rot, Enterobacter cloacae, 16S rRNA, Nicotiana tabacum, pathogenicity

Procedia PDF Downloads 121
8127 Geotechnical Evaluation and Sizing of the Reinforcement Layer on Soft Soil in the Construction of the North Triage Road Clover, in Brasilia Federal District, Brazil

Authors: Rideci Farias, Haroldo Paranhos, Joyce Silva, Elson Almeida, Hellen Silva, Lucas Silva

Abstract:

The constant growth of the vehicle fleet in large cities demands dynamic engineering solutions for traffic flow. The Federal District (DF), Brazil, is no exception. The city of Brasilia, capital of Brazil and a UNESCO World Cultural Heritage site, was planned for 500 thousand inhabitants; today more than 3 million people circulate in the city, with a fleet of more than one vehicle for every two inhabitants. The growth of the city toward the northern region required urban planning to provide solutions for this constantly growing fleet. In this context, a complex of viaducts, road accesses, new carriageways and the duplication of the Bragueto bridge over Lake Paranoá in the northern part of the city was designed, giving access to the BR-020 highway, and denominated the Clover of North Triage (TTN). In the geopedological context, the region is composed of hydromorphic soils, with the water table present at some times of the year. From the geotechnical point of view, these are soils with SPT < 4 and undrained shear strength Su < 50 kPa. According to urban planning rules in Brasilia, elevated structures may not rise above the urban landscape, so as not to contrast with the urban character defined by the architects Lúcio Costa and Oscar Niemeyer, who designed the new capital. This urban criterion created a technical impasse, resulting in the need to 'bury' the structures and, in turn, the access ramps at different levels, in areas of low-bearing-capacity soil and outcropping water table, which motivated this study and design. For the adoption of the appropriate solution, Standard Penetration Test (SPT), vane test, dynamic probing light (DPL) and auger boring campaigns were carried out. By comparing the results of these tests, resistance profiles of the soils and water levels were established for the studied sections. Geometric factors such as existing sidewalks and the lack of elevation for discharging deep drainage water ruled out traditional techniques for total removal of the soft soils, thus avoiding temporary drawdown and shoring of excavations. A structural layer was therefore designed to reinforce the subgrade by 'needling' the soft soil, without the need for longitudinal drains. The article presents the geological and geotechnical studies carried out, as well as the sizing of the reinforcement layer on the soft soil, with the main objective of allowing the civil works to be executed without interfering with the roads in use, enabling work during rainy periods, and providing a solution compatible with the drainage characteristics and the soft soil reinforcement.

Keywords: layer, reinforcement, soft soil, clover of north triage

Procedia PDF Downloads 226
8126 A New Prediction Model for Soil Compression Index

Authors: D. Mohammadzadeh S., J. Bolouri Bazaz

Abstract:

This paper presents a new prediction model for the compression index of fine-grained soils using the multi-gene genetic programming (MGGP) technique. The proposed model relates the soil compression index to its liquid limit, plastic limit, and void ratio. Several laboratory test results for fine-grained soils were used to develop the models. Various criteria were considered to check the validity of the model. Parametric and sensitivity analyses were performed and discussed. The MGGP method was found to be very effective for predicting the soil compression index. A comparative study was further performed to demonstrate the superiority of the MGGP model over existing soft computing and traditional empirical equations.
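
As a hedged illustration of the general approach (not the authors' MGGP implementation), the sketch below uses the gplearn library's single-tree symbolic regressor on synthetic data to evolve an explicit formula relating the compression index to the liquid limit, plastic limit, and void ratio; with real laboratory data, a multi-gene formulation would combine several such trees linearly.

```python
# Hedged sketch only: the paper uses multi-gene genetic programming (MGGP);
# here gplearn's single-tree symbolic regressor illustrates the same idea of
# evolving an explicit formula Cc = f(LL, PL, e0) from laboratory data.
import numpy as np
from gplearn.genetic import SymbolicRegressor

rng = np.random.default_rng(0)
LL = rng.uniform(25, 90, 300)          # liquid limit (%)   -- synthetic data
PL = rng.uniform(15, 40, 300)          # plastic limit (%)  -- synthetic data
e0 = rng.uniform(0.5, 1.6, 300)        # initial void ratio -- synthetic data
# synthetic "measured" compression index loosely following known correlations
Cc = 0.009 * (LL - 10) + 0.2 * (e0 - 0.5) + rng.normal(0, 0.02, 300)

X, y = np.column_stack([LL, PL, e0]), Cc
gp = SymbolicRegressor(population_size=1000, generations=20,
                       function_set=('add', 'sub', 'mul', 'div'),
                       parsimony_coefficient=0.001, random_state=0)
gp.fit(X, y)
print(gp._program)            # evolved symbolic expression
print(gp.predict(X[:5]))      # predicted Cc for the first five samples
```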

Keywords: new prediction model, soil compression index, multi-gene genetic programming, MGGP

Procedia PDF Downloads 375
8125 Going Horizontal: Confronting the Challenges When Transitioning to Cloud

Authors: Harvey Hyman, Thomas Hull

Abstract:

As one of the largest cancer treatment centers in the United States, we continuously confront the challenge of leveraging the best possible technological solutions in order to provide the highest quality of service to our customers: the doctors, nurses and patients at Moffitt who are fighting every day for the prevention and cure of cancer. This paper reports on the transition from a vertical to a horizontal IT infrastructure. We discuss how new frameworks and methods, such as public, private and hybrid cloud and the brokering of cloud services, are replacing the traditional vertical paradigm for computing. We also report on the impact of containers, microservices, and the shift to continuous integration/continuous delivery. These impacts and changes in delivery methodology for computing are driving how we accomplish our strategic IT goals across the enterprise.

Keywords: cloud computing, IT infrastructure, IT architecture, healthcare

Procedia PDF Downloads 380
8124 Estimation of Consolidating Settlement Based on a Time-Dependent Skin Friction Model Considering Column Surface Roughness

Authors: Jiang Zhenbo, Ishikura Ryohei, Yasufuku Noriyuki

Abstract:

Improvement of soft clay deposits by combining surface stabilization with floating-type cement-treated columns is one of the most popular techniques worldwide. On the basis of a one-dimensional consolidation model, a time-dependent skin friction model for column-soil interaction is proposed. The nonlinear relationship between the column shaft shear stress and the effective vertical pressure of the surrounding soil can be described by this model. The influence of column-soil surface roughness is represented by a roughness coefficient R, which plays an important role in the design of column length. Based on the homogenization method, part of the floating-type improved ground is treated as an unimproved portion, whose length αH1 is defined as a time-dependent equivalent skin friction length. The compression settlement of this unimproved portion can be predicted using the soft clay parameters alone. In addition to calculating the settlement of this composite ground, the load transfer mechanism is discussed using model tests. The proposed model is validated against calculations and laboratory results from model and ring shear tests, which indicate the suitability and accuracy of the solutions presented in this paper.
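
For orientation only, the sketch below applies the standard one-dimensional primary-consolidation settlement relation to a hypothetical unimproved soft-clay portion; the paper's time-dependent skin-friction model and roughness coefficient are not reproduced here, and all parameter values are assumptions.

```python
# Minimal sketch with hypothetical numbers: primary consolidation settlement of
# the unimproved soft-clay portion of thickness H_un, computed with the
# standard one-dimensional relation  s = Cc/(1+e0) * H_un * log10((p0+dp)/p0).
# (The paper's time-dependent skin-friction model itself is not reproduced here.)
from math import log10

Cc = 0.45        # compression index of the soft clay (assumed)
e0 = 1.2         # initial void ratio (assumed)
H_un = 3.0       # thickness of the unimproved portion alpha*H1, in m (assumed)
p0 = 60.0        # initial effective vertical stress at mid-depth, kPa (assumed)
dp = 40.0        # stress increment transferred below the columns, kPa (assumed)

settlement = Cc / (1.0 + e0) * H_un * log10((p0 + dp) / p0)
print(f"estimated primary consolidation settlement = {settlement*1000:.0f} mm")
```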

Keywords: floating type improved foundation, time-dependent skin friction, roughness, consolidation

Procedia PDF Downloads 468
8123 A Study on How to Link BIM Services to Cloud Computing Architecture

Authors: Kim Young-Jin, Kim Byung-Kon

Abstract:

Although more effort than ever has been devoted to expanding the application of BIM (Building Information Modeling) technologies in recent years, various challenges remain, including a lack or absence of relevant institutions, the high cost of building BIM-related infrastructure, and incompatible processes. This, in turn, has delayed the expansion of their application longer than expected at an early stage. In particular, attempts to reduce the cost of building BIM-related infrastructure and to provide various BIM services compatible with domestic processes include studies linking BIM and cloud computing technologies. In this study, the author develops a cloud BIM service operation model by analyzing the level of BIM application in the construction sector and deriving relevant service areas, and examines how to link BIM services to the cloud operation model, through archiving BIM data and creating a revenue structure, so that BIM services may grow spontaneously, considering the demand for cloud resources.

Keywords: construction IT, BIM (building information modeling), cloud computing, BIM service based cloud computing

Procedia PDF Downloads 487
8122 Effective Supply Chain Coordination with Hybrid Demand Forecasting Techniques

Authors: Gurmail Singh

Abstract:

An effective supply chain is a main priority of every organization and is the outcome of strategic corporate investment and deliberate management action. A value-driven supply chain is defined through development, procurement, and the configuration of appropriate resources, metrics, and processes. The responsiveness of the supply chain can be improved by proper coordination. Accordingly, the Bullwhip effect (BWE) and Net stock amplification (NSAmp) values were predicted and used for inventory control in organizations by both a discrete wavelet transform-artificial neural network (DWT-ANN) model and an adaptive network-based fuzzy inference system (ANFIS). This work presents a comparative forecasting methodology for customer demand, which is nonlinear in nature, for a multilevel supply chain structure using hybrid techniques, including artificial neural networks (ANN), the adaptive network-based fuzzy inference system (ANFIS), and discrete wavelet theory (DWT). The effectiveness of these forecasting models is demonstrated by computing the Bullwhip effect and Net stock amplification from real-world data. The results showed that these parameters were comparatively lower for the discrete wavelet transform-artificial neural network (DWT-ANN) model and the adaptive network-based fuzzy inference system (ANFIS).
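
A minimal sketch of the DWT-ANN idea, assuming synthetic demand data and a generic MLP rather than the authors' exact network: each training sample is a window of past demand values whose discrete wavelet coefficients serve as features for predicting the next value.

```python
# Hedged sketch of the DWT-ANN idea (not the authors' exact model): each
# training sample is a window of past demand values, transformed with a
# discrete wavelet decomposition, and an MLP learns to predict the next value.
import numpy as np
import pywt
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(1)
t = np.arange(600)
demand = 100 + 20 * np.sin(2 * np.pi * t / 52) + rng.normal(0, 5, t.size)  # synthetic

window = 16
def wavelet_features(w):
    coeffs = pywt.wavedec(w, 'db2', level=2)      # [cA2, cD2, cD1]
    return np.concatenate(coeffs)

X = np.array([wavelet_features(demand[i:i + window])
              for i in range(len(demand) - window)])
y = demand[window:]

split = 500
model = MLPRegressor(hidden_layer_sizes=(32, 16), max_iter=2000, random_state=0)
model.fit(X[:split], y[:split])
pred = model.predict(X[split:])
print("test MAE:", np.abs(pred - y[split:]).mean())
```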

Keywords: bullwhip effect, hybrid techniques, net stock amplification, supply chain flexibility

Procedia PDF Downloads 127
8121 DNA PLA: A Nano-Biotechnological Programmable Device

Authors: Hafiz Md. HasanBabu, Khandaker Mohammad Mohi Uddin, Md. IstiakJaman Ami, Rahat Hossain Faisal

Abstract:

Biomolecular computing is carried out through different types of reactions. Proteins and nucleic acids are used to store the information generated by biomolecular programming. DNA (deoxyribonucleic acid) can be used to build a molecular computing system and operating system because of its predictable molecular behavior. A DNA device has clear advantages over conventional devices when applied to problems that can be divided into separate, non-sequential tasks, because DNA strands can hold large amounts of data in memory and conduct multiple operations at once, solving decomposable problems much faster. A Programmable Logic Array (PLA) is a programmable device with configurable AND and OR operations. In this paper, a DNA PLA is designed from different molecular operations on DNA molecules using the proposed algorithms. The molecular PLA takes advantage of DNA's physical properties to store information and perform calculations, including extremely dense information storage, enormous parallelism, and extraordinary energy efficiency.
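
To make the AND/OR structure of a PLA concrete, the following toy simulation (ordinary software, not the DNA implementation) programs a two-term, two-output PLA and evaluates it; the planes and functions are arbitrary examples.

```python
# Illustrative sketch: a PLA computes sums of products.  The AND plane selects
# which inputs (or their complements) enter each product term, and the OR plane
# selects which product terms feed each output.  The DNA implementation in the
# paper realises these planes with molecular reactions; this toy model only
# shows the logic being programmed.
AND_PLANE = [            # one row per product term; entries over (x0, x1, x2)
    ('1', '1', '-'),     # term0 = x0 AND x1          ('-' = input not used)
    ('0', '-', '1'),     # term1 = (NOT x0) AND x2
]
OR_PLANE = [             # one row per output; which product terms are OR-ed
    (1, 1),              # y0 = term0 OR term1
    (0, 1),              # y1 = term1
]

def pla(inputs):
    terms = []
    for row in AND_PLANE:
        value = 1
        for bit, spec in zip(inputs, row):
            if spec == '1':
                value &= bit
            elif spec == '0':
                value &= 1 - bit
        terms.append(value)
    return [max(t for t, used in zip(terms, row) if used) if any(row) else 0
            for row in OR_PLANE]

print(pla([1, 1, 0]))   # -> [1, 0]
print(pla([0, 0, 1]))   # -> [1, 1]
```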

Keywords: biological systems, DNA computing, parallel computing, programmable logic array, PLA, DNA

Procedia PDF Downloads 129
8120 Navigating Cyber Attacks with Quantum Computing: Leveraging Vulnerabilities and Forensics for Advanced Penetration Testing in Cybersecurity

Authors: Sayor Ajfar Aaron, Ashif Newaz, Sajjat Hossain Abir, Mushfiqur Rahman

Abstract:

This paper examines the transformative potential of quantum computing in the field of cybersecurity, with a focus on advanced penetration testing and forensics. It explores how quantum technologies can be leveraged to identify and exploit vulnerabilities more efficiently than traditional methods and how they can enhance the forensic analysis of cyber-attacks. Through theoretical analysis and practical simulations, this study highlights the enhanced capabilities of quantum algorithms in detecting and responding to sophisticated cyber threats, providing a pathway for developing more resilient cybersecurity infrastructures.

Keywords: cybersecurity, cyber forensics, penetration testing, quantum computing

Procedia PDF Downloads 67
8119 Method and Apparatus for Optimized Job Scheduling in the High-Performance Computing Cloud Environment

Authors: Subodh Kumar, Amit Varde

Abstract:

Typical on-premises high-performance computing (HPC) environments consist of a fixed quantity and a fixed set of computing hardware. During the design of the HPC environment, the hardware components, including but not limited to CPU, memory, GPU, and networking, are carefully chosen from select vendors for optimal performance. The high capital cost of building the environment is a prime factor influencing its design. A class of software called “job schedulers” is critical to maximizing these resources and running multiple workloads to extract the maximum value from the high capital cost. In principle, schedulers work by preventing workloads and users from monopolizing the finite hardware resources, queuing the jobs in a workload. A cloud-based HPC environment does not have the limitations of fixed (type and quantity of) hardware resources; in theory, users and workloads could spin up any number and type of hardware resource. This paper discusses the limitations of using traditional scheduling algorithms for cloud-based HPC workloads. It proposes a new set of features, called “HPC optimizers,” for maximizing the benefits of the elasticity and scalability of the cloud, with the goal of cost-performance optimization of the workload.
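
A minimal sketch of the queuing behaviour described above, with hypothetical jobs and core counts; it shows how a fixed on-premises core pool forces later jobs to wait, which is the limitation the proposed "HPC optimizers" target (the optimizers themselves are not sketched here).

```python
# Hedged sketch of the queuing behaviour described above (not the proposed
# "HPC optimizers"): a fixed pool of cores is shared by queued jobs, so later
# jobs wait; in an elastic cloud the same jobs could each start immediately.
import heapq

TOTAL_CORES = 64
jobs = [            # (job_id, cores_needed, runtime_hours) -- hypothetical
    ("A", 32, 4), ("B", 48, 2), ("C", 16, 6), ("D", 64, 1),
]

free_cores = TOTAL_CORES
running = []        # heap of (finish_time, cores) for running jobs
clock = 0.0
for job_id, cores, runtime in jobs:          # FIFO queue
    # wait until enough cores are free
    while free_cores < cores:
        finish, freed = heapq.heappop(running)
        clock = max(clock, finish)
        free_cores += freed
    heapq.heappush(running, (clock + runtime, cores))
    free_cores -= cores
    print(f"job {job_id}: starts at t={clock:.0f}h on {cores} cores")
```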

Keywords: high performance computing, HPC, cloud computing, optimization, schedulers

Procedia PDF Downloads 93
8118 Crow Search Algorithm-Based Task Offloading Strategies for Fog Computing Architectures

Authors: Aniket Ganvir, Ritarani Sahu, Suchismita Chinara

Abstract:

The rapid digitization of many aspects of life is leading to the creation of smart IoT ecosystems, in which interconnected devices generate significant amounts of valuable data. However, these IoT devices face constraints such as limited computational resources and bandwidth. Cloud computing offers ample resources for offloading tasks, but it introduces latency, which is problematic for time-sensitive applications. Fog computing (FC) addresses these latency concerns by bringing computation and storage closer to the network edge, minimizing data travel distance and enhancing efficiency. Offloading tasks to fog nodes or the cloud can conserve energy and extend IoT device lifespan. The offloading process is intricate, with tasks categorized as full or partial, and its optimization is an NP-hard problem. Traditional greedy search methods struggle to handle the complexity of task offloading efficiently. To overcome this, the efficient crow search algorithm (ECSA), a meta-heuristic optimization algorithm, is proposed. ECSA aims to optimize computation offloading effectively, providing solutions to this challenging problem.
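
The sketch below is a simplified crow-search-style optimizer for the offloading decision, not the paper's ECSA: each crow encodes an assignment of tasks to fog nodes, the fitness is the resulting makespan, and the task sizes and node speeds are synthetic.

```python
# Hedged sketch of a crow-search-style optimizer for the offloading decision
# (a simplified CSA, not the paper's ECSA): each crow encodes an assignment of
# tasks to fog nodes, and the fitness is the resulting makespan.
import numpy as np

rng = np.random.default_rng(0)
task_size = rng.uniform(1, 10, 20)       # workload of 20 tasks (hypothetical)
node_speed = np.array([2.0, 3.0, 5.0])   # processing speed of 3 fog nodes
N_NODES, N_CROWS, ITERS = len(node_speed), 15, 200
AP, FL = 0.1, 2.0                        # awareness probability, flight length

def makespan(position):
    assign = np.clip(position.astype(int), 0, N_NODES - 1)
    return max(task_size[assign == n].sum() / node_speed[n] for n in range(N_NODES))

pos = rng.uniform(0, N_NODES, (N_CROWS, task_size.size))
mem = pos.copy()                                   # each crow's best-known position
mem_fit = np.array([makespan(p) for p in mem])

for _ in range(ITERS):
    for i in range(N_CROWS):
        j = rng.integers(N_CROWS)                  # crow i follows crow j
        if rng.random() >= AP:                     # j unaware: move toward j's memory
            new = pos[i] + rng.random() * FL * (mem[j] - pos[i])
        else:                                      # j aware: random exploration
            new = rng.uniform(0, N_NODES, task_size.size)
        pos[i] = np.clip(new, 0, N_NODES - 1e-9)
        f = makespan(pos[i])
        if f < mem_fit[i]:
            mem[i], mem_fit[i] = pos[i].copy(), f

best = mem[np.argmin(mem_fit)].astype(int)
print("best makespan:", mem_fit.min(), "assignment:", best)
```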

Keywords: IoT, fog computing, task offloading, efficient crow search algorithm

Procedia PDF Downloads 58
8117 Continuous Functions Modeling with Artificial Neural Network: An Improvement Technique to Feed the Input-Output Mapping

Authors: A. Belayadi, A. Mougari, L. Ait-Gougam, F. Mekideche-Chafa

Abstract:

The artificial neural network is one of the techniques that have been used advantageously to deal with modeling problems. In this study, computing with an artificial neural network (CANN) is proposed. The model is applied to the information processing of a one-dimensional task. We aim to introduce a new method based on a new coding approach for generating the input-output mapping, which relies on increasing the number of neuron units in the last layer. To show the efficiency of the approach under study, a comparison is made between the proposed method of generating the input-output set and the conventional method. The results show that increasing the number of neuron units in the last layer makes it possible to find the optimal network parameters that fit the mapping data. Moreover, it reduces the training time during the computation process, avoiding the need for computers with large memory.
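
For context, a minimal sketch of the conventional baseline setting, assuming a generic scikit-learn MLP fitting a noisy one-dimensional function; the paper's specific coding scheme with additional neuron units in the last layer is not reproduced.

```python
# Minimal sketch of the baseline setting (a conventional single-output network
# fitting a one-dimensional continuous function); the paper's specific coding
# scheme with extra neurons in the last layer is not reproduced here.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
x = np.linspace(-np.pi, np.pi, 400).reshape(-1, 1)
y = np.sin(x).ravel() + rng.normal(0, 0.02, x.shape[0])   # noisy target function

net = MLPRegressor(hidden_layer_sizes=(20, 20), activation='tanh',
                   max_iter=5000, random_state=0)
net.fit(x, y)
print("max absolute error:", np.max(np.abs(net.predict(x) - y)))
```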

Keywords: neural network computing, continuous functions generating the input-output mapping, decreasing the training time, machines with big memories

Procedia PDF Downloads 283
8116 Voting Representation in Social Networks Using Rough Set Techniques

Authors: Yasser F. Hassan

Abstract:

Social networking involves the use of online platforms or websites that enable people to communicate, usually for a social purpose, through a variety of services, most of which are web-based and allow people to interact over the internet, e.g. via e-mail and instant messaging. This work analyzes the voting behavior and ratings of judges on popular comments in social networks. While most of the party literature omits the electorate, this paper presents a model in which elites and parties are emergent consequences of the behavior and preferences of voters. Research in artificial intelligence and psychology has provided powerful illustrations of the way in which the emergence of intelligent behavior depends on the development of representational structure. As opposed to the classical voting system (one person, one decision, one vote), a new voting system is designed in which agents with opposed preferences are endowed with a given number of votes to distribute freely among a set of issues. The paper uses ideas from machine learning, artificial intelligence, and soft computing to provide a model of the development of a voting system response in a simulated agent. The modeled development process involves (simulated) processes of evolution, learning, and representation development. The main value of the model is that it illustrates how simple learning processes may lead to the formation of structure. We employ agent-based computer simulation to demonstrate the formation and interaction of coalitions that arise from individual voter preferences. We are interested in coordinating the local behavior of individual agents to produce an appropriate system-level behavior.

Keywords: voting system, rough sets, multi-agent, social networks, emergence, power indices

Procedia PDF Downloads 393
8115 Comprehensive Study of X-Ray Emission by APF Plasma Focus Device

Authors: M. Habibi

Abstract:

Time-resolved studies of soft and hard X-rays were carried out over a wide range of argon pressures, employing an array of eight filtered PIN photodiodes and a scintillation detector simultaneously. In 50% of the discharges, the soft X-rays are emitted in multiple short pulses corresponding to different compressions, whereas the hard X-rays appear as a single pulse corresponding only to the first strong compression. It should be noted that multiple compressions occur predominantly at low pressures, while high pressures mostly produce the single-compression regime. In 43% of the discharges, at all pressures except the optimum pressure, the first period is characterized by two or more sharp peaks. The X-ray signal intensity during the second and subsequent compressions is much smaller than during the first compression.

Keywords: plasma focus device, SXR, HXR, Pin-diode, argon plasma

Procedia PDF Downloads 408
8114 Automatic Number Plate Recognition System Based on Deep Learning

Authors: T. Damak, O. Kriaa, A. Baccar, M. A. Ben Ayed, N. Masmoudi

Abstract:

In the last few years, Automatic Number Plate Recognition (ANPR) systems have become widely used for safety, security, and commercial purposes. Accordingly, several methods and techniques compete to achieve better accuracy and real-time execution. This paper proposes a computer vision algorithm for Number Plate Localization (NPL) and Character Segmentation (CS). In addition, it proposes an improved method for Optical Character Recognition (OCR) based on Deep Learning (DL) techniques. To recognize the characters of the detected plate after the NPL and CS steps, a Convolutional Neural Network (CNN) is proposed. A DL model is developed with four convolution layers, two max-pooling layers, and six fully connected layers. The model was trained on a number-image database on the NVIDIA Jetson TX2 target. The achieved accuracy is 95.84%.
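
The following sketch assembles a character classifier with the layer counts quoted above (four convolution layers, two max-pooling layers, six fully connected layers); the input resolution, filter sizes, and class count are assumptions rather than the authors' exact configuration.

```python
# Hedged sketch of a character classifier with the layer counts quoted above
# (four convolution layers, two max-pooling layers, six fully connected
# layers); the exact filter sizes and input resolution are assumptions.
import torch
import torch.nn as nn

class PlateCharCNN(nn.Module):
    def __init__(self, n_classes: int = 10):   # 10 digits; more for letters
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                       # 32x32 -> 16x16
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(),
            nn.Conv2d(64, 64, 3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                       # 16x16 -> 8x8
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(64 * 8 * 8, 512), nn.ReLU(),
            nn.Linear(512, 256), nn.ReLU(),
            nn.Linear(256, 128), nn.ReLU(),
            nn.Linear(128, 64), nn.ReLU(),
            nn.Linear(64, 32), nn.ReLU(),
            nn.Linear(32, n_classes),
        )

    def forward(self, x):                          # x: (batch, 1, 32, 32)
        return self.classifier(self.features(x))

model = PlateCharCNN()
logits = model(torch.randn(4, 1, 32, 32))          # dummy batch of segmented chars
print(logits.shape)                                # torch.Size([4, 10])
```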

Keywords: ANPR, CS, CNN, deep learning, NPL

Procedia PDF Downloads 306
8113 An Approach for Coagulant Dosage Optimization Using Soft Jar Test: A Case Study of Bangkhen Water Treatment Plant

Authors: Ninlawat Phuangchoke, Waraporn Viyanon, Setta Sasananan

Abstract:

The most important process in a water treatment plant is coagulation using alum and poly-aluminum chloride (PACl), whose daily usage is worth a hundred thousand baht. Determining the proper dosages of alum and PACl is therefore essential for economical and high-quality water production. This research applies an artificial neural network (ANN) trained with the Levenberg–Marquardt algorithm to create a mathematical model (Soft Jar Test) for predicting the chemical doses used for coagulation, namely alum and PACl. The input data consist of turbidity, pH, alkalinity, conductivity, and oxygen consumption (OC) of the Bangkhen water treatment plant (BKWTP) of the Metropolitan Waterworks Authority. The data, collected from 1 January 2019 to 31 December 2019, cover the changing seasons of Thailand. The input data for the ANN are divided into three groups: a training set, a test set, and a validation set. The best model achieves a coefficient of determination of 0.73 and a mean absolute error of 3.18 for alum, and 0.59 and 3.21 for PACl, respectively.
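
A hedged sketch of the Soft Jar Test idea on synthetic data: a small one-hidden-layer network mapping the five water-quality inputs to a chemical dose, trained with the Levenberg–Marquardt algorithm via SciPy's least-squares routine; the real model was trained on BKWTP records.

```python
# Hedged sketch of the "Soft Jar Test" idea: a small one-hidden-layer network
# mapping (turbidity, pH, alkalinity, conductivity, OC) to a chemical dose,
# trained with the Levenberg-Marquardt algorithm via scipy's least_squares.
# All data here are synthetic; the real model was trained on BKWTP records.
import numpy as np
from scipy.optimize import least_squares

rng = np.random.default_rng(0)
X = rng.uniform(size=(300, 5))                       # scaled water-quality inputs
y = 20 + 60 * X[:, 0] + 5 * X[:, 1] + rng.normal(0, 2, 300)   # synthetic dose (mg/L)
y_std = (y - y.mean()) / y.std()

H, D = 8, X.shape[1]                                 # hidden units, input features
def forward(p, X):
    W1 = p[:H * D].reshape(H, D); b1 = p[H * D:H * D + H]
    w2 = p[H * D + H:H * D + 2 * H]; b2 = p[-1]
    return np.tanh(X @ W1.T + b1) @ w2 + b2

def residuals(p):
    return forward(p, X) - y_std

p0 = rng.normal(0, 0.3, H * D + 2 * H + 1)
fit = least_squares(residuals, p0, method='lm')      # Levenberg-Marquardt
pred = forward(fit.x, X) * y.std() + y.mean()
ss_res = np.sum((pred - y) ** 2); ss_tot = np.sum((y - y.mean()) ** 2)
print("R^2:", 1 - ss_res / ss_tot, "MAE:", np.abs(pred - y).mean())
```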

Keywords: soft jar test, jar test, water treatment plant process, artificial neural network

Procedia PDF Downloads 166
8112 Deflection Behaviour of Retaining Wall with Pile for Pipeline on Slope of Soft Soil

Authors: Mutadi

Abstract:

Pipes laid on an unstable slope of soft soil are prone to movement. Pipelines buried in unstable slope areas move under lateral loads from soil movement, which can damage the pipeline. A small-scale laboratory model of a reinforcement system of piles supported by retaining walls was tested to investigate the effect of lateral load on the reinforcement. In this experiment, lateral forces of 0.3 kN, 0.35 kN, and 0.4 kN and vertical forces of 0.05 kN, 0.1 kN, and 0.15 kN were used. The lateral load was applied by an electric jack equipped with a load cell, and the vertical load by a cement-steel box. To validate the experimental results, the two-dimensional finite element program Plaxis was used. The experimental results showed that the displacement of the reinforcement system increased with increasing lateral load. For a vertical load of 0.1 kN, a lateral load of 0.3 kN causes a horizontal displacement of 0.35 mm, which increases by 2.94% for a 0.35 kN load and by 8.82% for a 0.4 kN load. The pattern is the same in the finite element analysis, with a 6.52% increase for 0.35 kN loading and a 23.91% increase for 0.4 kN loading. Under the same loads, the reinforcement system is reliable, with safety factors of 3.3, 2.824, and 2.474 in dry conditions and 2.98, 2.522, and 2.235 in wet conditions.

Keywords: soft soil, deflection, wall, pipeline

Procedia PDF Downloads 163
8111 Numerical Static and Seismic Evaluation of Pile Group Settlement: A Case Study

Authors: Seyed Abolhassan Naeini, Hamed Yekehdehghan

Abstract:

Shallow foundations cannot be used when the bedding soil is soft. A suitable method for constructing foundations on soft soil is to employ pile groups to transfer the load to the bottom layers. The present research used results from tests carried out in northern Iran (Langarud) and the FLAC3D software to model a pile group for investigating the effects of various parameters on pile cap settlement under static and seismic conditions. According to the results, changes in the strength parameters of the soil, groundwater level, and the length of and distance between the piles affect settlement differently.

Keywords: FLACD 3D software, pile group, settlement, soil

Procedia PDF Downloads 128
8110 A Distributed Cryptographically Generated Address Computing Algorithm for Secure Neighbor Discovery Protocol in IPv6

Authors: M. Moslehpour, S. Khorsandi

Abstract:

Due to the shortage of IPv4 addresses, the transition to IPv6 has gained significant momentum in recent years. Like the Address Resolution Protocol (ARP) in IPv4, the Neighbor Discovery Protocol (NDP) provides functions such as address resolution in IPv6. Despite its functionality, NDP is vulnerable to several attacks. To mitigate these attacks, Internet Protocol Security (IPsec) was introduced, but it was not efficient due to its limitations. Therefore, the SEND protocol was proposed to automatically protect the auto-configuration process; it secures the neighbor discovery and address resolution processes. To defend against threats to NDP's integrity and identity, SEND uses Cryptographically Generated Addresses (CGA) and asymmetric cryptography. Besides the advantages of SEND, its disadvantages, such as the computational cost of the CGA algorithm and the sequential nature of CGA generation, are considerable. In this paper, we parallelize this process across network resources in order to improve it. In addition, we compare the CGA generation time between self-computing and distributed-computing processes, focusing on the impact of malicious nodes on CGA generation time in the network. According to the results, even when malicious nodes participate in the generation process, the CGA generation time is less than when it is computed on a single node. With a trust management system, detecting and isolating malicious nodes becomes easier.
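
As a simplified illustration of the CGA modifier search (loosely following RFC 3972, with details abbreviated), the sketch below shows why the computation is costly and how the modifier space can be partitioned across workers for distributed computing; the public key is a random stand-in.

```python
# Simplified sketch of the CGA modifier search (loosely following RFC 3972,
# details abbreviated): SHA-1 over (modifier || zero bytes || public key) must
# start with 16*Sec zero bits.  Distributing the search is a matter of giving
# each node a disjoint slice of the modifier space.
import hashlib
import os

def find_modifier(public_key: bytes, sec: int, start: int, step: int) -> int:
    """Search modifiers start, start+step, start+2*step, ... (one worker's slice)."""
    zero_bits = 16 * sec
    modifier = start
    while True:
        digest = hashlib.sha1(modifier.to_bytes(16, 'big') + b'\x00' * 9
                              + public_key).digest()
        value = int.from_bytes(digest, 'big')
        if value >> (160 - zero_bits) == 0:          # leftmost 16*Sec bits are zero
            return modifier
        modifier += step

public_key = os.urandom(64)                           # stand-in for an encoded key
# Self-computing: one node scans every modifier (start=0, step=1).
print(find_modifier(public_key, sec=1, start=0, step=1))
# Distributed-computing: worker k of N scans start=k, step=N, e.g. for 4 workers:
# find_modifier(public_key, sec=1, start=k, step=4)
```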

Keywords: NDP, IPsec, SEND, CGA, modifier, malicious node, self-computing, distributed-computing

Procedia PDF Downloads 278
8109 Critical Core Skills Profiling in the Singaporean Workforce

Authors: Bi Xiao Fang, Tan Bao Zhen

Abstract:

Soft skills, core competencies, and generic competencies are interchangeable terms often used to represent a similar concept. In the Singapore context, such skills are currently referred to as Critical Core Skills (CCS). In 2019, SkillsFuture Singapore (SSG) reviewed the Generic Skills and Competencies (GSC) framework first introduced in 2016, culminating in the development of the Critical Core Skills (CCS) framework comprising 16 soft skills classified into three clusters. The CCS framework is part of the Skills Framework, whose stated purpose is to create a common skills language for individuals, employers, and training providers. It is also developed with the objectives of building deep skills for a lean workforce, enhancing business competitiveness, and supporting employment and employability. This further helps facilitate skills recognition and support the design of training programs for skills and career development. According to SSG, every job role requires a set of technical skills and a set of Critical Core Skills to perform well at work, where technical skills refer to skills required to perform the key tasks of the job. There has been increasing emphasis on soft skills for the future of work. A recent study involving approximately 80 organizations across 28 sectors in Singapore revealed that more enterprises are beginning to recognize that soft skills support their employees' performance and business competitiveness. Though CCS is of high importance for the development of the workforce's employability, little attention has been paid to CCS use and profiling across occupations. A better understanding of how CCS is distributed across the economy will thus significantly enhance SSG's career guidance services as well as training providers' services to graduates and workers, and guide organizations in hiring for soft skills. This CCS profiling study sought to understand how CCS is demanded in different occupations. To achieve its research objectives, this study adopted a quantitative method to measure CCS use across different occupations in the Singaporean workforce. Based on the CCS framework developed by SSG, the research team adopted a formative approach to developing a CCS profiling tool that measures the importance of, and self-efficacy in, the use of CCS among the Singaporean workforce. Drawing on survey results from 2,500 participants, the study profiled them into seven occupation groups based on different patterns of importance and confidence in the use of CCS. Each occupation group is labeled according to the most salient and demanded CCS. The CCS in each occupation group that may need further strengthening were also identified. The profiling of CCS use has significant implications for different stakeholders; for example, employers could leverage the profiling results to hire staff with the soft skills demanded by the job.

Keywords: employability, skills profiling, skills measurement, soft skills

Procedia PDF Downloads 95
8108 Application of Optimization Techniques in Overcurrent Relay Coordination: A Review

Authors: Syed Auon Raza, Tahir Mahmood, Syed Basit Ali Bukhari

Abstract:

In a power system, a properly coordinated protection scheme is designed to ensure that only the faulty part of the system is isolated when an abnormal operating condition occurs. The complexity of the system, together with increased user demand and the deregulated environment, compels utilities to improve system reliability by using a properly coordinated protection scheme. This paper presents an overview of overcurrent relay coordination techniques. Different techniques, such as deterministic techniques, meta-heuristic optimization techniques, hybrid optimization techniques, and trial-and-error optimization techniques, are reviewed in terms of their implementation method, operation modes, the nature of the distribution system, and their advantages and disadvantages.
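
As a worked example of the quantities such coordination studies tune, the sketch below evaluates the IEC standard-inverse IDMT characteristic, which relates relay operating time to the Plug Setting Multiplier (PSM) and Time Multiplier Setting (TMS); the current values are illustrative, not taken from the paper.

```python
# Worked example of the quantities coordinated in such schemes: the IEC
# standard-inverse IDMT characteristic relates operating time to the Plug
# Setting Multiplier (PSM) and Time Multiplier Setting (TMS).  Values below
# are illustrative, not taken from the paper.
def idmt_operating_time(psm: float, tms: float) -> float:
    """IEC 60255 standard inverse: t = TMS * 0.14 / (PSM^0.02 - 1)."""
    return tms * 0.14 / (psm ** 0.02 - 1.0)

fault_current, pickup_current = 2000.0, 400.0      # amps (illustrative)
psm = fault_current / pickup_current               # PSM = 5
for tms in (0.1, 0.2, 0.3):
    print(f"TMS={tms}: trip time = {idmt_operating_time(psm, tms):.2f} s")
```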

Keywords: distribution system, relay coordination, optimization, Plug Setting Multiplier (PSM)

Procedia PDF Downloads 399
8107 Accelerating Side Channel Analysis with Distributed and Parallelized Processing

Authors: Kyunghee Oh, Dooho Choi

Abstract:

Even when a cryptographic algorithm has no theoretical weakness, side channel analysis can recover secret data from the physical implementation of a cryptosystem. The analysis is based on extra information such as timing, power consumption, electromagnetic leaks, or even sound, which can be exploited to break the system. Differential Power Analysis (DPA) is one of the most popular of these analyses; it computes statistical correlations between hypothesized secret keys and power consumption. It usually requires processing huge amounts of data and takes a long time; for some devices with countermeasures it may take several weeks. We propose and evaluate methods to shorten the time needed to analyze cryptosystems. Our methods include distributed computing and parallelized processing.
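
A minimal sketch of the parallelization idea on a toy example (not the paper's setup): a correlation-style power analysis at a single leakage sample, with the 256 key guesses split across worker processes; the leakage model is deliberately simplified.

```python
# Hedged sketch of the idea: a toy correlation-style power analysis on a
# single leakage sample, with the 256 key guesses split across processes.
# The leakage model (Hamming weight of plaintext XOR key) is deliberately
# simplified and is not the attack or the cryptosystem used in the paper.
import numpy as np
from concurrent.futures import ProcessPoolExecutor

HW = np.array([bin(v).count("1") for v in range(256)])   # Hamming-weight table
rng = np.random.default_rng(0)
TRUE_KEY = 0x3C
plaintexts = rng.integers(0, 256, 5000)
traces = HW[plaintexts ^ TRUE_KEY] + rng.normal(0, 1.0, plaintexts.size)  # leakage

def best_guess_in_chunk(guesses):
    """Return (|correlation|, guess) for the best key guess in this chunk."""
    best = (-1.0, -1)
    for k in guesses:
        hypo = HW[plaintexts ^ k]
        corr = abs(np.corrcoef(hypo, traces)[0, 1])
        best = max(best, (corr, k))
    return best

if __name__ == "__main__":
    chunks = np.array_split(np.arange(256), 4)            # 4 workers
    with ProcessPoolExecutor(max_workers=4) as pool:
        results = list(pool.map(best_guess_in_chunk, chunks))
    corr, key = max(results)
    print(f"recovered key guess: {hex(int(key))} (|r|={corr:.3f})")
```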

Keywords: DPA, distributed computing, parallelized processing, side channel analysis

Procedia PDF Downloads 427
8106 A Study on How to Develop the Usage Metering Functions of BIM (Building Information Modeling) Software under Cloud Computing Environment

Authors: Kim Byung-Kon, Kim Young-Jin

Abstract:

As project opportunities in the Architecture, Engineering and Construction (AEC) industry have grown larger and more complex, the use of BIM (Building Information Modeling) technologies for 3D design and simulation practices has increased significantly. Typical applications of BIM technologies include clash detection and design alternatives based on 3D planning, which have been extended to construction management in the AEC industry for virtual design and construction. To date, commercial BIM software has operated in a single-user environment, which is why the initial costs of its introduction are very high. Cloud computing, one of the most promising next-generation Internet technologies, enables simple Internet devices to use the services and resources provided with BIM software. Recently in Korea, studies linking BIM and cloud computing technologies have been directed toward reducing the cost of building BIM-related infrastructure and providing various BIM services for small- and medium-sized enterprises (SMEs). This study addresses how to develop the usage metering functions of BIM software under a cloud computing architecture in order to archive and use BIM data and create an optimal revenue structure so that BIM services may grow spontaneously, considering the demand for cloud resources. To this end, the author surveyed relevant cases and then analyzed needs and requirements from the AEC industry. Based on the results and findings of this survey and analysis, the author proposes how to optimally develop the usage metering functions of cloud BIM software.

Keywords: construction IT, BIM (Building Information Modeling), cloud computing, BIM-based cloud computing, 3D design, cloud BIM

Procedia PDF Downloads 506
8105 Development of a Laboratory Laser-Produced Plasma “Water Window” X-Ray Source for Radiobiology Experiments

Authors: Daniel Adjei, Mesfin Getachew Ayele, Przemyslaw Wachulak, Andrzej Bartnik, Luděk Vyšín, Henryk Fiedorowicz, Inam Ul Ahad, Lukasz Wegrzynski, Anna Wiechecka, Janusz Lekki, Wojciech M. Kwiatek

Abstract:

Laser-produced plasma light sources emitting high-intensity X-ray pulses that deliver high doses are useful for understanding the mechanisms of high-dose effects on biological samples. In this study, a desktop laser plasma soft X-ray source developed for radiobiology research is presented. The source is based on a double-stream gas puff target irradiated with a commercial Nd:YAG laser (EKSPLA), which generates laser pulses of 4 ns duration and energy up to 800 mJ at a 10 Hz repetition rate. The source has been optimized for maximum emission in the "water window" wavelength range from 2.3 nm to 4.4 nm by using pure gases (argon, nitrogen and krypton) and spectral filtering. Results of the source characterization measurements and dosimetry of the produced soft X-ray radiation are shown and discussed. The high brightness of the laser-produced plasma soft X-ray source and the low penetration depth of the radiation in biological specimens allow a high dose of over 28 Gy per shot to be delivered to the specimen, corresponding to 280 Gy/s at the maximum repetition rate of the laser system. The source has a unique capability for irradiating cells with high pulse doses both in vacuum and in a helium environment. Demonstration of the source's ability to induce DNA double- and single-strand breaks will be discussed.

Keywords: laser produced plasma, soft X-rays, radiobiology experiments, dosimetry

Procedia PDF Downloads 587
8104 ACO-TS: an ACO-based Algorithm for Optimizing Cloud Task Scheduling

Authors: Fahad Y. Al-dawish

Abstract:

A large number of organizations and individuals are currently moving to cloud computing, which many consider a significant shift in the field of computing. Cloud computing environments are distributed and parallel systems consisting of a collection of interconnected physical and virtual machines. With the increasing demand for and benefit of cloud computing infrastructure, diverse computing processes can be executed in the cloud environment. Many organizations and individuals around the world depend on cloud computing infrastructure to host their applications, platforms, and infrastructure. One of the major and essential issues in this environment is allocating incoming tasks to suitable virtual machines (cloud task scheduling). Cloud task scheduling is classified as an optimization problem, and several meta-heuristic algorithms have been proposed to solve and optimize it. A good task scheduler should adapt its scheduling technique to the changing environment and the types of incoming task sets. In this research project, a cloud task scheduling methodology based on the ant colony optimization (ACO) algorithm, called ACO-TS (Ant Colony Optimization for Task Scheduling), is proposed and compared with different scheduling algorithms (Random, First Come First Serve (FCFS), and Fastest Processor to the Largest Task First (FPLTF)). Ant colony optimization is a stochastic optimization search method used here to assign incoming tasks to available virtual machines (VMs). The main role of the proposed algorithm is to minimize the makespan of a given task set and maximize resource utilization by balancing the load among the virtual machines. The proposed scheduling algorithm was evaluated using the CloudSim toolkit framework. After analyzing and evaluating the experimental results, we find that the proposed ACO-TS algorithm performs better than the Random, FCFS, and FPLTF algorithms in terms of both makespan and resource utilization.
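
A hedged sketch of the ACO idea for task-to-VM assignment on synthetic data (not the paper's CloudSim implementation): ants build assignments guided by pheromone and an earliest-finish heuristic, and pheromone is reinforced along the best (lowest-makespan) assignment found.

```python
# Hedged sketch of the ACO-TS idea (not the paper's exact implementation):
# ants build task-to-VM assignments guided by pheromone and a heuristic that
# favours faster, less-loaded VMs; pheromone is reinforced on the best
# (lowest-makespan) assignment found.
import numpy as np

rng = np.random.default_rng(0)
task_len = rng.uniform(500, 3000, 30)         # task lengths in MI (synthetic)
vm_mips = np.array([500.0, 1000.0, 1500.0, 2000.0])
N_TASKS, N_VMS = task_len.size, vm_mips.size
ANTS, ITERS, ALPHA, BETA, RHO, Q = 20, 100, 1.0, 2.0, 0.1, 100.0

pheromone = np.ones((N_TASKS, N_VMS))
best_assign, best_makespan = None, np.inf

for _ in range(ITERS):
    for _ant in range(ANTS):
        load = np.zeros(N_VMS)
        assign = np.empty(N_TASKS, dtype=int)
        for t in range(N_TASKS):
            finish = (load + task_len[t]) / vm_mips        # heuristic: earliest finish
            desirability = pheromone[t] ** ALPHA * (1.0 / finish) ** BETA
            probs = desirability / desirability.sum()
            vm = rng.choice(N_VMS, p=probs)
            assign[t] = vm
            load[vm] += task_len[t]
        makespan = (load / vm_mips).max()
        if makespan < best_makespan:
            best_makespan, best_assign = makespan, assign.copy()
    pheromone *= (1.0 - RHO)                               # evaporation
    pheromone[np.arange(N_TASKS), best_assign] += Q / best_makespan   # deposit

print("best makespan (s):", round(best_makespan, 2))
print("tasks per VM:", np.bincount(best_assign, minlength=N_VMS))
```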

Keywords: cloud task scheduling, ant colony optimization (ACO), CloudSim, cloud computing

Procedia PDF Downloads 421
8103 Analyzing the Effectiveness of Different Testing Techniques in Ensuring Software Quality

Authors: R. M. P. C. Bandara, M. L. L. Weerasinghe, K. T. C. R. Kumari, A. G. D. R. Hansika, D. I. De Silva, D. M. T. H. Dias

Abstract:

Software testing is an essential process in software development that aims to identify defects and ensure that software is functioning as intended. Various testing techniques are employed to achieve this goal, but the effectiveness of these techniques varies. This research paper analyzes the effectiveness of different testing techniques in ensuring software quality. The paper explores different testing techniques, including manual and automated testing, and evaluates their effectiveness in terms of identifying defects, reducing the number of defects in software, and ensuring that software meets its functional and non-functional requirements. Moreover, the paper will also investigate the impact of factors such as testing time, test coverage, and testing environment on the effectiveness of these techniques. This research aims to provide valuable insights into the effectiveness of different testing techniques, enabling software development teams to make informed decisions about the testing approach that is best suited to their needs. By improving testing techniques, the number of defects in software can be reduced, enhancing the quality of software and ultimately providing better software for users.

Keywords: software testing life cycle, software testing techniques, software testing strategies, effectiveness, software quality

Procedia PDF Downloads 84
8102 A Type-2 Fuzzy Model for Link Prediction in Social Network

Authors: Mansoureh Naderipour, Susan Bastani, Mohammad Fazel Zarandi

Abstract:

Predicting links that may occur in the future, as well as missing links, in social networks is an attractive problem in social network analysis. Granular computing can help us model the relationships between human-based systems and the social sciences in this field. In this paper, we present a model based on a granular computing approach and Type-2 fuzzy logic to predict links based on nodes' activity and the relationship between two nodes. Our model is tested on collaboration networks. The accuracy of prediction is found to be significantly higher than that of Type-1 fuzzy and crisp approaches.

Keywords: social network, link prediction, granular computing, type-2 fuzzy sets

Procedia PDF Downloads 326
8101 IT Skills and Soft Skills for Accountants in Thailand

Authors: Manirath Wongsim

Abstract:

Information technology management has become important for the success of organisations. The increasing pace of technological change has revolutionised the way accountants perform their jobs. In response to this challenge, a new comprehensive set of information technology competencies, combining information technology skills and other skills (namely, soft skills), needs to be identified. This study therefore investigates IT competencies among professional accountants as a means of enhancing firm performance. The research was conducted with 42 respondents at ten organisations in Thailand, using qualitative, interpretive evidence. The results identify 19 factors of IT competencies within organisational issues. Specifically, the new factors, based on the research findings and the literature and unique to IT competencies for professional accountants, include ERP software skills, BI software skills, and accounting law and legal skills. The evidence suggests that ERP software, spreadsheets, BI software, and accounting software are much-needed skills for accountants to acquire, while communication skills are the most required and delegation skills the least required. The findings suggest that organisations should understand how to develop information-technology-related competencies for knowledge workers in general and professional accountants in particular, and should provide assistance in all decision-making processes.

Keywords: IT competencies, IT competencies for accountants, IT skills for accounting, soft skills for accountants

Procedia PDF Downloads 415
8100 Using High Performance Computing for Online Flood Monitoring and Prediction

Authors: Stepan Kuchar, Martin Golasowski, Radim Vavrik, Michal Podhoranyi, Boris Sir, Jan Martinovic

Abstract:

The main goal of this article is to describe the online flood monitoring and prediction system Floreon+, developed primarily for the Moravian-Silesian region in the Czech Republic, and the basic process it uses for running automatic rainfall-runoff and hydrodynamic simulations along with their calibration and uncertainty modeling. Executing such a process sequentially takes a long time, which is not acceptable in the online scenario, so the use of a high-performance computing environment is proposed for all parts of the process to shorten their duration. Finally, a case study on the Ostravice river catchment is presented that shows actual durations and the speedup gained from the parallel implementation.
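
A minimal sketch of the parallelization pattern, assuming a toy linear-reservoir model in place of the real rainfall-runoff model: independent simulation runs (e.g. calibration or uncertainty candidates) are distributed over worker processes.

```python
# Hedged sketch of the parallelization pattern (not the Floreon+ code): many
# rainfall-runoff simulations with different parameter sets, e.g. for
# calibration or uncertainty modeling, are independent and can be distributed
# over worker processes.  A toy linear-reservoir model stands in for the
# real hydrologic model.
import numpy as np
from multiprocessing import Pool

rng = np.random.default_rng(0)
rainfall = rng.gamma(shape=0.4, scale=5.0, size=240)    # hourly rainfall (synthetic)

def simulate(params):
    """Toy linear-reservoir rainfall-runoff model: one calibration candidate."""
    k, runoff_coeff = params
    storage, flows = 0.0, []
    for p in rainfall:
        storage += runoff_coeff * p
        q = storage / k                                  # outflow proportional to storage
        storage -= q
        flows.append(q)
    return max(flows)                                    # peak discharge for this run

if __name__ == "__main__":
    candidates = [(k, c) for k in (5, 10, 20, 40) for c in (0.3, 0.5, 0.7)]
    with Pool(processes=4) as pool:
        peaks = pool.map(simulate, candidates)           # runs execute in parallel
    for (k, c), peak in zip(candidates, peaks):
        print(f"k={k:>2}, C={c}: peak flow {peak:.2f}")
```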

Keywords: flood prediction process, high performance computing, online flood prediction system, parallelization

Procedia PDF Downloads 493