Search results for: evolved bat algorithm
2237 Optimizing CNC Production Line Efficiency Using NSGA-II: Adaptive Layout and Operational Sequence for Enhanced Manufacturing Flexibility
Authors: Yi-Ling Chen, Dung-Ying Lin
Abstract:
In the manufacturing process, computer numerical control (CNC) machining plays a crucial role. CNC enables precise machinery control through computer programs, achieving automation in the production process and significantly enhancing production efficiency. However, traditional CNC production lines often require manual intervention for loading and unloading operations, which limits the production line's operational efficiency and production capacity. Additionally, existing CNC automation systems frequently lack sufficient intelligence and fail to achieve optimal configuration efficiency, so substantial time is needed to reconfigure production lines when producing different products, thereby impacting overall production efficiency. Using the NSGA-II algorithm, we generate production line layout configurations that consider field constraints and select robotic arm specifications from an arm list. This allows us to calculate loading and unloading times for each job order, perform demand allocation, and assign processing sequences. The NSGA-II algorithm is further employed to determine the optimal processing sequence, with the aim of minimizing demand completion time and maximizing average machine utilization. These objectives are used to evaluate the performance of each layout, ultimately determining the optimal layout configuration. This method enhances the configuration efficiency of CNC production lines and establishes an adaptive capability that allows the production line to respond promptly to changes in demand. It thereby minimizes production losses caused by the need to reconfigure the layout, ensuring that the CNC production line can maintain optimal efficiency even when adjustments are required due to fluctuating demands.
Keywords: evolutionary algorithms, multi-objective optimization, pareto optimality, layout optimization, operations sequence
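The selection step at the heart of NSGA-II is non-dominated sorting over the competing objectives (here, completion time to minimize and utilization to maximize). As a minimal sketch with hypothetical layout scores (not the paper's actual data), the first Pareto front can be extracted like this:

```python
def dominates(a, b):
    """True if objective vector a Pareto-dominates b (all objectives minimized)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(candidates):
    """Return the non-dominated subset, i.e. NSGA-II's first front."""
    return [c for c in candidates
            if not any(dominates(o, c) for o in candidates if o is not c)]

# Hypothetical layout candidates scored as (completion time in h, -utilization);
# utilization is negated so that both objectives are minimized.
layouts = [(12.0, -0.70), (10.5, -0.62), (11.0, -0.75), (13.0, -0.60)]
front = pareto_front(layouts)
```

The full algorithm additionally ranks later fronts and applies crowding-distance selection; this sketch only shows the dominance test that drives the layout comparison.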
Procedia PDF Downloads 18
2236 Reinforcement-Learning Based Handover Optimization for Cellular Unmanned Aerial Vehicles Connectivity
Authors: Mahmoud Almasri, Xavier Marjou, Fanny Parzysz
Abstract:
The demand for services provided by Unmanned Aerial Vehicles (UAVs) is increasing pervasively across several sectors, including public safety, economic, and delivery services. As the number of applications using UAVs grows rapidly, increasingly powerful, quality-of-service-aware, and power-efficient computing units are necessary. Recently, cellular technology has drawn more attention as a means of ensuring reliable and flexible communications services for UAVs. In cellular networks, flying at high speed and altitude is subject to several key challenges, such as frequent handovers (HOs), high interference levels, connectivity coverage holes, etc. Unnecessary HOs may lead to “ping-pong” between the UAVs and the serving cells, degrading the quality of service and increasing energy consumption. In order to optimize the number of HOs, we develop in this paper a Q-learning-based algorithm. While existing works focus on adjusting the number of HOs in a static network topology, we take into account the impact of cell deployment for three different simulation scenarios (rural, semi-rural, and urban areas). We also consider the impact of the decision distance, at which the drone has the choice to make a switching decision. Our results show that a Q-learning-based algorithm significantly reduces the average number of HOs compared to a baseline case in which the drone always selects the cell with the highest received signal. Moreover, we also identify which hyper-parameters have the largest impact on the number of HOs in the three tested environments, i.e. rural, semi-rural, or urban.
Keywords: drones connectivity, reinforcement learning, handovers optimization, decision distance
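The core of any Q-learning scheme like the one described is the tabular temporal-difference update. The sketch below uses a hypothetical two-state, two-action formulation (not the paper's actual state space) in which each handover carries a penalty, so the learned values discourage ping-pong switching:

```python
def q_update(Q, state, action, reward, next_state, alpha=0.1, gamma=0.9):
    """One tabular Q-learning step: Q(s,a) += alpha * (TD target - Q(s,a))."""
    best_next = max(Q[next_state].values())
    Q[state][action] += alpha * (reward + gamma * best_next - Q[state][action])

# Hypothetical toy formulation: at each decision distance the drone either
# keeps its serving cell ("stay") or switches ("handover"); every handover
# is penalised, so the values learned over many episodes suppress ping-pong.
states = ["good_signal", "weak_signal"]
actions = ["stay", "handover"]
Q = {s: {a: 0.0 for a in actions} for s in states}
q_update(Q, "good_signal", "handover", reward=-1.0, next_state="good_signal")
```

After this single update, the handover action in the good-signal state has a lower value than staying, which is the direction the full algorithm exploits over many episodes.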
Procedia PDF Downloads 108
2235 Breast Cancer Metastasis Detection and Localization through Transfer-Learning Convolutional Neural Network Classification Based on Convolutional Denoising Autoencoder Stack
Authors: Varun Agarwal
Abstract:
Introduction: With the advent of personalized medicine, histopathological review of whole slide images (WSIs) for cancer diagnosis presents an exceedingly time-consuming, complex task. Specifically, detecting metastatic regions in WSIs of sentinel lymph node biopsies necessitates a holistic evaluation of the fully scanned image. Thus, digital pathology, low-level image manipulation algorithms, and machine learning provide significant advancements in improving the efficiency and accuracy of WSI analysis. Using Camelyon16 data, this paper proposes a deep learning pipeline to automate and ameliorate breast cancer metastasis localization and WSI classification. Methodology: The model broadly follows five stages: region of interest detection, WSI partitioning into image tiles, convolutional neural network (CNN) image-segment classification, probabilistic mapping of tumor localizations, and further processing for whole-WSI classification. Transfer learning is applied to the task, with the implementation of Inception-ResNetV2, an effective CNN classifier that uses residual connections to enhance feature representation, adding the convolved outputs of each inception unit back to its input. Moreover, in order to augment the performance of the transfer-learning CNN, a stack of convolutional denoising autoencoders (CDAE) is applied to produce embeddings that enrich image representation. Through a saliency-detection algorithm, visual training segments are generated, which are then processed through a denoising autoencoder (primarily consisting of convolutional, leaky rectified linear unit, and batch normalization layers) and subsequently a contrast-normalization function. A spatial pyramid pooling algorithm extracts the key features from the processed image, creating a viable feature map for the CNN that minimizes spatial resolution and noise.
Results and Conclusion: The simplified and effective architecture of the fine-tuned transfer-learning Inception-ResNetV2 network enhanced with the CDAE stack yields state-of-the-art performance in WSI classification and tumor localization, achieving AUC scores of 0.947 and 0.753, respectively. The convolutional feature retention and compilation with the residual connections to inception units, synergized with the input-denoising algorithm, enable the pipeline to serve as an effective, efficient tool in the histopathological review of WSIs.
Keywords: breast cancer, convolutional neural networks, metastasis mapping, whole slide images
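The spatial pyramid pooling step mentioned in the methodology produces a fixed-length feature vector regardless of the input tile size, by max-pooling the feature map over progressively finer grids. A minimal sketch on a toy 2D map (the real pipeline operates on multi-channel CNN feature maps, and the pyramid levels here are assumed):

```python
def spp_max(feature_map, levels=(1, 2)):
    """Max-pool a 2D feature map over pyramid grids, yielding a fixed-length vector."""
    h, w = len(feature_map), len(feature_map[0])
    out = []
    for n in levels:                 # an n x n grid at this pyramid level
        for i in range(n):
            for j in range(n):
                rows = range(i * h // n, (i + 1) * h // n)
                cols = range(j * w // n, (j + 1) * w // n)
                out.append(max(feature_map[r][c] for r in rows for c in cols))
    return out

fmap = [[1, 2, 3, 4],
        [5, 6, 7, 8],
        [9, 10, 11, 12],
        [13, 14, 15, 16]]
vec = spp_max(fmap)   # length 1 + 4 = 5 regardless of the map's spatial size
```

Because the output length depends only on the pyramid levels, the downstream classifier can accept tiles of varying resolution.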
Procedia PDF Downloads 129
2234 Artificial Intelligence Based Online Monitoring System for Cardiac Patient
Authors: Syed Qasim Gilani, Muhammad Umair, Muhammad Noman, Syed Bilawal Shah, Aqib Abbasi, Muhammad Waheed
Abstract:
Cardiovascular diseases (CVDs) are the major cause of death in the world. The main reason for these deaths is the unavailability of first aid for heart failure; in many cases, patients die before reaching the hospital. In this paper, we present an innovative online health service for cardiac patients. The proposed online health system has two ends. Through a device developed by us, users can communicate with their doctor via a mobile application. This interface provides them with first aid and also gives them an easy channel to their doctors for obtaining medical advice. As part of the proposed system, we developed a device called Cardiac Care. Cardiac Care is a portable device which a patient can use at home for monitoring heart condition. When a patient checks his/her heart condition, the electrocardiogram (ECG), blood pressure (BP), and temperature are sent to the central database. The severity of the patient's condition is assessed at the database using an artificial intelligence algorithm. If the patient is suffering from a minor problem, our algorithm will suggest a prescription. But if the patient's condition is severe, the patient's record is sent to a doctor through the mobile Android application. The doctor, after reviewing the patient's condition, suggests the next step. If the doctor identifies the patient's condition as critical, a message is sent to the central database to dispatch an ambulance, which starts moving towards the patient to bring him/her to hospital. We have implemented this model at the prototype level. This model will be life-saving for millions of people around the globe, as patients will be in contact with their doctors at all times.
Keywords: cardiovascular disease, classification, electrocardiogram, blood pressure
Procedia PDF Downloads 183
2233 Improvement of Data Transfer over Simple Object Access Protocol (SOAP)
Authors: Khaled Ahmed Kadouh, Kamal Ali Albashiri
Abstract:
This paper presents an algorithm designed to improve the transfer of data over the Simple Object Access Protocol (SOAP). The aim of this work is to establish whether using SOAP to exchange XML messages has any added advantages or not. The results showed that XML messages without SOAP take longer and consume more memory, especially with binary data.
Keywords: JAX-WS, SMTP, SOAP, web service, XML
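The mechanics behind such comparisons can be sketched with the standard library: binary data must be base64-encoded before it can travel inside XML (inflating it by roughly 4/3), and the SOAP envelope adds a further fixed overhead on top of the bare payload. This is an illustrative sketch, not the authors' benchmark:

```python
import base64
import xml.etree.ElementTree as ET

SOAP_NS = "{http://schemas.xmlsoap.org/soap/envelope/}"

def soap_wrap(payload_xml):
    """Wrap a payload element in a minimal SOAP 1.1 envelope (sketch)."""
    env = ET.Element(SOAP_NS + "Envelope")
    body = ET.SubElement(env, SOAP_NS + "Body")
    body.append(payload_xml)
    return ET.tostring(env)

binary = bytes(range(256)) * 4          # 1024 raw bytes of sample binary data
payload = ET.Element("data")
payload.text = base64.b64encode(binary).decode("ascii")
bare = ET.tostring(payload)             # payload serialized without SOAP
soap = soap_wrap(payload)               # same payload inside an envelope
overhead = len(soap) - len(bare)        # envelope bytes added on top
```

Measuring `len(bare)`, `len(soap)`, and the serialization time over many messages is the kind of comparison the paper reports.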
Procedia PDF Downloads 493
2232 Entrepreneur Universal Education System: Future Evolution
Authors: Khaled Elbehiery, Hussam Elbehiery
Abstract:
The success of education depends on evolution and adaptation. While the traditional system has worked before, one type of education that has evolved with the digital age is virtual education, which has influenced the efficiency of today's learning environments. Virtual learning has indeed proved its efficiency in overcoming the drawbacks of the physical environment, such as time, facilities, location, etc., but despite what it has accomplished, the educational system overall is not yet a productive system. Earning a degree is no longer enough to obtain a career job; it is simply missing the skills and creativity. There are always two sides to a coin; a college degree and a specialized certificate each have their own merits, but having both can put you on a successful IT career path. For the many job-seeking individuals across the world to have a clear, meaningful goal for work and education and to contribute positively to the community, productive correlation and cooperation among employers and universities, alongside the individual's technical skills, is a must for generations to come. Fortunately, the proposed research, “Entrepreneur Universal Education System”, is an evolution designed to meet the needs of both employers and students, in which gaining vital, real-world experience in the chosen fields is easier than ever. The new vision is to empower education to serve organizations' needs, which means improving the world as its primary goal: adopting universal skills of effective thinking, effective action, and effective relationships, preparing students through real-world accomplishment, and encouraging them to serve their organizations and their communities faster and more efficiently.
Keywords: virtual education, academic degree, certificates, internship, amazon web services, Microsoft Azure, Google Cloud Platform, hybrid models
Procedia PDF Downloads 95
2231 Non-Destructive Static Damage Detection of Structures Using Genetic Algorithm
Authors: Amir Abbas Fatemi, Zahra Tabrizian, Kabir Sadeghi
Abstract:
To find the location and severity of damage that occurs in a structure, changes in its static and dynamic characteristics can be used. Non-destructive techniques are more common, economical, and reliable for detecting global or local damage in structures. This paper presents a non-destructive method of structural damage detection and assessment using a genetic algorithm (GA) and static data. A set of static forces is applied to some degrees of freedom (DOFs), and the static responses (displacements) are measured at another set of DOFs. An analytical model of the truss structure is developed based on the available specification and the properties derived from static data. Damage in a structure changes its stiffness, so this method determines damage from the change in the structural stiffness parameter. The changes in the static response caused by structural damage are used to produce a set of simultaneous equations. Genetic algorithms are powerful tools for solving large optimization problems. The optimization minimizes an objective function involving the difference between the static load vectors of the damaged and the healthy structure. Several scenarios are defined for damage detection (a single-damage scenario and multiple-damage scenarios). Static damage identification methods have many advantages, but some difficulties still exist, so it is important to achieve the best damage identification; if the best result is obtained, the method is reliable. The strategy is applied to a plane truss. Numerical results demonstrate the ability of this method to detect damage in the given structures, and the figures show that damage detection in multiple-damage scenarios yields very efficient answers. Even the existence of noise in the measurements does not reduce the accuracy of the damage detection method for these structures.
Keywords: damage detection, finite element method, static data, non-destructive, genetic algorithm
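The inverse problem described above, recovering a damage pattern from measured static displacements by minimizing a residual with a GA, can be illustrated on a toy serial spring chain rather than a truss. All numbers (load, stiffness, 30% stiffness loss) are assumed for the sketch; the paper's finite element model and GA operators are more elaborate:

```python
import random

random.seed(0)
F, K0 = 1000.0, 5.0e4          # end load [N] and healthy stiffness [N/m] (assumed)
TRUE_DAMAGE = [0, 0, 1, 0, 0]  # spring 3 has lost 30% of its stiffness

def displacements(damage, loss=0.3):
    """Node displacements of a serial spring chain under the end load F."""
    flex, u = 0.0, []
    for d in damage:
        flex += 1.0 / (K0 * (1.0 - loss * d))
        u.append(F * flex)
    return u

MEASURED = displacements(TRUE_DAMAGE)  # stands in for the measured static data

def fitness(chrom):
    """Sum of squared differences between measured and modeled displacements."""
    return sum((m - c) ** 2 for m, c in zip(MEASURED, displacements(chrom)))

def evolve(pop_size=30, gens=60, pmut=0.1):
    """Minimal binary GA: truncation selection, one-point crossover, bit-flip mutation."""
    pop = [[random.randint(0, 1) for _ in range(5)] for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=fitness)
        elite = pop[: pop_size // 2]               # keep the better half
        children = []
        while len(elite) + len(children) < pop_size:
            a, b = random.sample(elite, 2)
            cut = random.randint(1, 4)             # one-point crossover
            child = a[:cut] + b[cut:]
            child = [g ^ (random.random() < pmut) for g in child]  # mutation
            children.append(child)
        pop = elite + children
    return min(pop, key=fitness)

best = evolve()   # recovered damage pattern
```

A zero residual means the GA has localized the damaged spring exactly; with measurement noise, the minimizer is instead the pattern with the smallest residual.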
Procedia PDF Downloads 234
2230 Advanced Mouse Cursor Control and Speech Recognition Module
Authors: Prasad Kalagura, B. Veeresh kumar
Abstract:
We constructed an interface system that allows a paralyzed user to interact with a computer with almost full functional capability. A real-time tracking algorithm is implemented based on adaptive skin detection and motion analysis. The clicking of the mouse is activated by the user's eye blinking, detected through a sensor. The keyboard function is implemented by a voice recognition kit.
Keywords: embedded ARM7 processor, mouse pointer control, voice recognition
Procedia PDF Downloads 576
2229 Continuity of Place-Identity: Identifying Regional Components of Kerala Architecture through 1805-1950
Authors: Manoj K. Kumar, Deepthi Bathala
Abstract:
Man has the need to know and feel part of the historical continuum, and it is this continuum that reinforces his identity. Architecture and the built environment contribute to this identity, as established by the various identity theories exploring the relationship between the two. Architecture that is organic has been successful in maintaining a continuum of identity until the advent of globalization, when the world saw a drastic shift to an architecture of ‘placelessness’. The answer to the perfect synthesis of ‘universalization’ and ‘regionalism’ is an ongoing quest. However, history has established a smooth transition from vernacular to colonial to modern, unlike the architecture of today. Traditional Kerala architecture has evolved from the tropical climate, geography, local needs, materials, skills, and foreign influences. As a result of geographical barriers, it is unique in contrast to the architecture of the neighboring states, although influenced by the architecture of the Orient through trade relations. From 1805 to 1950, the European influence on the architecture of Kerala resulted in the emergence of the colonial style, which managed to establish a continuum with the traditional architecture. The paper focuses on identifying the components of architecture that established the continuity of place-identity in the architecture of Kerala and examines the transition from traditional Kerala architecture to colonial architecture during the colonial period. Visual surveys based on the principles of urban design, cognitive mapping, and typology analysis, supported by a strong understanding of the morphological and built environment, along with the matrix method, are the research tools used. An understanding of these components of continuity can be useful in creating buildings that people can relate to in the present day.
South Asia shares a history of colonialism, and an understanding of these components can pave the way for further research on how to establish a regional identity in the era of globalization.
Keywords: colonial, identity, place, regional
Procedia PDF Downloads 407
2228 Hardware-In-The-Loop Relative Motion Control: Theory, Simulation and Experimentation
Authors: O. B. Iskender, K. V. Ling, V. Dubanchet, L. Simonini
Abstract:
This paper presents a Guidance and Control (G&C) strategy to address the spacecraft maneuvering problem for future Rendezvous and Docking (RVD) missions. The proposed strategy allows safe and propellant-efficient trajectories for space servicing missions, including tasks such as approaching, inspecting, and capturing. This work provides the validation test results of the G&C laws using a Hardware-In-the-Loop (HIL) setup with two robotic mockups representing the chaser and the target spacecraft. Through this paper, the challenges of relative motion control in space are first summarized, in particular the constraints imposed by the mission, the spacecraft, and onboard processing capabilities. Second, the proposed algorithm is introduced by presenting the formulation of constrained Model Predictive Control (MPC) to optimize the fuel consumption and explicitly handle the physical and geometric constraints in the system, e.g. thruster or Line-Of-Sight (LOS) constraints. Additionally, the coupling between translational and rotational motion is addressed via a dual-quaternion-based kinematic description and explained accordingly. The resulting convex optimization problem allows real-time implementation; this capability is supported by a detailed discussion of the computational time requirements and the obtained results with respect to the onboard computer and future trends in space processor capabilities. Finally, the performance of the algorithm is presented in the scope of a potential future mission and of the available equipment. The results also cover a comparison between the proposed algorithms and a Linear-Quadratic Regulator (LQR)-based control law to highlight the clear advantages of the MPC formulation.
Keywords: autonomous vehicles, embedded optimization, real-time experiment, rendezvous and docking, space robotics
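The receding-horizon idea behind MPC, optimize a short control sequence, apply only its first move, then re-plan, can be shown on a toy one-axis double integrator with a hypothetical discrete thrust set. The paper's actual controller solves a convex program with dual-quaternion kinematics and LOS constraints; this sketch only illustrates the fuel-plus-terminal-error trade-off and the closed loop:

```python
import itertools

DT, UMAX, FUEL_W = 1.0, 0.5, 0.1   # step [s], thrust limit, fuel weight (assumed)

def step(state, u):
    """Discrete double-integrator dynamics along one axis."""
    p, v = state
    return (p + v * DT + 0.5 * u * DT * DT, v + u * DT)

def cost(state, seq):
    """Fuel use plus terminal miss distance/velocity for one control sequence."""
    c = sum(FUEL_W * abs(u) for u in seq)
    for u in seq:
        state = step(state, u)
    p, v = state
    return c + p * p + v * v           # drive the relative state to the origin

def mpc_action(state, horizon=3):
    """Enumerate all discrete thrust sequences over the horizon and return
    the first move of the cheapest one (the receding-horizon principle)."""
    options = (-UMAX, 0.0, UMAX)
    best = min(itertools.product(options, repeat=horizon),
               key=lambda seq: cost(state, seq))
    return best[0]

# Closed loop: the chaser starts 5 m behind the target, at rest.
state = (-5.0, 0.0)
for _ in range(20):
    state = step(state, mpc_action(state))
```

A real MPC replaces the brute-force enumeration with a convex solver, which is what makes onboard real-time implementation feasible.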
Procedia PDF Downloads 124
2227 Comparison of Deep Learning and Machine Learning Algorithms to Diagnose and Predict Breast Cancer
Authors: F. Ghazalnaz Sharifonnasabi, Iman Makhdoom
Abstract:
Breast cancer is a serious health concern that affects many people around the world. According to a study published in the Breast journal, the global burden of breast cancer is expected to increase significantly over the next few decades. The number of deaths from breast cancer has been increasing over the years, but the age-standardized mortality rate has decreased in some countries. It is important to be aware of the risk factors for breast cancer and to get regular check-ups to catch it early if it does occur. Machine learning techniques have been used to aid in the early detection and diagnosis of breast cancer. These techniques, which have been shown to be effective in predicting and diagnosing the disease, have become a research hotspot. In this study, we consider two deep learning approaches: Multi-Layer Perceptron (MLP) and Convolutional Neural Network (CNN). We also consider five machine learning algorithms: Decision Tree (C4.5), Naïve Bayesian (NB), Support Vector Machine (SVM), K-Nearest Neighbors (KNN), and XGBoost (eXtreme Gradient Boosting), on the Breast Cancer Wisconsin Diagnostic dataset. We have carried out the process of evaluating and comparing the classifiers, selecting appropriate metrics to evaluate classifier performance and an appropriate tool to quantify this performance. The main purpose of the study is predicting and diagnosing breast cancer by applying the mentioned algorithms, and discovering the most effective one with respect to the confusion matrix, accuracy, and precision. It is found that CNN outperformed all other classifiers and achieved the highest accuracy (0.982456). The work is implemented in the Anaconda environment based on the Python programming language.
Keywords: breast cancer, multi-layer perceptron, Naïve Bayesian, SVM, decision tree, convolutional neural network, XGBoost, KNN
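Of the classifiers compared, KNN is the simplest to sketch from scratch: classify each case by majority vote among its nearest neighbours in feature space, then score predictions with accuracy. The two-feature toy data below is hypothetical and only stands in for the Wisconsin dataset's measurements:

```python
from collections import Counter

def knn_predict(train, query, k=3):
    """Classify one point by majority vote among its k nearest neighbours."""
    ranked = sorted(train, key=lambda row: sum((a - b) ** 2
                                               for a, b in zip(row[0], query)))
    votes = Counter(label for _, label in ranked[:k])
    return votes.most_common(1)[0][0]

# Hypothetical 2-feature toy set standing in for tumour measurements:
# label 1 = malignant, 0 = benign.
train = [((1.0, 1.0), 0), ((1.2, 0.8), 0), ((0.9, 1.1), 0),
         ((3.0, 3.2), 1), ((3.1, 2.9), 1), ((2.8, 3.0), 1)]
test = [((1.1, 0.9), 0), ((3.0, 3.0), 1)]

pred = [knn_predict(train, x) for x, _ in test]
accuracy = sum(p == y for p, (_, y) in zip(pred, test)) / len(test)
```

The study's evaluation applies the same accuracy/precision/confusion-matrix scoring uniformly across all seven models, which is what makes the comparison meaningful.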
Procedia PDF Downloads 74
2226 Weight Estimation Using the K-Means Method in Steelmaking’s Overhead Cranes in Order to Reduce Swing Error
Authors: Seyedamir Makinejadsanij
Abstract:
One of the most important factors in the production of quality steel is knowing the exact weight of the steel in the steelmaking area. In this study, a calculation method is presented to estimate the exact weight of the melt as well as of the objects transported by the overhead crane. Iran Alloy Steel Company's steelmaking area has three 90-ton cranes, which are responsible for transferring the ladles and ladle caps between 34 areas in the melt shop. Each crane is equipped with a Disomat Tersus weighing system that calculates and displays real-time weight. A moving object has a variable apparent weight due to swinging, and the weighing system has an error of about ±5%. This means that when an object weighing about 80 tons is moved by a crane, the device (the Disomat Tersus system) reads about 4 tons more or 4 tons less, and this is the biggest obstacle to calculating the real weight. The k-means algorithm, an unsupervised clustering method, was used here. The best result was obtained with 3 centers: compared to the plain average (one center) and to two, four, five, or six centers, three centers give the best answer, which is logical because the noise above and below the real weight is separated out. Every day, a standard weight is moved by the working cranes to test and calibrate them. The results show that the accuracy is about 40 kilograms per 60 tons (the standard weight); with this method, the accuracy of the moving weight is therefore 99.95%. K-means is used to calculate the exact mean of the objects. The stopping criterion of the algorithm is either reaching 1000 iterations or no points moving between the clusters. As a result of the implementation of this system, the crane operator does not stop while moving objects and continues his activity regardless of weight calculations. Also, production speed increased, and human error decreased.
Keywords: k-means, overhead crane, melt weight, weight estimation, swing problem
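The de-noising idea, cluster the readings into three groups and take the middle cluster's mean as the weight, can be sketched in one dimension. The readings below are hypothetical load-cell samples (not the plant's data), and the min/median/max initialization is an assumption for the sketch:

```python
def kmeans_1d(values, iters=1000):
    """Plain 1-D k-means with three centers; stops after `iters` rounds or
    when no reading changes cluster, mirroring the criteria described above."""
    centers = [min(values), sorted(values)[len(values) // 2], max(values)]
    k = len(centers)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for v in values:
            clusters[min(range(k), key=lambda i: abs(v - centers[i]))].append(v)
        new = [sum(c) / len(c) if c else centers[i] for i, c in enumerate(clusters)]
        if new == centers:      # no center moved, so no point changed cluster
            break
        centers = new
    return centers

# Hypothetical readings (tonnes) for an ~80 t load swinging on the crane:
# most samples sit near the true weight, outliers land ~4 t high or low.
readings = [80.0, 80.1, 79.9, 80.05, 79.95, 80.0, 84.2, 83.9, 76.1, 75.8]
centers = kmeans_1d(readings)
estimate = sorted(centers)[1]   # middle cluster discards the swing noise
```

The middle center recovers the true weight because the swing pushes readings symmetrically above and below it, exactly the logic given for preferring three clusters over a plain average.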
Procedia PDF Downloads 88
2225 The Utility and the Consequences of Counter Terrorism Financing
Authors: Fatemah Alzubairi
Abstract:
Terrorism financing is a theme that has evolved dramatically post-9/11. Supra-national bodies, above all the UN Security Council and the Financial Action Task Force (FATF), have established an executive-like mechanism which allows blacklisting individuals and groups, freezing their funds, and restricting their travel, all of which have become part of states’ anti-terrorism frameworks. A number of problems arise from building counter-terrorism measures on the foundation of a vague definition of terrorism. This paper examines the utility and consequences of counter-terrorism financing in light of the lack of an international definition of terrorism. The main problem with national and international anti-terrorism legislation is the lack of a clear, objective definition of terrorism. Most, if not all, national laws are broad and vague. Determining what terrorism is remains the crucial underpinning of any successful discussion of counter-terrorism, and of the future success of counter-terrorist measures. This paper focuses on the legal and political consequences of equalizing the treatment of violent terrorist crimes, such as bombing, with non-violent terrorism-related crimes, such as funding terrorist groups. While both sorts of acts require criminalization, treating them equally risks wrongfully or unfairly condemning innocent people who have associated with “terrorists” but are not involved in terrorist activities. This paper examines whether global obligations to counter terrorism financing focus on controlling terrorist groups more than terrorist activities. It also examines the utility of the obligations adopted by the UN Security Council and the FATF, and whether they serve global security, or whether the utility is largely restricted to Western security, with little attention paid to the unique needs and demands of other regions.
Keywords: counter-terrorism, definition of terrorism, FATF, security, terrorism financing, UN Security Council
Procedia PDF Downloads 323
2224 A Study on the Different Components of a Typical Back-Scattered Chipless RFID Tag Reflection
Authors: Fatemeh Babaeian, Nemai Chandra Karmakar
Abstract:
Chipless RFID is a wireless system for tracking and identification which uses passive tags to encode data. The advantage of a chipless RFID tag is that it is planar and printable on different low-cost materials like paper and plastic, and the printed tag can be attached to different items at the labelling stage. Since the price of a chipless RFID tag can be as low as a fraction of a cent, this technology has the potential to compete with conventional optical barcode labels. However, due to the passive structure of the tag, processing the reflected signal is a crucial challenge. The captured signal reflected from a tag attached to an item consists of different components: the reflection from the reader antenna, the reflection from the item, the tag's structural-mode RCS component, and the antenna-mode RCS of the tag. All these components are summed in both the time and frequency domains. The reflection from the item and the structural-mode RCS component can distort or saturate the frequency-domain signal and cause difficulties in extracting the desired component, which is the antenna-mode RCS. Therefore, it is necessary to study the reflection of the tag in both the time and frequency domains to better understand the nature of the captured chipless RFID signal. Further benefits of this study are to find an optimised encoding technique at the tag design level and the best algorithm for processing the chipless RFID signal at the decoding level. In this paper, the reflection from a typical backscattered chipless RFID tag with six resonances is analysed, and the different components of the signal are separated in both the time and frequency domains. Moreover, the time-domain signal corresponding to each resonator of the tag is studied. The data for this processing were captured from simulation in CST Microwave Studio 2017.
The outcome of this study is an understanding of the different components of a measured signal in a chipless RFID system and the identification of a research gap: the need for an optimum detection algorithm for tag ID extraction.
Keywords: antenna mode RCS, chipless RFID tag, resonance, structural mode RCS
Procedia PDF Downloads 197
2223 A Radiomics Approach to Predict the Evolution of Prostate Imaging Reporting and Data System Score 3/5 Prostate Areas in Multiparametric Magnetic Resonance
Authors: Natascha C. D'Amico, Enzo Grossi, Giovanni Valbusa, Ala Malasevschi, Gianpiero Cardone, Sergio Papa
Abstract:
Purpose: To characterize, through a radiomic approach, the nature of areas classified PI-RADS (Prostate Imaging Reporting and Data System) 3/5, recognized in multiparametric prostate magnetic resonance with T2-weighted (T2w), diffusion, and perfusion sequences with paramagnetic contrast. Methods and Materials: 24 cases undergoing multiparametric prostate MR and biopsy were admitted to this pilot study. The clinical outcome of the PI-RADS 3/5 areas was established through biopsy, which found 8 malignant tumours. The analysed images were acquired with a Philips Achieva 1.5T machine with a CE-T2-weighted sequence in the axial plane. Semi-automatic tumour segmentation was carried out on the MR images using the 3DSlicer image analysis software. 45 shape-based, intensity-based, and texture-based features were extracted and represented the input for preprocessing. An evolutionary algorithm (a TWIST system based on the KNN algorithm) was used to subdivide the dataset into training and testing sets and to select the features yielding the maximal amount of information. After this preprocessing, 20 input variables were selected, and different machine learning systems were used to develop a predictive model based on a training-testing crossover procedure. Results: The best machine learning system (a three-layer feed-forward neural network) obtained a global accuracy of 90% (80% sensitivity and 100% specificity) with an area under the ROC curve of 0.82. Conclusion: Machine learning systems coupled with radiomics show promising potential in distinguishing benign from malignant tumours in PI-RADS 3/5 areas.
Keywords: machine learning, MR prostate, PI-Rads 3, radiomics
Procedia PDF Downloads 186
2222 Irrigation Challenges, Climate Change Adaptation and Sustainable Water Usage in Developing Countries. A Case Study, Nigeria
Authors: Faith Eweluegim Enahoro-Ofagbe
Abstract:
Worldwide, every nation is experiencing the effects of global warming. Developing countries, which rely heavily on agriculture for socioeconomic growth and security, among other things, are more affected by climate change, particularly with respect to the availability of water. Floods, droughts, rising temperatures, saltwater intrusion, groundwater depletion, and other severe environmental alterations are all brought on by climatic change. Life depends on water, a vital resource, and these ecological changes affect all water use, including agriculture and household water use. Adequate and adaptive water usage strategies are therefore essential for sustainability in developing countries. This paper investigates the challenges Nigeria faces due to climate change and the adaptive techniques that have evolved in response to such issues to ensure water management and sustainability for irrigation and to provide quality water to residents. Questionnaires were distributed to respondents in the study area, central Nigeria, for a quantitative evaluation of sustainable water resource management techniques. Physicochemical analyses were performed on soil and water samples collected from several locations under investigation. Findings show that farmers use different methods, ranging from intelligent technologies to traditional strategies, for water resource management, and that farmers need to learn better water resource management techniques for sustainability. Since more residents obtain their water from privately held sources, the government should enforce legislation to ensure that private borehole construction businesses treat water sources of poor quality before the general public uses them.
Keywords: developing countries, irrigation, strategies, sustainability, water resource management, water usage
Procedia PDF Downloads 114
2221 The International Constitutional Order and Elements of Human Rights
Authors: Girma Y. Iyassu Menelik
Abstract:
“The world is now like a global village!” So goes the saying, showing that, due to development and technology, the countries of the world are now closely linked. In the field of human rights, there is a close relationship in the way that rights are recognised and enforced. This paper will show that human rights have evolved from ancient times through important landmarks such as the Magna Carta, the French Declaration of the Rights of Man and of the Citizen, and the American Bill of Rights. The formation of the United Nations after the Second World War resulted in the need to codify and protect human rights. Some rights are so fundamental that they are found in international and continental instruments, national constitutions, and domestic legislation. In the civil and political sphere they include the right to vote, to freedom of association, speech and assembly, and the rights to life, privacy, and a fair trial. In the economic and social sphere there are the right to work, the protection of the family, social security, and the rights to education, health, and shelter. In some instances, certain rights can be suspended in times of public emergency, but such derogations shall be circumscribed by the law, and in most constitutions such limitations are subject to judicial review. However, some rights are so crucial that they cannot be derogated from under any circumstances; these include the right to life, recognition before the law, freedom from torture and slavery, and freedom of thought, conscience and religion. International jurisprudence has been developed to protect fundamental rights and avoid discrimination on the grounds of race, colour, sex, language, or social origin. The elaborate protection system goes to show that these rights have become part of the international order and have universal application.
We have now reached a stage where the UDHR, ICCPR and ICESCR have come to be regarded as part of an international bill of rights with horizontal and vertical enforcement mechanisms involving state parties, NGOs, international bodies and other organs.Keywords: rights, international, constitutional, state, judiciary
Procedia PDF Downloads 451
2220 Logical-Probabilistic Modeling of the Reliability of Complex Systems
Authors: Sergo Tsiramua, Sulkhan Sulkhanishvili, Elisabed Asabashvili, Lazare Kvirtia
Abstract:
The paper presents logical-probabilistic methods, models, and algorithms for the reliability assessment of complex systems, on the basis of which a web application for structural analysis and reliability assessment of systems was created. It is important to design systems based on structural analysis, research, and evaluation of efficiency indicators. One of the important efficiency criteria is the reliability of the system, which depends on the components of the structure. Quantifying the reliability of large-scale systems is a computationally complex process, and it is advisable to perform it with the help of a computer. Logical-probabilistic modeling is one of the effective means of describing the structure of a complex system and quantitatively evaluating its reliability, and it forms the basis of our application. The reliability assessment process included the following stages, which were reflected in the application: 1) Construction of a graphical scheme of the structural reliability of the system; 2) Transformation of the graphical scheme into a logical representation and modeling of the shortest paths of successful functioning of the system; 3) Description of the system operability condition with a logical function in the form of disjunctive normal form (DNF); 4) Transformation of the DNF into orthogonal disjunctive normal form (ODNF) using the orthogonalization algorithm; 5) Replacement of logical elements with probabilistic elements in the ODNF, obtaining a reliability estimation polynomial and quantifying reliability; 6) Calculation of the “weights” of the elements of the system. Using the logical-probabilistic methods, models and algorithms discussed in the paper, special software was created by means of which a quantitative assessment of the reliability of systems with a complex structure is produced.
As a result, structural analysis of systems, research, and the design of systems with optimal structure are carried out.Keywords: complex systems, logical-probabilistic methods, orthogonalization algorithm, reliability of systems, “weights” of elements
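Stages 3–5 reduce the success function to a form whose probability can be read off directly. A minimal Python sketch of the same idea, using inclusion–exclusion over hypothetical minimal path sets rather than the paper's orthogonalization algorithm (the path sets and component reliabilities below are invented for illustration):

```python
from itertools import combinations

def system_reliability(min_paths, p):
    """Probability that at least one minimal path of working
    components connects the system, via inclusion-exclusion."""
    n = len(min_paths)
    total = 0.0
    for k in range(1, n + 1):
        for combo in combinations(min_paths, k):
            union = set().union(*combo)       # components in this union term
            prob = 1.0
            for e in union:
                prob *= p[e]                  # independent components
            total += (-1) ** (k + 1) * prob   # alternating-sign term
    return total

# Toy series-parallel system: paths {1, 2} and {1, 3}
p = {1: 0.9, 2: 0.8, 3: 0.7}
print(system_reliability([{1, 2}, {1, 3}], p))
```

For this toy system the result is p₁p₂ + p₁p₃ − p₁p₂p₃ ≈ 0.846, the same polynomial an orthogonalized DNF would yield.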
Procedia PDF Downloads 65
2219 The Development Stages of Transformation of Water Policy Management in Victoria
Authors: Ratri Werdiningtyas, Yongping Wei, Andrew Western
Abstract:
The status quo of social-ecological systems is the result of not only natural processes but also the accumulated consequences of policies applied in the past. Often water management objectives are challenging and are only achieved to a limited degree on the ground. In choosing water management approaches, it is important to account for current conditions and important differences due to varied histories. Since the mid-nineteenth century, Victorian water management has evolved through a series of policy regime shifts. The main goal of this research is to explore and identify the stages of the evolution of the water policy instruments as practiced in Victoria from 1890-2016. This comparative historical analysis has identified four stages in Victorian policy instrument development. In the first stage, the creation of policy instruments aimed to match the demand and supply of the resource (reserve condition). The second stage began after the natural system alone failed to balance supply and demand. The focus of the policy instrument shifted to an authority perspective in this stage. Later, the increasing number of actors interested in water led to another change in policy instrument. The third stage focused on the significant role of information from different relevant actors. The fourth and current stage is the most advanced, in that it involves the creation of a policy instrument for synergizing the previous three focal factors: reserve, authority, and information. When considering policy in other jurisdictions, these findings suggest that a key priority should be to reflect on the jurisdiction's current position among these four evolutionary stages and to improve progressively rather than directly adopting approaches from elsewhere without understanding that position.Keywords: policy instrument, policy transformation, socio-ecological system, water management
Procedia PDF Downloads 143
2218 Trade Openness, Productivity Growth And Economic Growth: Nigeria’s Experience
Authors: S. O. Okoro
Abstract:
Some words become the catchphrase of a particular decade. Globalization, openness, and privatization are certainly among the most frequent encapsulations of the 1990s; the market is ‘in’, the state is ‘out’. In the 1970s, there were many political economists who spoke of autarky as one possible response to global economic forces: be self-contained, go it alone, put up barriers to trans-nationalities, put in place an import-substitution industrialization policy and grow domestic industries. In the 1990s, the emasculation of the state is by no means complete, but there is an acceptance that the state's power is circumscribed by forces beyond its control and potential leverage. Autarky is no longer a policy option. Nigeria, since its emergence as an independent nation, has evolved two macroeconomic management regimes of the interventionist and market-friendly styles. This paper investigates Nigeria's growth performance over the periods incorporating these two regimes and finds that there is no structural break in Total Factor Productivity (TFP) growth and, besides, that TFP growth over the entire period of study, 1970-2012, is negligible; hence growth can only be achieved by unsustainable factor accumulation. Another important finding of this work is that the openness-human capital interaction term has a significant impact on TFP growth, but the sign of the estimated coefficient does not meet its theoretical expectation. This is because the negative coefficient on human capital outweighs the positive openness effect. The poor quality of human capital is considered to have given rise to this. Given these results, massive investment in the education sector is required. The investment should be targeted at reforms that go beyond mere structural reforms to a reform agenda that will improve the quality of human capital in Nigeria.Keywords: globalization, emasculation, openness and privatization, total factor productivity
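TFP growth in this kind of study is the Solow residual: the part of output growth not explained by factor accumulation. A toy growth-accounting sketch, where the capital share α = 0.3 and the growth rates are illustrative figures, not the paper's estimates:

```python
def tfp_growth(gY, gK, gL, alpha=0.3):
    """Solow residual: output growth minus the share-weighted
    contributions of capital and labor growth."""
    return gY - alpha * gK - (1 - alpha) * gL

# Hypothetical: 5% output growth driven almost entirely by factors,
# leaving a negligible residual -- the pattern the paper reports
print(tfp_growth(0.05, 0.08, 0.03))
```

A residual near zero means the observed growth rests on factor accumulation alone, which is unsustainable in the long run.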
Procedia PDF Downloads 241
2217 A Case-Study Analysis on the Necessity of Testing for Cyber Risk Mitigation on Maritime Transport
Authors: Polychronis Kapalidis
Abstract:
In recent years, researchers have started to turn their attention to cyber security and maritime security independently, neglecting, in most cases, to examine the areas where these two critical issues are intertwined. The impact of cybersecurity issues on the maritime economy is emerging dramatically. Maritime transport and all related activities are conducted by technology-intensive platforms, which today rely heavily on information systems. The paper's argument is that, since no defense is completely effective against cyber attacks, it is vital to test responses to the inevitable incursions. Hence, preparedness in the form of testing the existing cybersecurity structure with different tools against potential attacks is vital for minimizing risks. Traditional criminal activities may further be facilitated and evolved through the misuse of cyberspace. Kidnap, piracy, fraud, theft of cargo and the imposition of ransomware are the main such activities, and they mainly target the industry's most valuable asset: the ship. The paper, adopting a case-study analysis based on stakeholder consultation and secondary data analysis, namely policy and strategy-related documentation, presents the importance of holistic testing in the sector. Arguing that poor understanding of the issue leads to the adoption of ineffective policies, the paper will present the level of awareness within the industry and assess the risks and vulnerabilities of ships to these cybercriminal activities. It will conclude by suggesting that testing procedures must be focused on three main pillars within the maritime transport sector: the human factor, the infrastructure, and the procedures.Keywords: cybercrime, cybersecurity, organized crime, risk mitigation
Procedia PDF Downloads 156
2216 Category-Base Theory of the Optimum Signal Approximation Clarifying the Importance of Parallel Worlds in the Recognition of Human and Application to Secure Signal Communication with Feedback
Authors: Takuro Kida, Yuichi Kida
Abstract:
We mathematically establish the basis of a new class of algorithms that treats a historical cause of continuing discrimination in the world, as well as its solution, by introducing the new concept of a parallel world that includes an invisible set of errors as its companion. With respect to a matrix operator-filter bank in which the matrix operator-analysis-filter bank H and the matrix operator-sampling-filter bank S are given, firstly, we introduce the detailed algorithm to derive the optimum matrix operator-synthesis-filter bank Z that simultaneously minimizes all the worst-case measures of the matrix operator-error-signals E(ω) = F(ω) − Y(ω) between the matrix operator-input-signals F(ω) and the matrix operator-output-signals Y(ω) of the filter bank. Further, feedback is introduced to the above approximation theory, and it is indicated that introducing conversations with feedback is not automatically superior to the accumulation of existing knowledge of signal prediction. Secondly, the concept of category from the field of mathematics is applied to the above optimum signal approximation, and it is indicated that the category-based approximation theory applies to a set-theoretic consideration of the recognition of humans. Based on this discussion, it is shown naturally why the narrow perception that tends to create isolation shows an apparent advantage in the short term and why such narrow thinking often becomes intimate with discriminatory action in a human group. Throughout these considerations, it is presented that, in order to abolish easy and intimate discriminatory behavior, it is important to create a parallel world of conception in which we share the set of invisible error signals, including the words and the consciousness of both worlds.Keywords: signal prediction, pseudo inverse matrix, artificial intelligence, conditional optimization
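The paper derives the worst-case-optimal synthesis bank Z in operator form. As a loose, hypothetical finite-dimensional analogue, a least-squares synthesis operator for stand-ins of S and H can be obtained with the Moore–Penrose pseudoinverse (one of the listed keywords). All matrices below are random placeholders, not the paper's filter banks:

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 6, 4                       # signal dimension, sampled dimension
H = rng.standard_normal((n, n))   # stand-in analysis filter bank
S = rng.standard_normal((m, n))   # stand-in sampling filter bank

A = S @ H                         # combined analysis + sampling operator
Z = np.linalg.pinv(A)             # minimum-norm least-squares synthesis bank

F = rng.standard_normal((n, 8))   # batch of input signals
Y = Z @ (A @ F)                   # reconstructed output signals
E = F - Y                         # error signals, confined to the null space of A
print(np.allclose(A @ E, 0))      # reconstruction is exact on the row space of A
```

By the Moore–Penrose identities, A(I − A⁺A) = 0, so the residual error E is exactly the component of F that the sampling operator cannot see.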
Procedia PDF Downloads 154
2215 A Preparatory Method for Building Construction Implemented in a Case Study in Brazil
Authors: Aline Valverde Arroteia, Tatiana Gondim do Amaral, Silvio Burrattino Melhado
Abstract:
During the last twenty years, the construction field in Brazil has evolved significantly in response to its market growth and competitiveness. However, this evolving path has faced many obstacles, such as cultural barriers and the lack of efforts to achieve quality at the construction site. At the same time, the greatest amount of information generated in the design or construction phases is lost due to the lack of effective coordination of these activities. To address this problem, the aim of this research was to implement a French method named PEO, meaning preparation for building construction (in Portuguese), seeking to understand the design management process and its interface with the building construction phase. The research method applied was qualitative, and it was carried out through two case studies in the city of Goiania, in Goias, Brazil. The research was divided into two stages, called the pilot study at Company A and the implementation of PEO at Company B. After the implementation, the results demonstrated the PEO method's effectiveness and feasibility while boosting the quality improvement of design management. The analysis showed that the method aims to improve the design and to reduce the failures, errors and rework commonly found in the production of buildings. Therefore, it can be concluded that the PEO is feasible to apply to real estate and building companies. However, companies need to believe in the contribution they can make to the discovery of design failures in conjunction with other stakeholders forming a construction team. The results of PEO can be maximized by adopting the principles of simultaneous engineering and the insertion of new computer technologies, which use a three-dimensional model of the building with the BIM process.Keywords: communication, design and construction interface management, preparation for building construction (PEO), proactive coordination (CPA)
Procedia PDF Downloads 157
2214 Low Overhead Dynamic Channel Selection with Cluster-Based Spatial-Temporal Station Reporting in Wireless Networks
Authors: Zeyad Abdelmageid, Xianbin Wang
Abstract:
Choosing the operational channel for a WLAN access point (AP) has been a static channel assignment process initiated by the user during the deployment of the AP, which fails to cope with the dynamic conditions of the assigned channel at the station side afterward. However, the dramatically growing number of Wi-Fi APs and stations operating in the unlicensed band has led to dynamic, distributed, and often severe interference. This highlights the urgent need for the AP to dynamically select the best overall channel of operation for the basic service set (BSS) by considering the distributed and changing channel conditions at all stations. Consequently, dynamic channel selection algorithms which consider feedback from the station side have been developed. Despite the significant performance improvement, existing channel selection algorithms suffer from very high feedback overhead. Feedback latency from the STAs, due to the high overhead, can cause the eventually selected channel to no longer be optimal for operation, given the dynamic sharing nature of the unlicensed band. This has inspired us to develop our own dynamic channel selection algorithm with reduced overhead through the proposed low-overhead, cluster-based station reporting mechanism. The main idea behind cluster-based station reporting is the observation that STAs which are very close to each other tend to have very similar channel conditions. Instead of requesting each STA to report on every candidate channel, causing high overhead, the AP divides STAs into clusters and then assigns each STA in each cluster one channel to report feedback on. With a proper design of the cluster-based reporting, the AP does not lose any information about the channel conditions at the station side while reducing feedback overhead. The simulation results show equal performance and, at times, better performance with a fraction of the overhead.
We believe that this algorithm has great potential in designing future dynamic channel selection algorithms with low overhead.Keywords: channel assignment, Wi-Fi networks, clustering, DBSCAN, overhead
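The reporting mechanism can be sketched in miniature: group stations by proximity (a greedy stand-in here for the DBSCAN step named in the keywords), then hand each cluster member a different candidate channel so that the cluster jointly covers the whole list. The positions, eps radius, and channel list below are all hypothetical:

```python
from itertools import cycle

def cluster_stations(positions, eps=5.0):
    """Greedy proximity clustering: an STA joins the first cluster
    with a member within eps of it, else starts a new cluster."""
    clusters = []
    for p in positions:
        for c in clusters:
            if any((p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2 <= eps ** 2 for q in c):
                c.append(p)
                break
        else:
            clusters.append([p])
    return clusters

def assign_reports(clusters, channels):
    """Round-robin within each cluster: every STA reports on one
    channel, and the cluster jointly covers the candidate list."""
    plan = {}
    for c in clusters:
        chan = cycle(channels)
        for sta in c:
            plan[sta] = next(chan)
    return plan

stas = [(0, 0), (1, 1), (2, 0), (40, 40), (41, 41)]
clusters = cluster_stations(stas)
plan = assign_reports(clusters, channels=[1, 6, 11])
print(len(clusters), plan)
```

Each STA now reports on a single channel instead of all three, yet the AP still receives feedback on every candidate channel from each neighbourhood.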
Procedia PDF Downloads 116
2213 Building Information Modelling Based Value for Money Assessment in Public-Private Partnership
Authors: Guoqian Ren, Haijiang Li, Jisong Zhang
Abstract:
Over the past 40 years, urban development has undergone large-scale, high-speed expansion, beyond what was previously considered normal and in a manner not proportionally related to population growth or physical considerations. With more scientific and refined decision-making in the urban construction process, new urbanization approaches, aligned with public-private partnerships (PPPs), which evolved in the early 1990s, have become acceptable and, in some situations, even better solutions to outstanding urban municipal construction projects, especially in developing countries. However, as the main driving force in dealing with urban public services, PPPs are still problematic regarding the value for money (VFM) process in most large-scale construction projects. This paper therefore reviews recent PPP articles in popular project management journals and relevant toolkits, published in the last 10 years, to identify the indicators that influence VFM within PPPs across regions. With increasing concerns about profitability and environmental and social impacts, the current PPP structure requires a more integrated platform to manage multi-performance project life cycles. Building information modelling (BIM), a popular approach to the procurement process in AEC sectors, provides the potential to ensure VFM while also working in tandem with a semantic approach to holistically measure life cycle costs (LCC) and achieve better sustainability. This paper suggests that BIM applied to the entire PPP life cycle could support holistic decision-making regarding VFM processes and thus meet service targets.Keywords: public-private partnership, value for money, building information modelling, semantic approach
Procedia PDF Downloads 209
2212 An Analysis on Clustering Based Gene Selection and Classification for Gene Expression Data
Authors: K. Sathishkumar, V. Thiagarasu
Abstract:
Due to recent advances in DNA microarray technology, it is now feasible to obtain gene expression profiles of tissue samples at relatively low cost. Many scientists around the world take advantage of this gene profiling to characterize complex biological circumstances and diseases. Microarray techniques used in genome-wide gene expression and genome mutation analysis help scientists and physicians in understanding pathophysiological mechanisms, in diagnosis and prognosis, and in choosing treatment plans. DNA microarray technology has now made it possible to simultaneously monitor the expression levels of thousands of genes during important biological processes and across collections of related samples. Elucidating the patterns hidden in gene expression data offers a tremendous opportunity for an enhanced understanding of functional genomics. However, the large number of genes and the complexity of biological networks greatly increase the challenges of comprehending and interpreting the resulting mass of data, which often consists of millions of measurements. A first step toward addressing this challenge is the use of clustering techniques, which are essential in the data mining process to reveal natural structures and identify interesting patterns in the underlying data. This work presents an analysis of several clustering algorithms proposed to deal with gene expression data effectively. Existing algorithms such as Support Vector Machines (SVM), the K-means algorithm and evolutionary algorithms are analyzed thoroughly to identify their advantages and limitations. A performance evaluation of the existing algorithms is carried out to determine the best approach.
In order to improve the classification performance of the best approach in terms of accuracy, convergence behavior and processing time, a hybrid clustering-based optimization approach has been proposed.Keywords: microarray technology, gene expression data, clustering, gene selection
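As a minimal illustration of the clustering step, a plain K-means pass over a toy gene-by-sample expression matrix (the data and the choice of k are invented for this sketch, not taken from the paper):

```python
import numpy as np

def kmeans(X, k, iters=100, seed=0):
    """Plain k-means: cluster rows (genes) by expression-profile similarity."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), k, replace=False)]   # random initial centers
    for _ in range(iters):
        # assign each gene to its nearest center
        labels = np.argmin(((X[:, None] - centers) ** 2).sum(-1), axis=1)
        # recompute centers; keep the old one if a cluster goes empty
        new = np.array([X[labels == j].mean(0) if np.any(labels == j)
                        else centers[j] for j in range(k)])
        if np.allclose(new, centers):
            break
        centers = new
    return labels, centers

# Toy expression matrix: 4 genes x 3 samples, two obvious profiles
X = np.array([[1.0, 1.0, 0.0], [1.2, 0.9, 0.1],
              [0.0, 0.0, 5.0], [0.1, 0.2, 4.8]])
labels, _ = kmeans(X, k=2)
print(labels)
```

The two co-expressed gene pairs land in separate clusters, which is the "natural structure" the abstract refers to.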
Procedia PDF Downloads 323
2211 An Exponential Field Path Planning Method for Mobile Robots Integrated with Visual Perception
Authors: Magdy Roman, Mostafa Shoeib, Mostafa Rostom
Abstract:
Global vision, whether provided by overhead fixed cameras, on-board aerial vehicle cameras, or satellite images, can always provide detailed information on the environment around mobile robots. In this paper, an intelligent vision-based method of path planning and obstacle avoidance for mobile robots is presented. The method integrates visual perception with a newly proposed field-based path-planning method to overcome common path-planning problems such as local minima, unreachable destinations and unnecessarily lengthy paths around obstacles. The method proposes an exponential angle deviation field around each obstacle that affects the orientation of a nearby robot. As the robot heads toward the goal point, obstacles are classified into right and left groups, and a deviation angle is exponentially added to or subtracted from the orientation of the robot. The exponential field parameters are chosen based on the Lyapunov stability criterion to guarantee robot convergence to the destination. The proposed method uses obstacles' shape and location, extracted from the global vision system, through a collision prediction mechanism to decide whether to activate or deactivate an obstacle's field. In addition, a search mechanism is developed in case the robot or the goal point is trapped among obstacles, to find a suitable exit or entrance. The proposed algorithm is validated both in simulation and through experiments. The algorithm shows effectiveness in obstacle avoidance and destination convergence, overcoming common path planning problems found in classical methods.Keywords: path planning, collision avoidance, convergence, computer vision, mobile robots
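The exponential deviation idea can be stripped down to a few lines: steer at the goal, then bend the heading away from each obstacle by an angle that decays exponentially with distance. The gain and decay constants here are arbitrary placeholders; the paper selects its field parameters via the Lyapunov criterion:

```python
import math

def steering_angle(robot, goal, obstacles, gain=math.pi / 3, decay=2.0):
    """Goal-directed heading plus an exponential deviation per obstacle:
    obstacles on the left push the heading right (and vice versa)."""
    heading = math.atan2(goal[1] - robot[1], goal[0] - robot[0])
    for ox, oy in obstacles:
        d = math.hypot(ox - robot[0], oy - robot[1])      # distance to obstacle
        bearing = math.atan2(oy - robot[1], ox - robot[0])
        side = math.sin(bearing - heading)                # > 0: obstacle on the left
        heading -= math.copysign(gain * math.exp(-d / decay), side)
    return heading

# An obstacle just left of the straight-line path deflects the robot right
free = steering_angle((0, 0), (10, 0), obstacles=[])
avoid = steering_angle((0, 0), (10, 0), obstacles=[(3, 1)])
print(free, avoid)
```

With no obstacles the heading points straight at the goal; the nearby obstacle at (3, 1) bends it negative (to the right), and the deflection vanishes exponentially as the obstacle recedes.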
Procedia PDF Downloads 192
2210 A Multi-Modal Virtual Walkthrough of the Virtual Past and Present Based on Panoramic View, Crowd Simulation and Acoustic Heritage on Mobile Platform
Authors: Lim Chen Kim, Tan Kian Lam, Chan Yi Chee
Abstract:
This research presents a multi-modal simulation for the reconstruction of the past and the construction of the present in digital cultural heritage on a mobile platform. To portray present-day life, the virtual environment is generated through a proposed scheme for rapid and efficient construction of a 360° panoramic view. Then, an acoustical heritage model and a crowd model are presented and incorporated into the 360° panoramic view. For the reconstruction of past life, a crowd is simulated and rendered in an old trading port. However, the keystone of this research is a virtual walkthrough that shows virtual present life in 2D and virtual past life in 3D, both in an environment of virtual heritage sites in George Town, through a mobile device. Firstly, the 2D crowd is modelled and simulated using OpenGL ES 1.1 on a mobile platform. The 2D crowd is used to portray present life in a 360° panoramic view of a virtual heritage environment based on an extension of Newtonian laws. Secondly, the 2D crowd is animated and rendered in 3D with improved variety and incorporated into the virtual past life using the Unity3D game engine. The behaviours of the 3D models are then simulated based on an enhancement of the classical Boid algorithm. Finally, a demonstration system is developed and integrated with the models, techniques and algorithms of this research. The virtual walkthrough was demonstrated to a group of respondents and evaluated through user-centred evaluation, with respondents navigating around the demonstration system. The results of the evaluation, based on the questionnaires, show that the presented virtual walkthrough was successfully deployed through a multi-modal simulation, and such a virtual walkthrough would be particularly useful in virtual tour and virtual museum applications.Keywords: Boid Algorithm, Crowd Simulation, Mobile Platform, Newtonian Laws, Virtual Heritage
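The classical Boid rules the paper builds on — separation, alignment, and cohesion — can be sketched for a tiny 2D flock. The weights, neighbourhood radius, and starting states below are arbitrary, not the paper's tuned enhancement:

```python
import numpy as np

def boid_step(pos, vel, r=2.0, w_sep=0.05, w_ali=0.05, w_coh=0.01, dt=1.0):
    """One update of the three classical Boid rules for every agent."""
    new_vel = vel.copy()
    for i in range(len(pos)):
        d = np.linalg.norm(pos - pos[i], axis=1)
        nb = (d < r) & (d > 0)                  # neighbours within radius r
        if not nb.any():
            continue
        sep = (pos[i] - pos[nb]).sum(0)         # separation: steer away
        ali = vel[nb].mean(0) - vel[i]          # alignment: match headings
        coh = pos[nb].mean(0) - pos[i]          # cohesion: move to local centre
        new_vel[i] += w_sep * sep + w_ali * ali + w_coh * coh
    return pos + dt * new_vel, new_vel

pos = np.array([[0.0, 0.0], [1.0, 0.0], [0.5, 1.0]])
vel = np.array([[0.1, 0.0], [-0.1, 0.0], [0.0, 0.1]])
pos, vel = boid_step(pos, vel)
print(pos)
```

Repeating the step produces the coordinated drifting that makes a rendered crowd read as a flock rather than independent agents.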
Procedia PDF Downloads 276
2209 Exploring the Intrinsic Ecology and Suitable Density of Historic Districts Through a Comparative Analysis of Ancient and Modern Ecological Smart Practices
Authors: HU Changjuan, Gong Cong, Long Hao
Abstract:
Although urban ecological policies and the public's aspiration for livable environments have expedited the pace of ecological revitalization, historic districts that have evolved through natural ecological processes often become obsolete and less habitable amid rapid urbanization. This raises a critical question: are historic districts inherently incapable of being ecological and livable? The thriving concept of ‘intrinsic ecology,’ characterized by its ability to transform city-district systems into healthy ecosystems with diverse environments, stable functions, and rapid restoration capabilities, holds potential for guiding the integration of ancient and modern ecological wisdom while supporting the dynamic involvement of cultures. This study explores the intrinsic ecology of historic districts from three aspects: 1) Population density: by comparing the population density before urban population expansion to that of the present day, a reasonable population density for historic districts is determined. 2) Building density: using the ‘Space-mate’ tool for comparative analysis, a spatial matrix is formed to explore the intrinsic ecology of building density in Chinese historic districts. 3) Green capacity ratio: using ecological districts as control samples, dual comparative analyses (related comparison and upgraded comparison) are conducted to determine the intrinsic ecological advantages of the two-dimensional and three-dimensional green volume of historic districts. The study informs a density optimization strategy that supports cultural, social, natural, and economic ecology, contributing to the creation of eco-historic districts.Keywords: eco-historic districts, intrinsic ecology, suitable density, green capacity ratio
Procedia PDF Downloads 21
2208 Heliport Remote Safeguard System Based on Real-Time Stereovision 3D Reconstruction Algorithm
Authors: Ł. Morawiński, C. Jasiński, M. Jurkiewicz, S. Bou Habib, M. Bondyra
Abstract:
With the development of optics, electronics, and computers, vision systems are increasingly used in various areas of life, science, and industry. Vision systems have a huge number of applications. They can be used in quality control, object detection, data reading (e.g., QR codes), etc. A large part of them is used for measurement purposes. Some of them make it possible to obtain a 3D reconstruction of the tested objects or measurement areas. 3D reconstruction algorithms are mostly based on creating depth maps from data that can be acquired with active or passive methods. Due to the specific application in airfield technology, only passive methods are applicable because of other existing systems working on the site, which could be blinded at most spectral levels. Furthermore, the reconstruction is required to work at long distances, ranging from hundreds of meters to tens of kilometers, with low loss of accuracy even in harsh conditions such as fog, rain, or snow. In response to those requirements, HRESS (Heliport REmote Safeguard System) was developed, whose main part is a rotational head with a two-camera stereovision rig gathering images around the head in 360 degrees, along with stereovision 3D reconstruction and point cloud combination. The sub-pixel analysis introduced in the HRESS system makes it possible to obtain increased distance measurement resolution and an accuracy of about 3% for distances over one kilometer. Ultimately, this leads to more accurate and reliable measurement data in the form of a point cloud. Moreover, the program algorithm introduces operations enabling the filtering of erroneously collected data in the point cloud. All activities on the programming, mechanical and optical sides are aimed at obtaining the most accurate 3D reconstruction of the environment in the measurement area.Keywords: airfield monitoring, artificial intelligence, stereovision, 3D reconstruction
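The sub-pixel step can be illustrated with the usual parabolic interpolation over matching costs, followed by standard stereo triangulation Z = f·B/d. The cost values and rig parameters below are hypothetical stand-ins, not the HRESS system's:

```python
def subpixel_disparity(costs, d_best):
    """Fit a parabola through the costs at d_best-1, d_best, d_best+1
    and return the vertex: a sub-pixel refinement of the disparity."""
    c0, c1, c2 = costs[d_best - 1], costs[d_best], costs[d_best + 1]
    return d_best + 0.5 * (c0 - c2) / (c0 - 2 * c1 + c2)

def distance(f_px, baseline_m, disparity_px):
    """Standard stereo triangulation: Z = f * B / d."""
    return f_px * baseline_m / disparity_px

# Hypothetical rig: 8000 px effective focal length, 1 m baseline
costs = {9: 4.0, 10: 1.0, 11: 2.0}   # matching cost per integer disparity
d = subpixel_disparity(costs, d_best=10)
print(d, distance(8000, 1.0, d))
```

At kilometre range the disparity is only a few pixels, so the fractional refinement (10.25 px here instead of 10) shifts the estimated distance by tens of metres, which is why sub-pixel analysis drives the long-range accuracy figure.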
Procedia PDF Downloads 121