Search results for: healthcare data security
24731 Disaster Response Training Simulator Based on Augmented Reality, Virtual Reality, and MPEG-DASH
Authors: Sunho Seo, Younghwan Shin, Jong-Hong Park, Sooeun Song, Junsung Kim, Jusik Yun, Yongkyun Kim, Jong-Moon Chung
Abstract:
In order to effectively cope with large and complex disasters, disaster response training is needed. Recently, disaster response training led by the ROK (Republic of Korea) government has been implemented through a 4-year R&D project, which has several functions similar to the HSEEP (Homeland Security Exercise and Evaluation Program) of the United States, but also several different features. Due to the unpredictability and diversity of disasters, existing training methods have many limitations in providing experience in the efficient use of disaster incident response and recovery resources. The challenge is always to be as efficient and effective as possible with the limited human and material/physical resources available, given the time and environmental circumstances. To enable repeated training under diverse scenarios, an AR (Augmented Reality) and VR (Virtual Reality) combined simulator is under development. Unlike existing disaster response training, simulator-based training (which allows remote-login, simultaneous multi-user training) frees trainees from time and space constraints and allows repeated training with different combinations of functions and disaster situations. Related systems exist, such as ADMS (Advanced Disaster Management Simulator) developed by ETC Simulation and HLS2 (Homeland Security Simulation System) developed by Elbit Systems. However, the ROK government needs a simulator that is custom-made for the country's environment and disaster types and that combines the latest information and communication technologies, including AR, VR, and MPEG-DASH (Moving Picture Experts Group - Dynamic Adaptive Streaming over HTTP) technology.
In this paper, a new disaster response training simulator is proposed to overcome the limitations of existing training systems and adapt to actual disaster situations in the ROK; its main technical features are described.
Keywords: augmented reality, emergency response training simulator, MPEG-DASH, virtual reality
Procedia PDF Downloads 304
24730 Block Mining: Block Chain Enabled Process Mining Database
Authors: James Newman
Abstract:
Process mining is an emerging technology that seeks to serialize enterprise data into time-series event data. It has been used by many companies and has been the subject of a variety of research papers. However, the majority of current efforts have looked at how best to perform process mining on standard relational databases. This paper is a first pass at outlining a database custom-built for a minimum viable product of process mining. We present Block Miner, a blockchain protocol to store process mining data across a distributed network. We demonstrate the feasibility of storing process mining data on the blockchain. We present a proof of concept and show how the intersection of these two technologies helps to solve a variety of issues, including but not limited to ransomware attacks, tax documentation, and conflict resolution.
Keywords: blockchain, process mining, memory optimization, protocol
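The core idea the abstract relies on, an append-only, hash-linked log of process-mining events in which tampering with any earlier event is detectable, can be sketched minimally. The field names and chain layout below are illustrative assumptions, not the actual Block Miner protocol:

```python
import hashlib
import json

def block_hash(body: dict) -> str:
    """Deterministic SHA-256 over a block's canonical JSON form."""
    return hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()

def append_event(chain: list, case_id: str, activity: str, timestamp: str) -> dict:
    """Append one process-mining event, linking it to the previous block."""
    prev = chain[-1]["hash"] if chain else "0" * 64
    body = {"case_id": case_id, "activity": activity,
            "timestamp": timestamp, "prev_hash": prev}
    block = dict(body, hash=block_hash(body))
    chain.append(block)
    return block

def verify(chain: list) -> bool:
    """Recompute every link; tampering with any earlier event breaks the chain."""
    prev = "0" * 64
    for block in chain:
        body = {k: v for k, v in block.items() if k != "hash"}
        if block["prev_hash"] != prev or block_hash(body) != block["hash"]:
            return False
        prev = block["hash"]
    return True

log = []
append_event(log, "case-1", "register", "2023-01-01T09:00")
append_event(log, "case-1", "approve", "2023-01-01T10:30")
print(verify(log))             # True
log[0]["activity"] = "reject"  # tamper with an earlier event
print(verify(log))             # False
```

This immutability is what makes such a store relevant to ransomware recovery and dispute resolution: a rewritten history no longer verifies.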
Procedia PDF Downloads 110
24729 Demographic Assessment and Evaluation of Degree of Lipid Control in High Risk Indian Dyslipidemia Patients
Authors: Abhijit Trailokya
Abstract:
Background: Cardiovascular diseases (CVDs) are the major cause of morbidity and mortality in both developed and developing countries. Many clinical trials have demonstrated that lowering low-density lipoprotein cholesterol (LDL-C) reduces the incidence of coronary and cerebrovascular events across a broad spectrum of patients at risk. Guidelines for the management of patients at risk have been established in Europe and North America. The guidelines have advocated progressively lower LDL-C targets and more aggressive use of statin therapy. In Indian patients, comprehensive data on dyslipidemia management and its treatment outcomes are inadequate. There is a lack of information on existing treatment patterns, the profile of patients being treated, and factors that determine treatment success or failure in achieving desired goals. Purpose: The present study was planned to determine the lipid control status in high-risk dyslipidemic patients treated with lipid-lowering therapy in India. Methods: This cross-sectional, non-interventional, single-visit program was conducted across 483 sites in India, enrolling male and female patients with high-risk dyslipidemia aged 18 to 65 years who had visited their respective physician at a hospital or healthcare center for a routine health check-up. The percentage of high-risk dyslipidemic patients achieving an adequate LDL-C level (< 70 mg/dL) on lipid-lowering therapy and the association of lipid parameters with patient characteristics, comorbid conditions, and lipid-lowering drugs were analysed. Results: 3089 patients were enrolled in the study, of which 64% were males. LDL-C data were available for 95.2% of the patients; only 7.7% of these patients achieved LDL-C levels < 70 mg/dL on lipid-lowering therapy, which may be due to inability to follow therapeutic plans, poor compliance, or inadequate counselling by the physician.
A physician's lack of awareness of recent treatment guidelines might also contribute to patients' poor adherence, as might failing to adequately explain the benefits and risks of a medication or to consider the patient's lifestyle and the cost of medication. Statin was the most commonly used anti-dyslipidemic drug across the population. A higher proportion of patients had comorbid CVD and diabetes mellitus across all dyslipidemic patients. Conclusion: As per the European Society of Cardiology guidelines, the ideal LDL-C level in high-risk dyslipidemic patients should be less than 70 mg/dL. In the present study, only 7.7% of the patients achieved LDL-C levels < 70 mg/dL on lipid-lowering therapy, which is very low. Most high-risk dyslipidemic patients in India are on a suboptimal dosage of statin, so more aggressive, higher-dosage statin therapy may be required to achieve target LDL-C levels in high-risk Indian dyslipidemic patients.
Keywords: cardiovascular disease, diabetes mellitus, dyslipidemia, LDL-C, lipid lowering drug, statins
Procedia PDF Downloads 205
24728 Vulnerability of Groundwater to Pollution in Akwa Ibom State, Southern Nigeria, Using the DRASTIC Model and Geographic Information System (GIS)
Authors: Aniedi A. Udo, Magnus U. Igboekwe, Rasaaq Bello, Francis D. Eyenaka, Michael C. Ohakwere-Eze
Abstract:
Groundwater vulnerability to pollution was assessed in Akwa Ibom State, Southern Nigeria, with the aim of locating areas with high potential for resource contamination, especially due to anthropogenic influence. The electrical resistivity method was utilized in the collection of the initial field data. Additional data inputs, which included depth to static water level, drilled well log data, aquifer recharge data, percentage slope, and soil information, were sourced from secondary sources. The initial field data were interpreted both manually and with computer modeling to provide information on the geoelectric properties of the subsurface. Interpreted results, together with the secondary data, were used to develop the DRASTIC thematic maps. A vulnerability assessment was performed using the DRASTIC model in a GIS environment, and areas with high vulnerability that needed immediate attention were clearly mapped out and presented in an aquifer vulnerability map. The model was subjected to validation, and the rate of validity was 73% within the area of study.
Keywords: groundwater, vulnerability, DRASTIC model, pollution
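The DRASTIC index behind such maps is a weighted sum of seven rated hydrogeologic parameters. The weights below are the standard DRASTIC weights; the per-cell ratings are hypothetical and not taken from this paper:

```python
# Standard DRASTIC weights: Depth to water, net Recharge, Aquifer media,
# Soil media, Topography, Impact of vadose zone, hydraulic Conductivity.
WEIGHTS = {"D": 5, "R": 4, "A": 3, "S": 2, "T": 1, "I": 5, "C": 3}

def drastic_index(ratings):
    """Weighted sum of the seven parameter ratings (each rated 1 to 10);
    higher values mean higher vulnerability to pollution."""
    return sum(WEIGHTS[p] * ratings[p] for p in WEIGHTS)

# Hypothetical ratings for one GIS grid cell (illustrative only):
cell = {"D": 9, "R": 8, "A": 6, "S": 6, "T": 10, "I": 8, "C": 6}
print(drastic_index(cell))  # 175 (ratings of 1-10 give a possible range of 23 to 230)
```

In a GIS workflow, this sum is evaluated per raster cell and the resulting index surface is classified into vulnerability zones.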
Procedia PDF Downloads 211
24727 Creation of S-Box in Blowfish Using AES
Authors: C. Rekha, G. N. Krishnamurthy
Abstract:
This paper develops a different approach to the key scheduling algorithm that uses both the Blowfish and AES algorithms. The main drawback of the Blowfish algorithm is that it takes considerable time to create the S-box entries. To overcome this, we replace the S-box creation process in Blowfish with key-dependent S-box creation from AES, without affecting the basic operation of Blowfish. The method proposed in this paper uses good features of Blowfish as well as AES, and the paper demonstrates the performance of Blowfish and the new algorithm by considering different aspects of security, namely encryption quality, key sensitivity, and correlation of horizontally adjacent pixels in an encrypted image.
Keywords: AES, blowfish, correlation coefficient, encryption quality, key sensitivity, s-box
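One of the security metrics named above, the correlation of horizontally adjacent pixels, is straightforward to compute. A minimal sketch (the toy image is illustrative; a strong cipher should push this value near zero for the ciphertext image):

```python
import math

def horizontal_adjacent_correlation(image):
    """Pearson correlation between each pixel and its right-hand neighbour.
    `image` is a list of rows of grey-level values."""
    xs, ys = [], []
    for row in image:
        for a, b in zip(row, row[1:]):  # horizontally adjacent pairs
            xs.append(a)
            ys.append(b)
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(xs, ys)) / n
    sx = math.sqrt(sum((a - mx) ** 2 for a in xs) / n)
    sy = math.sqrt(sum((b - my) ** 2 for b in ys) / n)
    return cov / (sx * sy)

# A smooth "plain image" has strongly correlated neighbours:
plain = [[10, 11, 12, 13], [20, 21, 22, 23]]
print(round(horizontal_adjacent_correlation(plain), 3))  # 1.0
```

The same function applied to an encrypted image quantifies how well the cipher breaks up local pixel structure.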
Procedia PDF Downloads 227
24726 Rural Water Management Strategies and Irrigation Techniques for Sustainability: Nigeria Case Study, Kwara State
Authors: Faith Eweluegim Enahoro-Ofagbe
Abstract:
Water is essential for sustaining life, and because it is a limited resource, effective water management is vital. Water scarcity has become more common due to the effects of climate change, land degradation, deforestation, and population growth, especially in rural communities, which are more susceptible to water-related issues such as water shortage and water-borne disease owing to the unsuccessful implementation of water policies and projects in Nigeria. Since rural communities generate the majority of agricultural products, they significantly impact water management for sustainability. Developing methods to advance this goal for residential and agricultural use, now and in the future, is a challenge for rural residents. This study evaluated rural water supply systems and irrigation management techniques for conserving water in Kwara State, North-Central Nigeria. It suggests measures to conserve water resources for sustainability, off-season farming, and socioeconomic security that would remedy water degradation and unemployment, one of the causes of insecurity in the country, including the use of fabricated or locally made irrigation equipment affordable to rural farmers, among other recommendations. Questionnaires were distributed to respondents in the study area for quantitative evaluation of irrigation practices. For physicochemical investigation, samples were also gathered from the available water sources. According to the study's findings, 30% of farmers adopted intelligent irrigation management techniques to conserve water resources, saving 45% of the water previously used for irrigation, while 70% of farmers practice seasonal farming. Irrigation water is drawn from river channels, streams, and unlined and unprotected wells. 60% of these rural residents rely on private boreholes for their water needs, while 40% rely on government-supplied rural water.
Therefore, the government must develop additional water projects, raise awareness, and offer irrigation techniques that are simple to adopt for water management, increasing socio-economic productivity, security, and water sustainability.
Keywords: water resource management, sustainability, irrigation, rural water management, irrigation management technique
Procedia PDF Downloads 117
24725 Use of Alternative and Complementary Therapies in Patients with Chronic Pain in a Medical Institution in Medellin, Colombia, 2014
Authors: Lina María Martínez Sánchez, Juliana Molina Valencia, Esteban Vallejo Agudelo, Daniel Gallego González, María Isabel Pérez Palacio, Juan Ricardo Gaviria García, María De Los Ángeles Rodríguez Gázquez, Gloria Inés Martínez Domínguez
Abstract:
Alternative and complementary therapies constitute a vast and complex combination of interventions, philosophies, approaches, and therapies that take a holistic healthcare point of view, becoming an alternative for the treatment of patients with chronic pain. Objective: to determine the characteristics of the use of alternative and complementary therapies in patients with chronic pain who consulted at a medical institution. Methodology: cross-sectional, descriptive study of the patients who attended the outpatient clinic and met the eligibility criteria. Sampling was not conducted. A form was used for the collection of demographic and clinical variables, and the validated Holistic Complementary and Alternative Medicine Questionnaire (HCAMQ) was applied. The analysis and processing of information were carried out using SPSS v19. Results: 220 people with chronic pain were included. The average age was 54.7±16.2 years, 78.2% were women, and 75.5% belonged to socioeconomic strata 1 to 3. Musculoskeletal pain (77.7%), migraine (15%), and neuralgia (9.1%) were the most frequent types of chronic pain. 33.6% of participants had used some kind of alternative and complementary therapy; the most frequent were homeopathy (14.5%), phytotherapy (12.7%), and acupuncture (11.4%). The total average HCAMQ score for the study group was 30.2±7.0 points, which shows a moderate attitude toward the use of complementary and alternative medicine. The highest scores according to the type of pain were: neuralgia (32.4±5.8), musculoskeletal pain (30.5±6.7), fibromyalgia (29.6±7.3), and migraine (28.5±8.8). The reliability of the HCAMQ was acceptable (Cronbach's α: 0.6). Conclusion: the types of chronic pain and the clinical or therapeutic management of patients correspond to the data available in the current literature.
Despite the moderate attitude toward the use of these alternative and complementary therapies, one in every three patients uses them.
Keywords: chronic pain, complementary therapies, homeopathy, acupuncture analgesia
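The reliability figure reported above (Cronbach's α = 0.6) comes from the standard internal-consistency formula, which can be sketched directly. The item scores below are hypothetical, chosen only so the expected result is easy to check (perfectly consistent answers give α = 1.0):

```python
def cronbach_alpha(item_scores):
    """item_scores: one list per questionnaire item, aligned by respondent.
    alpha = k/(k-1) * (1 - sum(item variances) / variance(total scores))."""
    k, n = len(item_scores), len(item_scores[0])

    def var(xs):  # sample variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    total_item_var = sum(var(item) for item in item_scores)
    totals = [sum(item[i] for item in item_scores) for i in range(n)]
    return k / (k - 1) * (1 - total_item_var / var(totals))

# Hypothetical responses: three items, four respondents, perfectly consistent.
items = [[1, 2, 3, 4], [1, 2, 3, 4], [1, 2, 3, 4]]
print(round(cronbach_alpha(items), 3))  # 1.0
```

With real questionnaire data the items are never perfectly consistent, which is how values like the study's 0.6 arise.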
Procedia PDF Downloads 518
24724 Commercial Automobile Insurance: A Practical Approach of the Generalized Additive Model
Authors: Nicolas Plamondon, Stuart Atkinson, Shuzi Zhou
Abstract:
The insurance industry is usually not the first topic one has in mind when thinking about applications of data science. However, the use of data science in the finance and insurance industry is growing quickly for several reasons, including an abundance of reliable customer data and ferocious competition requiring more accurate pricing. Among the top use cases of data science, we find pricing optimization, customer segmentation, customer risk assessment, fraud detection, marketing, and triage analytics. The objective of this paper is to present an application of the generalized additive model (GAM) to a commercial automobile insurance product: an individually rated commercial automobile. These are vehicles used for commercial purposes, but for which there is not enough volume to price several vehicles at the same time. The GAM was selected as an improvement over the GLM for its ease of use and its wide range of applications. The model was trained using the largest split of the data to determine model parameters. The remaining part of the data was used as testing data to verify the quality of the modeling activity. We used the Gini coefficient to evaluate the performance of the model. For long-term monitoring, commonly used metrics such as RMSE and MAE will be used. Another topic of interest in the insurance industry is the process of producing the model. We discuss at a high level the interactions between the different teams within an insurance company that need to work together to produce a model and then monitor its performance over time. Moreover, we discuss the regulations in place in the insurance industry. Finally, we discuss the maintenance of the model, the fact that new data do not arrive continuously, and the fact that some metrics can take a long time to become meaningful.
Keywords: insurance, data science, modeling, monitoring, regulation, processes
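The Gini coefficient used for model evaluation measures how well the model's predicted values rank the actual claims. A minimal sketch of the common normalized form (the claim amounts and scores below are hypothetical):

```python
def gini(actual, pred):
    """Gini of the model's ordering: rank risks by predicted value (descending)
    and measure how quickly cumulative actual losses accumulate compared
    with a random ordering."""
    order = sorted(range(len(actual)), key=lambda i: -pred[i])
    ranked = [actual[i] for i in order]
    total = sum(ranked)
    running, cum = 0.0, 0.0
    for a in ranked:
        running += a
        cum += running / total
    n = len(actual)
    return (cum - (n + 1) / 2.0) / n

def normalized_gini(actual, pred):
    """1.0 = ordering as good as a perfect model; 0 = random; negative = inverted."""
    return gini(actual, pred) / gini(actual, actual)

# Hypothetical holdout claims and model predictions (same ordering = perfect):
claims = [0, 0, 10, 50]
scores = [0.1, 0.2, 5.0, 40.0]
print(normalized_gini(claims, scores))  # 1.0
```

Unlike RMSE or MAE, this metric depends only on the ranking of predictions, which is why it suits pricing models where relative risk ordering matters most.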
Procedia PDF Downloads 79
24723 The Effect of Technology on Human Rights Rules
Authors: Adel Fathy Sadek Abdalla
Abstract:
The issue of respect for human rights in Southeast Asia has become a major concern and is attracting the attention of the international community. The Association of Southeast Asian Nations (ASEAN) made human rights one of its main issues in the ASEAN Charter in 2008. Subsequently, the ASEAN Intergovernmental Commission on Human Rights (AICHR) was established. AICHR is the Southeast Asian human rights enforcement commission charged with the responsibilities, functions, and powers to promote and protect human rights. However, by the end of 2016, the protective function assigned to the AICHR had not yet been fulfilled. This is shown by several cases of human rights violations that are still ongoing and have not yet been resolved. One case that has recently come to light is the human rights violations against the Rohingya people in Myanmar. Using a legal-normative approach, the study examines the urgency of establishing a human rights tribunal in Southeast Asia capable of making decisions binding on ASEAN members or guilty parties. The data show that ASEAN needs a regional court to deal with human rights abuses in the ASEAN region. In addition, the study highlights three important factors that ASEAN should consider when establishing a human rights tribunal, namely: the significant differences in democracy and human rights development among the members, the consistent implementation of the principle of non-interference, and the financial issue of sustaining the court.
Keywords: sustainable development, human rights, the right to development, the human rights-based approach to development, environmental rights, economic development, social sustainability, human rights protection, human rights violations, workers' rights, justice, security
Procedia PDF Downloads 46
24722 Journey of Silver Workers Post Retirement in India: An Exploratory Study
Authors: Avani Maniar, Shivani Mehta
Abstract:
Population aging is one of the most challenging issues of the twenty-first century, facing both developed and developing countries worldwide. In the developed world, there has already been a substantial amount of research on aging and work to help understand the capacity and potential of older people. Older workers attract everyone's attention, and their presence in human society gives rise to a variety of responses, reactions, and apprehensions, because it connotes, in great part, some kind of compulsion or willingness that prompts the elderly to decide to work after retirement: work motivated by social attention and by the assurance of security, both economic and social. At this age, the elderly aspire to psychological security with due attention. The fact remains that, despite age-related limitations, a good number of persons aged sixty and beyond are hunting for work that would support them, provide some kind of security, and in turn help them remain physically and mentally active. Given the existing diversities in the ageing process, there is a need to pay greater attention to increasing awareness of ageing issues and their socio-economic effects, and to promote the development of policies and programmes for dealing with an ageing society. Addressing the needs, wants, and well-being of elderly people is essential for maintaining a healthy, productive workforce in an aging society. This paper draws on the results of a study of the reasons the elderly work post retirement, the problems they face, and the future of retirement, to ask how widespread negative attitudes and stereotypes among employers are and whether these attitudes influence behavior towards older employees.
The aim of this research is not only to point out certain stereotypes concerning the elderly labour force, but also to stress that unless preconditions for overcoming these stereotypes are created and employment opportunities are given to this segment of the labour force, full employment as an ultimate goal of global economic policy cannot be achieved.
Keywords: employers, India, inequality, problems, reasons of working, silver workers
Procedia PDF Downloads 162
24721 Modeling Pan Evaporation Using Intelligent Methods of ANN, LSSVM and Tree Model M5 (Case Study: Shahroud and Mayamey Stations)
Authors: Hamidreza Ghazvinian, Khosro Ghazvinian, Touba Khodaiean
Abstract:
The importance of evaporation estimation in water resources and agricultural studies is undeniable. Pan evaporation is used as an indicator to determine the evaporation of lakes and reservoirs around the world because of the ease of interpreting its data. In this research, intelligent models were investigated for estimating pan evaporation on a daily basis. Shahroud and Mayamey, two cities located in Semnan province, Iran, were studied; both have dry weather conditions with high evaporation potential. Eleven years of meteorological data from the synoptic stations of Shahroud and Mayamey were used. The intelligent models used in this study are the Artificial Neural Network (ANN), Least Squares Support Vector Machine (LSSVM), and M5 tree models. The meteorological parameters of minimum and maximum air temperature (Tmin, Tmax), wind speed (WS), sunshine hours (SH), air pressure (PA), and relative humidity (RH) were selected as input data, and pan evaporation (EP) was taken as the output. 70% of the data was used for training and 30% for testing. Models were evaluated with the coefficient of determination (R2), root mean square error (RMSE), and mean absolute error (MAE). The results for the Shahroud and Mayamey stations showed that all three models performed reasonably well.
Keywords: pan evaporation, intelligent methods, Shahroud, Mayamey
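The three evaluation metrics named above are standard and easy to state explicitly. A minimal sketch (the observed/simulated series here are hypothetical, not the study's data):

```python
import math

def rmse(obs, sim):
    """Root mean square error between observed and simulated series."""
    return math.sqrt(sum((o - s) ** 2 for o, s in zip(obs, sim)) / len(obs))

def mae(obs, sim):
    """Mean absolute error."""
    return sum(abs(o - s) for o, s in zip(obs, sim)) / len(obs)

def r2(obs, sim):
    """Coefficient of determination relative to the mean of the observations."""
    mean_o = sum(obs) / len(obs)
    ss_res = sum((o - s) ** 2 for o, s in zip(obs, sim))
    ss_tot = sum((o - mean_o) ** 2 for o in obs)
    return 1 - ss_res / ss_tot

# Hypothetical daily pan evaporation (mm): observed vs. model output.
obs = [4.0, 5.5, 7.0, 9.0]
sim = [4.2, 5.0, 7.5, 8.8]
print(round(r2(obs, sim), 3), round(rmse(obs, sim), 3), round(mae(obs, sim), 3))
```

R2 near 1 with small RMSE/MAE, in the units of the target (mm/day here), is what "performed reasonably well" means quantitatively.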
Procedia PDF Downloads 80
24720 Investigation of a Technology-Enabled Model of Home Care: The eShift Model of Palliative Care
Authors: L. Donelle, S. Regan, R. Booth, M. Kerr, J. McMurray, D. Fitzsimmons
Abstract:
Palliative home health care provision within the Canadian context is challenged by: (i) a shortage of registered nurses (RNs) and RNs with palliative care expertise, (ii) an aging population, (iii) reliance on unpaid family caregivers to sustain home care services with limited support to conduct this 'care work', (iv) a model of healthcare that assumes client self-care, and (v) competing economic priorities. In response, an interprofessional team of service provider organizations, a software/technology provider, and health care providers developed and implemented a technology-enabled model of home care, the eShift model of palliative home care (eShift). The eShift model combines communication and documentation technology with non-traditional utilization of health human resources to meet patient needs for palliative care in the home. The purpose of this study was to investigate the structure, processes, and outcomes of the eShift model of care. Methodology: Guided by Donabedian's evaluation framework for health care, this qualitative-descriptive study investigated the structure, processes, and outcomes of care of the eShift model of palliative home care. Interviews and focus groups were conducted with health care providers (n=45), decision-makers (n=13), technology providers (n=3), and family caregivers (n=8). Interviews were recorded and transcribed, and a deductive analysis of transcripts was conducted. Study Findings: (1) Structure: The eShift model consists of a remotely situated RN using technology to direct care provision virtually to patients in their home. The remote RN is connected virtually to a health technician (an unregulated care provider) in the patient's home using real-time communication. The health technician uses a smartphone modified with the eShift application and communicates with the RN, who uses a computer with the eShift application/dashboard. Documentation and communication about patient observations and care activities occur in the eShift portal.
The RN is typically accountable for four to six health technicians and patients over an 8-hour shift. The technology provider was identified as an important member of the healthcare team. Other members of the team include family members, care coordinators, nurse practitioners, physicians, and allied health professionals. (2) Processes: Conventionally, patient needs are the focus of care; within eShift, however, both the patient and the family caregiver were the focus of care. Enhanced medication administration was seen as one of the most important processes, and family caregivers reported high satisfaction with the care provided. There was perceived enhanced teamwork among health care providers. (3) Outcomes: Patients were able to die at home. The eShift model enabled consistency and continuity of care and effective management of patient symptoms and caregiver respite. Conclusion: More than a technology solution, the eShift model of care was viewed as transforming home care practice and an innovative way to resolve the shortage of palliative care nurses within home care.
Keywords: palliative home care, health information technology, patient-centred care, interprofessional health care team
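The documentation flow described here (a health technician records observations that a remote RN reviews in real time, across four to six patients) can be illustrated with a minimal data-model sketch. The field names and triage rule are illustrative assumptions only, not the actual eShift application's schema:

```python
from dataclasses import dataclass

@dataclass
class CareEvent:
    """One observation or care activity documented in a portal of this kind."""
    patient_id: str
    technician_id: str
    category: str              # e.g. "medication", "symptom", "vitals"
    note: str
    requires_rn_review: bool
    timestamp: str             # ISO-8601 string for simple sorting

def rn_dashboard_order(events):
    """Order events for the remote RN: review-pending first, oldest first."""
    return sorted(events, key=lambda e: (not e.requires_rn_review, e.timestamp))

events = [
    CareEvent("p1", "t1", "vitals", "BP 120/80 recorded",
              requires_rn_review=False, timestamp="2023-05-01T10:00"),
    CareEvent("p1", "t1", "medication", "breakthrough dose requested",
              requires_rn_review=True, timestamp="2023-05-01T11:00"),
]
print(rn_dashboard_order(events)[0].category)  # medication
```

The point of such a structure is that one RN can supervise several technicians because review-critical items (like medication requests) surface ahead of routine documentation.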
Procedia PDF Downloads 422
24719 Generating Insights from Data Using a Hybrid Approach
Authors: Allmin Susaiyah, Aki Härmä, Milan Petković
Abstract:
Automatic generation of insights from data using insight mining systems (IMS) is useful in many applications, such as personal health tracking, patient monitoring, and business process management. Existing IMS face challenges in controlling insight extraction, scaling to large databases, and generalising to unseen domains. In this work, we propose a hybrid approach consisting of rule-based and neural components for generating insights from data while overcoming the aforementioned challenges. Firstly, a rule-based data2CNL component is used to extract statistically significant insights from data and represent them in a controlled natural language (CNL). Secondly, a BERTSum-based CNL2NL component is used to convert these CNL statements into natural language texts. We improve the model using task-specific and domain-specific fine-tuning. Our approach has been evaluated using statistical techniques and standard evaluation metrics. We overcame the aforementioned challenges and observed significant improvement with domain-specific fine-tuning.
Keywords: data mining, insight mining, natural language generation, pre-trained language models
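The rule-based first stage can be sketched minimally: test a candidate comparison for statistical significance, and only then emit a templated CNL statement for the neural rephrasing stage. The significance test, CNL template, and step-count data below are illustrative assumptions, not the paper's actual component:

```python
import math

def two_sample_z(x, y):
    """z statistic for a difference in means (large-sample approximation)."""
    mx, my = sum(x) / len(x), sum(y) / len(y)
    vx = sum((v - mx) ** 2 for v in x) / (len(x) - 1)
    vy = sum((v - my) ** 2 for v in y) / (len(y) - 1)
    return (mx - my) / math.sqrt(vx / len(x) + vy / len(y))

def mine_insight(metric, group_a, a_vals, group_b, b_vals, z_threshold=1.96):
    """Emit a CNL insight only when the difference is significant (~5% level)."""
    z = two_sample_z(a_vals, b_vals)
    if abs(z) < z_threshold:
        return None  # not statistically significant: no insight generated
    higher, lower = (group_a, group_b) if z > 0 else (group_b, group_a)
    return f"METRIC({metric}) OF({higher}) IS_HIGHER_THAN({lower})"

# Hypothetical personal-health-tracking data: daily step counts.
weekday = [9000, 9500, 8800, 9200]
weekend = [4000, 4200, 3900, 4100]
print(mine_insight("steps", "weekdays", weekday, "weekends", weekend))
```

The controlled, template-shaped output is what gives the pipeline its "control over insight extraction"; a summarization model then rewrites it as fluent text.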
Procedia PDF Downloads 126
24718 Review of k0-Factors and Related Nuclear Data of the Selected Radionuclides for Use in k0-NAA
Authors: Manh-Dung Ho, Van-Giap Pham, Van-Doanh Ho, Quang-Thien Tran, Tuan-Anh Tran
Abstract:
The k0-factors and related nuclear data, i.e., the Q0-factors and effective resonance energies (Ēr), of the selected radionuclides used in k0-based neutron activation analysis (k0-NAA) were critically reviewed for integration into the "k0-DALAT" software. The k0- and Q0-factors of some short-lived radionuclides (46mSc, 110Ag, 116m2In, 165mDy, and 183mW) were experimentally determined at the Dalat research reactor. The other radionuclides selected are: 20F, 36S, 49Ca, 60mCo, 60Co, 75Se, 77mSe, 86mRb, 115Cd, 115mIn, 131Ba, 134mCs, 134Cs, 153Gd, 153Sm, 159Gd, 170Tm, 177mYb, 192Ir, 197mHg, 239U and 239Np. The reviewed data, as compared with the literature data, were biased within 5.6-7.3%, of which the experimentally re-determined factors were within 6.1-7.3%. The NIST standard reference materials Oyster Tissue (1566b), Montana II Soil (2711a) and Coal Fly Ash (1633b) were used to validate the newly reviewed data, showing that the new data gave an improved k0-NAA using the "k0-DALAT" software, within 4.5-6.8% for the investigated radionuclides.
Keywords: neutron activation analysis, k0-based method, k0 factor, Q0 factor, effective resonance energy
Procedia PDF Downloads 130
24717 Optimizing Electric Vehicle Charging with Charging Data Analytics
Authors: Tayyibah Khanam, Mohammad Saad Alam, Sanchari Deb, Yasser Rafat
Abstract:
Electric vehicles are considered viable replacements for gasoline cars since they help reduce harmful emissions and stimulate power generation through renewable energy sources, hence contributing to sustainability. However, one of the significant obstacles to the mass deployment of electric vehicles is charging time anxiety among users and, thus, the subsequent long waiting times for available chargers at charging stations. Data analytics, on the other hand, has revolutionized the decision-making tasks of management and operating systems since its arrival. In this paper, we attempt to optimize the choice of EV charging station for users in their vicinity by minimizing the time taken to reach the charging station and the waiting time for an available charger. The time taken to travel to the charging station is calculated via the Google Maps API, and the waiting times are predicted by polynomial regression over the stored historical data. The proposed framework utilizes real-time and historical data from all operating charging stations in the city, assists the user in finding the charging station best suited to their current situation, and can be implemented in a mobile phone application. The algorithm successfully predicts the most optimal choice of charging station and the minimum required time for various sample data sets.
Keywords: charging data, electric vehicles, machine learning, waiting times
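The selection logic described (travel time plus a polynomial-regression wait-time prediction, minimized over stations) can be sketched minimally. The station names, travel minutes, and quadratic coefficients below are hypothetical stand-ins; in the paper's framework, travel times come from the Google Maps API and the coefficients from regression over each station's historical waits:

```python
def predicted_wait(coeffs, hour):
    """Evaluate a fitted polynomial (highest order first) via Horner's rule."""
    wait = 0.0
    for c in coeffs:
        wait = wait * hour + c
    return max(wait, 0.0)  # a waiting time cannot be negative

def best_station(stations, hour):
    """Choose the station minimising travel time plus predicted waiting time."""
    def total_time(name):
        travel_min, coeffs = stations[name]
        return travel_min + predicted_wait(coeffs, hour)
    return min(stations, key=total_time)

# Hypothetical stations: (travel minutes, quadratic wait coefficients in hour).
stations = {
    "Mall":    (12, [-0.6, 15.6, -60.0]),  # close by, queues peak around midday
    "Highway": (20, [0.0, 0.5, 2.0]),      # farther away, short queues all day
}
print(best_station(stations, hour=13))  # Highway
print(best_station(stations, hour=6))   # Mall
```

The recommendation flips with the hour: the nearby station wins off-peak, but its predicted midday queue makes the farther station the faster total choice.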
Procedia PDF Downloads 200
24716 Finding Data Envelopment Analysis Targets Using Multi-Objective Programming in DEA-R with Stochastic Data
Authors: R. Shamsi, F. Sharifi
Abstract:
In this paper, we obtain the projection of inefficient units in data envelopment analysis (DEA) in the case of stochastic inputs and outputs using the multi-objective programming (MOP) structure. In some problems, the inputs might be stochastic while the outputs are deterministic, and vice versa. In such cases, we propose a multi-objective DEA-R model, because in some cases (e.g., when unnecessary and irrational weights in the BCC model reduce the efficiency score) an efficient decision-making unit (DMU) is classified as inefficient by the BCC model, whereas the DMU is considered efficient by the DEA-R model. In other cases, only the ratio of stochastic data may be available (e.g., the ratio of stochastic inputs to stochastic outputs). Thus, we provide a multi-objective DEA model without explicit outputs and prove that the input-oriented MOP DEA-R model in the constant returns to scale case can be replaced by the MOP-DEA model without explicit outputs in the variable returns to scale case, and vice versa. Using interactive methods to solve the proposed model yields a projection corresponding to the viewpoint of the DM and the analyst, which is nearer to reality and more practical. Finally, an application is provided.
Keywords: DEA-R, multi-objective programming, stochastic data, data envelopment analysis
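For orientation, the classical input-oriented CCR envelopment model, the deterministic baseline from which ratio-based DEA-R variants and the paper's stochastic MOP extension depart, can be written as follows (this is the textbook model, not the paper's stochastic formulation; n DMUs, m inputs x, s outputs y, DMU o under evaluation):

```latex
\begin{align*}
\min_{\theta,\,\lambda}\quad & \theta \\
\text{s.t.}\quad & \sum_{j=1}^{n} \lambda_j x_{ij} \le \theta\, x_{io}, \quad i = 1,\dots,m,\\
& \sum_{j=1}^{n} \lambda_j y_{rj} \ge y_{ro}, \quad r = 1,\dots,s,\\
& \lambda_j \ge 0, \quad j = 1,\dots,n.
\end{align*}
```

A DMU with optimal \(\theta^* = 1\) lies on the efficient frontier; for \(\theta^* < 1\), the constraint left-hand sides give the projection (target) of the inefficient unit, which is the quantity the paper generalizes to stochastic data via MOP.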
Procedia PDF Downloads 111
24715 The Real Consignee: An Exploratory Study of the True Party Who Is Entitled to Receive Cargo under a Bill of Lading
Authors: Mojtaba Eshraghi Arani
Abstract:
According to the international conventions for the carriage of goods by sea, the consignee is the person who is entitled to take delivery of the cargo from the carrier. Such a person is usually named in the relevant box of the bill of lading unless the latter is issued "To Order" or "To Bearer". However, there are some cases in which the apparent consignee, as above, was not intended to take delivery of the cargo, like the L/C issuing bank or the freight forwarder, who are named as consignee only for the purpose of security or acceleration of the transit process. In such cases, as well as with a BL issued "To Order", the so-called "real consignee" can be found in the "Notify Party" box. The dispute revolves around the choice between the apparent consignee and the real consignee as the party entitled not only to take delivery of the cargo but also to sue the carrier for any damage or loss. While it is a generally accepted rule that only the apparent consignee shall be vested with such rights, some courts, like France's Cour de Cassation, have declared that the "Notify Party", as the real consignee, is entitled to sue the carrier, and in some cases the same court went further and permitted the real consignee to bring suit even where he was not mentioned on the BL as a "Notify Party". The main argument behind such reasoning is that the real consignee is the person who suffered the loss and thus has a legitimate interest in bringing an action; of course, the real consignee must prove that he incurred a loss. It is undeniable that the above-mentioned approach is contrary to the position of the international conventions on the express definition of the consignee. However, international practice has permitted the use of the BL in a different way to meet the business requirements of banks, freight forwarders, etc. Thus, the issue is one of striking a balance between the international conventions on the one hand and existing practices on the other.
While the latest convention applicable to sea transportation, i.e., the Rotterdam Rules, dealt with the comparable issue of "shipper" and "documentary shipper", it failed to cope with the matter discussed here. A new study is therefore required to propose the best solution for amending the current conventions for the carriage of goods by sea. A qualitative method based on the interpretation of collected data has been used in this article. The data source is the analysis of domestic and international regulations and cases. It is argued in this manuscript that the judge is not allowed to recognize anyone as the real consignee other than the person mentioned in the "Consignee" box, unless the BL is issued "To Order" or "To Bearer". Moreover, the contract of carriage is independent of the sale contract, and thus the consignee must be determined solely based on the facts of the BL itself, like the "Notify Party", and not any other contract or document.
Keywords: real consignee, cargo, delivery, to order, notify party
Procedia PDF Downloads 83
24714 Organic Agriculture in Pakistan: Opportunities, Challenges, and Future Directions
Authors: Sher Ali
Abstract:
Organic agriculture has gained significant momentum globally as a sustainable and environmentally friendly farming practice. In Pakistan, amidst growing concerns about food security, environmental degradation, and health issues related to conventional farming methods, the adoption of organic agriculture presents a promising pathway for agricultural development. This abstract aims to provide an overview of the status, opportunities, challenges, and future directions of organic agriculture in Pakistan. It delves into the current state of organic farming practices, including the extent of adoption, key crops cultivated, and the regulatory framework governing organic certification. Furthermore, the abstract discusses the unique opportunities that Pakistan offers for organic agriculture, such as its diverse agro-climatic zones, rich biodiversity, and traditional farming knowledge. It highlights successful initiatives and case studies that showcase the potential of organic farming to improve rural livelihoods, enhance food security, and promote sustainable agricultural practices. However, the abstract also addresses the challenges hindering the widespread adoption of organic agriculture in Pakistan, ranging from limited awareness and technical know-how among farmers to inadequate infrastructure and market linkages. It emphasizes the need for supportive policies, capacity-building programs, and investment in research and extension services to overcome these challenges and promote the growth of the organic agriculture sector. Lastly, the abstract outlines future directions and recommendations for advancing organic agriculture in Pakistan, including strategies for scaling up production, strengthening certification mechanisms, and fostering collaboration among stakeholders. 
By shedding light on the opportunities, challenges, and potential of organic agriculture in Pakistan, this abstract aims to contribute to the discourse on sustainable farming practices at the upcoming Agro Conference in the USA. It invites participants to engage in dialogue, share experiences, and explore avenues for collaboration toward promoting organic agriculture for a healthier, more resilient food system.
Keywords: agriculture, challenges, organic, Pakistan
Procedia PDF Downloads 57
24713 Comparison of Statistical Methods for Estimating Missing Precipitation Data in the River Subbasin Lenguazaque, Colombia
Authors: Miguel Cañon, Darwin Mena, Ivan Cabeza
Abstract:
In this work, statistical methods for estimating missing precipitation data were compared and evaluated in the basin of the Lenguazaque River, located in the departments of Cundinamarca and Boyacá, Colombia. The methods used were simple linear regression, the distance-ratio method, local averages, mean ratios, correlation with nearby stations, and multiple regression. The effectiveness of the methods was assessed using three statistical tools: the coefficient of determination (r²), the standard error of estimation, and the Bland-Altman test of agreement. The analysis was performed by randomly removing real rainfall values at each of the stations and then estimating them with the methodologies mentioned, in order to complete the missing values. Under the conditions considered, the methods with the highest performance and accuracy were multiple regression with three nearby stations and a random application scheme supported by the precipitation behavior of related data sets.
Keywords: statistical comparison, precipitation data, river subbasin, Bland-Altman
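A minimal sketch of the multiple-regression imputation the abstract favors, assuming ordinary least squares over three nearby stations; the station series, coefficients, and function names below are synthetic and illustrative, not the study's data or implementation.

```python
import numpy as np

def fit_imputer(target, neighbors):
    """Least-squares fit predicting the target station's rainfall from
    nearby stations, using periods where all stations have data."""
    X = np.column_stack([np.ones(len(target))] + list(neighbors))
    coef, *_ = np.linalg.lstsq(X, target, rcond=None)
    return coef

def impute(coef, neighbor_values):
    """Estimate one missing value from the neighbours' readings."""
    return float(coef[0] + np.dot(coef[1:], neighbor_values))

# Synthetic records: the target is an exact linear blend of three
# neighbours, so the fit recovers the relationship exactly.
rng = np.random.default_rng(0)
n1, n2, n3 = rng.uniform(0, 100, (3, 50))
target = 5 + 0.5 * n1 + 0.3 * n2 + 0.2 * n3
coef = fit_imputer(target, [n1, n2, n3])
print(round(impute(coef, [40.0, 20.0, 10.0]), 2))  # -> 33.0
```

In practice the fit would use only dates on which all stations report, and the r², standard error, and Bland-Altman checks would be computed on the randomly withheld values.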
Procedia PDF Downloads 470
24712 Hyperspectral Data Classification Algorithm Based on the Deep Belief and Self-Organizing Neural Network
Authors: Li Qingjian, Li Ke, He Chun, Huang Yong
Abstract:
In this paper, a method combining a deep belief network with a self-organizing neural network is proposed for target classification. The method is mainly aimed at the high nonlinearity of hyperspectral images, the high sample dimensionality, and the difficulty of designing a classifier. The main features of the original data are extracted by the deep belief network; during feature extraction, labeled samples are added to fine-tune the network, enriching the main characteristics. The extracted feature vectors are then classified by the self-organizing neural network. This method can effectively reduce the spectral dimensionality of the data while preserving most of the raw data information, addressing both the long training times of traditional clustering and the difficulty of training deep learning algorithms when labeled samples are scarce, thereby improving classification accuracy and robustness. Simulations show that the proposed network structure achieves higher classification precision when only a small number of labeled samples are available.
Keywords: DBN, SOM, pattern classification, hyperspectral, data compression
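The second stage of the pipeline the abstract describes, clustering of learned feature vectors on a self-organizing map, can be illustrated with a minimal 1-D SOM. The `train_som`/`assign` names, hyperparameters, and the toy one-dimensional "feature vectors" standing in for DBN output are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def train_som(data, grid_size=4, epochs=200, lr0=0.5, sigma0=2.0, seed=0):
    """1-D self-organizing map: each node holds a weight vector pulled
    toward winning inputs; neighbouring grid nodes are dragged along,
    less and less as the neighbourhood width sigma shrinks."""
    rng = np.random.default_rng(seed)
    weights = rng.uniform(data.min(), data.max(), (grid_size, data.shape[1]))
    for t in range(epochs):
        decay = 1.0 - t / epochs
        lr, sigma = lr0 * decay, sigma0 * decay + 1e-3
        for x in data[rng.permutation(len(data))]:
            winner = int(np.argmin(np.linalg.norm(weights - x, axis=1)))
            grid_dist = np.abs(np.arange(grid_size) - winner)
            h = np.exp(-grid_dist**2 / (2 * sigma**2))[:, None]
            weights += lr * h * (x - weights)
    return weights

def assign(weights, x):
    """Index of the best-matching node for feature vector x."""
    return int(np.argmin(np.linalg.norm(weights - x, axis=1)))

# Toy "feature vectors" standing in for DBN output: two separated groups.
data = np.array([[0.0], [0.1], [0.2], [4.8], [4.9], [5.0]])
som = train_som(data)
print(assign(som, np.array([0.1])) != assign(som, np.array([4.9])))  # True
```

In the paper's setting, `data` would instead hold the low-dimensional features the DBN extracts from each hyperspectral pixel, and the 1-D grid would typically be replaced by a 2-D one.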
Procedia PDF Downloads 344
24711 Next-Gen Solutions: How Generative AI Will Reshape Businesses
Authors: Aishwarya Rai
Abstract:
This study explores the transformative influence of generative AI on startups, businesses, and industries. We will explore how large businesses can benefit in the area of customer operations, where AI-powered chatbots can improve self-service and agent effectiveness, greatly increasing efficiency. In marketing and sales, generative AI could transform businesses by automating content development, data utilization, and personalization, resulting in a substantial increase in marketing and sales productivity. In software engineering-focused startups, generative AI can streamline activities, significantly impacting coding processes and work experiences. It can be extremely useful in product R&D for market analysis, virtual design, simulations, and test preparation, altering old workflows and increasing efficiency. Zooming into the retail and CPG industry, industry findings suggest a 1-2% increase in annual revenues, equating to $400 billion to $660 billion. By automating customer service, marketing, sales, and supply chain management, generative AI can streamline operations, optimizing personalized offerings and presenting itself as a disruptive force. While celebrating economic potential, we acknowledge challenges like external inference and adversarial attacks. Human involvement remains crucial for quality control and security in the era of generative AI-driven transformative innovation. This talk provides a comprehensive exploration of generative AI's pivotal role in reshaping businesses, recognizing its strategic impact on customer interactions, productivity, and operational efficiency.
Keywords: generative AI, digital transformation, LLM, artificial intelligence, startups, businesses
Procedia PDF Downloads 81
24710 Assessing Performance of Data Augmentation Techniques for a Convolutional Network Trained for Recognizing Humans in Drone Images
Authors: Masood Varshosaz, Kamyar Hasanpour
Abstract:
In recent years, we have seen growing interest in recognizing humans in drone images for post-disaster search and rescue operations. Deep learning algorithms have shown great promise in this area, but they often require large amounts of labeled data to train the models. To keep the data acquisition cost low, augmentation techniques can be used to create additional data from existing images. Many such techniques can generate variations of an original image to improve the performance of deep learning algorithms. While data augmentation is generally assumed to improve the accuracy and robustness of the models, it is important to ensure that the performance gains are not outweighed by the additional computational cost or complexity of implementing the techniques. To this end, it is important to evaluate the impact of data augmentation on the performance of deep learning models. In this paper, we evaluated the most commonly available 2D data augmentation techniques on a standard convolutional network trained for recognizing humans in drone images. The techniques include rotation, scaling, random cropping, flipping, shifting, and their combinations. The results showed that the augmented models perform 1-3% better than the base network. However, as the augmented images only contain the human parts already visible in the original images, a new data augmentation approach is needed to include the invisible parts of the human body. Thus, we suggest a new method that employs simulated 3D human models to generate new data for training the network.
Keywords: human recognition, deep learning, drones, disaster mitigation
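Several of the geometric transforms the paper compares can be sketched with plain NumPy; this is an illustrative stand-in (scaling and random cropping are omitted to keep it dependency-free), not the authors' pipeline, and the image here is a random array standing in for a drone patch.

```python
import numpy as np

def augment(image, rng):
    """Yield simple label-preserving variants of one image: a horizontal
    flip, a random 90/180/270-degree rotation, and a wrap-around shift."""
    yield np.fliplr(image)                        # horizontal flip
    yield np.rot90(image, k=rng.integers(1, 4))   # rotation
    dy, dx = rng.integers(-4, 5, size=2)
    yield np.roll(image, (dy, dx), axis=(0, 1))   # shift (wrap-around)

rng = np.random.default_rng(42)
image = rng.random((64, 64))                      # stand-in drone patch
variants = list(augment(image, rng))
print(len(variants), all(v.shape == image.shape for v in variants))
# -> 3 True
```

Each original image thus yields several training samples at near-zero acquisition cost, which is exactly the trade-off the paper measures against the 1-3% accuracy gain.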
Procedia PDF Downloads 101
24709 Insight Into Database Forensics
Authors: Enas K., Fatimah A., Abeer A., Ghadah A.
Abstract:
Database forensics is a specialized field of digital forensics that investigates and analyzes database systems to recover and evaluate data, particularly in cases of cyberattacks and data breaches. The increasing significance of securing data confidentiality, integrity, and availability has emphasized the need for robust forensic models to preserve data integrity and maintain the chain of evidence. Organizations rely on Database Forensic Investigation (DBFI) to protect critical data, maintain trust, and support legal actions in the event of breaches. To address the complexities of relational and non-relational databases, structured forensic frameworks and tools have been developed. These include the Three-Tier Database Forensic Model (TT-DF) for comprehensive investigations, blockchain-backed logging systems for enhanced evidence reliability, and the FORC tool for mobile SQLite database forensics. Such advancements facilitate data recovery, identify unauthorized access, and reconstruct events for legal proceedings. Practical demonstrations of these tools and frameworks further illustrate their real-world applicability, advancing the effectiveness of database forensics in mitigating modern cybersecurity threats.
Keywords: database forensics, cybersecurity, SQLite forensics, digital forensics
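One concrete chain-of-evidence practice behind the abstract's emphasis on integrity, showing that examination did not alter the evidence, can be sketched with Python's standard `sqlite3` and `hashlib`. The file, table, and row below are invented for illustration and have nothing to do with the FORC tool itself.

```python
import hashlib
import os
import sqlite3
import tempfile

def evidence_hash(path):
    """SHA-256 of a database file, recorded before and after examination
    to demonstrate the evidence was not altered (chain-of-custody)."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

# Build a throwaway SQLite file to stand in for seized evidence.
path = os.path.join(tempfile.mkdtemp(), "evidence.db")
con = sqlite3.connect(path)
con.execute("CREATE TABLE logins (user TEXT, ts TEXT)")
con.execute("INSERT INTO logins VALUES ('alice', '2024-01-01T10:00')")
con.commit()
con.close()

before = evidence_hash(path)
# Read-only examination (query only) must leave the hash unchanged.
con = sqlite3.connect(f"file:{path}?mode=ro", uri=True)
rows = con.execute("SELECT user FROM logins").fetchall()
con.close()
print(before == evidence_hash(path), rows[0][0])
```

Real SQLite forensics goes further, e.g. carving deleted records from unallocated pages and the WAL, but the hash-before/hash-after step is the baseline for admissible evidence.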
Procedia PDF Downloads 9
24708 Emotional Artificial Intelligence and the Right to Privacy
Authors: Emine Akar
Abstract:
The majority of privacy-related regulation has traditionally focused on concepts that are perceived to be well-understood or easily describable, such as certain categories of data and personal information or images. In the past century, such regulation appeared reasonably suitable for its purposes. However, technologies such as AI, combined with ever-increasing capabilities to collect, process, and store "big data", not only require calibration of these traditional understandings but may require re-thinking of entire categories of privacy law. In the presentation, it will be explained, against the background of various emerging technologies under the umbrella term "emotional artificial intelligence", why modern privacy law will need to embrace human emotions as potentially private subject matter. This argument can be made on a jurisprudential level, given that human emotions can plausibly be accommodated within the various concepts that are traditionally regarded as the underlying foundation of privacy protection, such as, for example, dignity, autonomy, and liberal values. However, the practical reasons for regarding human emotions as potentially private subject matter are perhaps more important (and very likely more convincing from the perspective of regulators). In that respect, it should be regarded as alarming that, according to most projections, the usefulness of emotional data to governments and, particularly, private companies will not only lead to radically increased processing and analysing of such data but, concerningly, to an exponential growth in the collection of such data. In light of this, it is also necessary to discuss options for how regulators could address this emerging threat.
Keywords: AI, privacy law, data protection, big data
Procedia PDF Downloads 91
24707 Develop a Conceptual Data Model of Geotechnical Risk Assessment in Underground Coal Mining Using a Cloud-Based Machine Learning Platform
Authors: Reza Mohammadzadeh
Abstract:
The major challenges in geotechnical engineering in underground spaces arise from uncertainties and differing probabilities. The collection, collation, and collaboration of existing data, incorporated into analysis and design for a given prospect evaluation, would be a reliable, practical problem-solving method under uncertainty. Machine learning (ML) is a subfield of artificial intelligence in statistical science that applies different techniques (e.g., regression, neural networks, support vector machines, decision trees, random forests, genetic programming, etc.) to data to automatically learn and improve from them without being explicitly programmed, and to make decisions and predictions. In this paper, a conceptual database schema of geotechnical risks in underground coal mining based on a cloud system architecture has been designed. A new risk assessment approach using a three-dimensional risk matrix supported by the level of knowledge (LoK) has been proposed in this model. Subsequently, the stages of the model's workflow methodology are described. To train the data and deploy the LoK models, an ML platform has been implemented. IBM Watson Studio, a leading data science tool and data-driven cloud-integrated ML platform, is employed in this study. As a use case, a data set of geotechnical hazards and risk assessments in underground coal mining was prepared to demonstrate the performance of the model, and the results are outlined accordingly.
Keywords: data model, geotechnical risks, machine learning, underground coal mining
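A three-dimensional risk matrix supported by LoK could plausibly take the shape below; the abstract does not give the actual scoring, so the scales, weights, and factor values here are purely hypothetical and only illustrate how a knowledge dimension extends the classic likelihood-consequence matrix.

```python
def risk_score(likelihood, consequence, lok):
    """Hypothetical 3-D risk matrix: the classic likelihood x consequence
    cell, inflated when the level of knowledge (LoK) is low, so poorly
    understood hazards are treated more conservatively."""
    base = likelihood * consequence                     # 2-D matrix cell
    knowledge_factor = {1: 1.5, 2: 1.25, 3: 1.0}[lok]   # 1=low .. 3=high LoK
    return base * knowledge_factor

# Same hazard, rated with low vs high knowledge of the ground conditions.
print(risk_score(4, 3, 1))  # -> 18.0 (low LoK: score inflated)
print(risk_score(4, 3, 3))  # -> 12.0 (high LoK: plain 2-D score)
```

The design choice illustrated is that uncertainty itself becomes a scored input rather than a footnote, which is what makes the matrix three-dimensional.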
Procedia PDF Downloads 280
24706 Classification of Poverty Level Data in Indonesia Using the Naïve Bayes Method
Authors: Anung Style Bukhori, Ani Dijah Rahajoe
Abstract:
Poverty poses a significant challenge in Indonesia, requiring an effective analytical approach to understand and address this issue. In this research, we applied the Naïve Bayes classification method to examine and classify poverty data in Indonesia. The main focus is on classifying data using RapidMiner, a powerful data analysis platform. The analysis process involves splitting the data to train and test the classification model. First, we collected and prepared a poverty dataset that includes various factors such as education, employment, and health. The experimental results indicate that the Naïve Bayes classification model can provide accurate predictions regarding the risk of poverty. The use of RapidMiner in the analysis process offers flexibility and efficiency in evaluating the model's performance. The classification produces several values that serve as the standard for classifying poverty data in Indonesia using Naïve Bayes. The accuracy obtained is 40.26%, with a recall of 35.94% for the moderate class, 63.16% for the high class, and 38.03% for the low class. The precision is 58.97% for the moderate class, 17.39% for the high class, and 58.70% for the low class.
Keywords: poverty, classification, naïve bayes, Indonesia
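The per-class recall and precision figures quoted above are derived from a confusion matrix over the test split; a small sketch with invented hold-out labels (not the paper's data) shows how such metrics are computed.

```python
import numpy as np

def per_class_metrics(y_true, y_pred, classes):
    """Accuracy plus per-class recall and precision, the metrics the
    study reports for its low/moderate/high poverty classes."""
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    acc = float(np.mean(y_true == y_pred))
    recall, precision = {}, {}
    for c in classes:
        tp = np.sum((y_pred == c) & (y_true == c))
        recall[c] = tp / max(np.sum(y_true == c), 1)     # of actual c
        precision[c] = tp / max(np.sum(y_pred == c), 1)  # of predicted c
    return acc, recall, precision

# Hypothetical hold-out labels and predictions (illustrative only).
y_true = ["low", "low", "mod", "mod", "high", "high", "low", "mod"]
y_pred = ["low", "mod", "mod", "mod", "high", "low", "low", "high"]
acc, rec, prec = per_class_metrics(y_true, y_pred, ["low", "mod", "high"])
print(round(acc, 3), round(rec["low"], 3), round(prec["mod"], 3))
# -> 0.625 0.667 0.667
```

The pattern in the paper's numbers (high recall but low precision for the "high" class) indicates the model over-predicts that class, which this decomposition makes visible.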
Procedia PDF Downloads 65
24705 WAQF Financing Using WAQF Sukuk in Iran
Authors: Meysam Doaei, Mojtaba Kavand
Abstract:
WAQF, as a part of the Islamic social security system, is developed in Islam. Traditional WAQF has some limitations which are resolved in WAQF Sukuk. Given the growing acceptance of Islamic finance worldwide, WAQF Sukuk has also been developing in Islamic countries. In this paper, the concept of WAQF and traditional and modern WAQF financing are presented. Then, WAQF Sukuk, its applications, and its model in Iran are developed.
Keywords: Al-mawqūfat development, traditional financing, modern financing, WAQF Sukuk
Procedia PDF Downloads 528
24704 Urban Agriculture Potential and Challenges in Mid-Sized Cities: A Case Study of Neishabour, Iran
Authors: Mohammadreza Mojtahedi
Abstract:
Urban agriculture, in the face of burgeoning urban populations and unchecked urbanization, presents a promising avenue for sustainable economic, social, and environmental growth. This study, set against the backdrop of Neishabour, Iran, delves into the potential and challenges inherent in this domain. Utilizing a descriptive-analytical approach, field survey data were predominantly collated via questionnaires. The research rigor was upheld with the Delphi method affirming validity and a Cronbach's alpha score exceeding 0.70 underscoring reliability. The study encompassed Neishabour's 2016 populace, pegged at 264,375, drawing a sample size of 384 via Cochran's formula. The findings spotlight Neishabour's pronounced agricultural prowess, as evidenced by a significance level under 0.05 and an average difference of 0.54. Engaging in urban agricultural ventures can notably elevate job quality, spur savings, bolster profitability, promote organic cultivation, and streamline production expenses. However, challenges were identified, such as heightened land valuations for alternative uses, conflicting land engagements, security dilemmas, technical impediments, waning citizen interest, regulatory conundrums, and perceived upfront investment risks. A silver lining emerged with urban locales, especially streets and boulevards, securing average ratings of 3.90, marking them as prime contenders for urban agricultural endeavors.
Keywords: urban agriculture, sustainable development, mid-sized cities, Neishabour
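The sample size of 384 follows from Cochran's formula under its conventional defaults; the z, p, and e values below are the standard assumptions (95% confidence, maximum variance, 5% margin of error), not figures stated in the abstract.

```python
def cochran_sample_size(z=1.96, p=0.5, e=0.05):
    """Cochran's formula for sample size: n0 = z^2 * p * (1 - p) / e^2.
    With the conventional defaults this gives 384.16, which rounds to
    the study's reported n = 384."""
    return round(z**2 * p * (1 - p) / e**2)

print(cochran_sample_size())  # -> 384
```

Strictly conservative practice would round up to 385; rounding to the nearest integer reproduces the figure reported here.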
Procedia PDF Downloads 66
24703 Review of Full Body Imaging and High-Resolution Automatic 3D Mapping Systems for Medical Application
Authors: Jurijs Salijevs, Katrina Bolocko
Abstract:
The integration of artificial intelligence and neural networks has significantly changed full-body imaging and high-resolution 3D mapping systems, and this paper reviews research in these areas. With an emphasis on their use in the early identification of melanoma and other disorders, the goal is to give a wide perspective on the current status and potential future of these medical imaging technologies. The authors also examine methodologies such as machine learning and deep learning, seeking to identify efficient procedures that enhance diagnostic capabilities through the analysis of 3D body scans. This work aims to encourage further research and technological development to harness the full potential of AI in disease diagnosis.
Keywords: artificial intelligence, neural networks, 3D scan, body scan, 3D mapping system, healthcare
Procedia PDF Downloads 110
24702 Collective Problem Solving: Tackling Obstacles and Unlocking Opportunities for Young People Not in Education, Employment, or Training
Authors: Kalimah Ibrahiim, Israa Elmousa
Abstract:
This study employed the world café method alongside semi-structured interviews within a 'conversation café' setting to engage stakeholders from the public health and primary care sectors. The objective was to collaboratively explore strategies to improve outcomes for young people not in education, employment, or training (NEET). The discussions were aimed at identifying the underlying causes of disparities faced by NEET individuals, exchanging experiences, and formulating community-driven solutions to bolster preventive efforts and shape policy initiatives. A thematic analysis of the qualitative data gathered emphasized the importance of community problem-solving through the exchange of ideas and reflective discussions. Healthcare professionals reflected on their potential roles, pinpointing a significant gap in understanding the specific needs of the NEET population and the unclear distribution of responsibilities among stakeholders. The results underscore the necessity for a unified approach in primary care and the fostering of multi-agency collaborations that focus on addressing social determinants of health. Such strategies are critical not only for the immediate improvement of health outcomes for NEET individuals but also for informing broader policy decisions that can have long-term benefits. Further research is ongoing, delving deeper into the unique challenges faced by this demographic and striving to develop more effective interventions. The study advocates for continued efforts to integrate insights from various sectors to create a more holistic and effective response to the needs of the NEET population, ensuring that future strategies are informed by a comprehensive understanding of their circumstances and challenges.
Keywords: multi-agency working, primary care, public health, social inequalities
Procedia PDF Downloads 45