K-means segmentation

Hello. I have a grayscale image with a mole and skin which I want to segment with the K-means algorithm. I want the mole pixels to be classified as class 1 and the skin pixels to be classified as class 2. How can I do that? My current code works, but sometimes the mole comes out black and sometimes white. I want this to be done with K-means segmentation; I know it can be done in other ways.


The class numbers that kmeans() assigns can vary from one run to the next because it uses random initialization. However, you can renumber the class labels if you know something about the classes, for example that class 1 should always be the darker class and class 2 the lighter class. See the demo code below.


% Demo to show how you can redefine the class numbers assigned by kmeans() to different numbers.
% In this demo, the original, arbitrary class numbers will be reassigned a new number
% according to how far the cluster centroid is from the origin.
clc; % Clear the command window.
close all; % Close all figures (except those of imtool.)
clearvars;
workspace; % Make sure the workspace panel is showing.
format long g;
format compact;
fontSize = 18;
%===========================================================================================================================================
% DEMO #1 : RELABEL ACCORDING TO DISTANCE FROM ORIGIN.
%===========================================================================================================================================
%-------------------------------------------------------------------------------------------------------------------------------------------
% FIRST CREATE SAMPLE DATA.
% Make up 4 clusters with 150 points each.
pointsPerCluster = 150;
spread = 0.03;
offsets = [0.3, 0.5, 0.7, 0.9];
% offsets = [0.62, 0.73, 0.84, 0.95];
xa = spread * randn(pointsPerCluster, 1) + offsets(1);
ya = spread * randn(pointsPerCluster, 1) + offsets(1);
xb = spread * randn(pointsPerCluster, 1) + offsets(2);
yb = spread * randn(pointsPerCluster, 1) + offsets(2);
xc = spread * randn(pointsPerCluster, 1) + offsets(3);
yc = spread * randn(pointsPerCluster, 1) + offsets(3);
xd = spread * randn(pointsPerCluster, 1) + offsets(4);
yd = spread * randn(pointsPerCluster, 1) + offsets(4);
x = [xa; xb; xc; xd];
y = [ya; yb; yc; yd];
xy = [x, y];
%-----------------------
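The posted demo stops after building the sample data. Below is a minimal sketch of how the relabeling step itself might continue, assuming the Statistics and Machine Learning Toolbox is available; variable names such as numClusters and relabeledClass are my own, not from the original answer.

% Cluster the sample data. kmeans() requires the Statistics and Machine Learning Toolbox.
numClusters = 4;
[assignedClass, centroids] = kmeans(xy, numClusters);
% Distance of each cluster centroid from the origin.
distancesFromOrigin = sqrt(sum(centroids .^ 2, 2));
% Sort so the centroid closest to the origin becomes class 1, the next closest class 2, and so on.
[~, sortOrder] = sort(distancesFromOrigin, 'ascend');
newLabels = zeros(numClusters, 1);
newLabels(sortOrder) = (1 : numClusters)';
% Map every point's arbitrary kmeans label to the new, distance-ordered label.
relabeledClass = newLabels(assignedClass);
% Visualize the relabeled clusters.
gscatter(x, y, relabeledClass);
xlabel('x', 'FontSize', fontSize);
ylabel('y', 'FontSize', fontSize);
title('Clusters relabeled by distance from the origin', 'FontSize', fontSize);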

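Applied to the original mole/skin question, the same idea fixes the class order: cluster the gray levels into two groups, then swap the labels whenever cluster 1 turns out to be the brighter one, so the darker mole is always class 1. This is a hedged sketch assuming a 2-D grayscale image and the Statistics and Machine Learning Toolbox; 'mole.png' is a placeholder filename.

% Read the grayscale image (assumes a 2-D image; 'mole.png' is a placeholder).
grayImage = imread('mole.png');
[rows, columns] = size(grayImage);
grayLevels = double(grayImage(:));            % One gray level per row for kmeans().
[classLabels, clusterCenters] = kmeans(grayLevels, 2);
% kmeans() may number the darker cluster 1 or 2 from run to run.
% If cluster 1 has the brighter center, swap the labels so the mole is always class 1.
if clusterCenters(1) > clusterCenters(2)
    classLabels = 3 - classLabels;            % Swaps labels 1 <-> 2.
end
classImage = reshape(classLabels, rows, columns);
% classImage is now 1 where the (darker) mole is and 2 where the (lighter) skin is.
imshow(classImage, []);
title('Class 1 = mole (dark), Class 2 = skin (bright)');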
