Forum MicMac

This forum is dedicated to the community of MicMac users.



Joined: Oct 2013
Posts: 16
Posted: 06 Mar 2018, 12:39 

Dear all,

I'm interested in evaluating the distribution of the 2D key-points extracted in each image of my dataset.

I guess that key-point observations are stored in the Pastis folder, as .dat files. Is it possible to save their 2D coordinates in a txt file (as I normally do for tie points with the option ExpTxt=1)?

Thanks a lot for your help :)

best
isa



Joined: Mar 2013
Posts: 238
Location: UMR MAP (3495CNRS/MCC)
Posted: 08 Mar 2018, 12:03 

Dear Isabella,

To my knowledge, it's not possible to export the SIFT features directly with a MicMac command or option, the way the ExpTxt option does for tie points.
Nevertheless, I have a piece of code (in Java) capable of decoding the binary format.

Let me know if you are interested.

Cheers
Anthony



Joined: Oct 2013
Posts: 16
Posted: 09 Mar 2018, 10:03 

Ciao Anthony,

Thanks for your answer.
Yes, if the code is available, I would be happy to try it.

Thanks again :)
Best
Isa



Joined: Mar 2013
Posts: 238
Location: UMR MAP (3495CNRS/MCC)
Posted: 09 Mar 2018, 14:12 

Hello Isabella,

Here is the piece of code in Java concerning coding and decoding .dat files.
It's part of a "plugin" for tie-point analysis and filtering made by students during a workshop;
they also analyse the distribution, so maybe it's exactly what you are looking for?
Send me a PM if you are interested ;)

In the Java code below you'll find the part concerning the coder/decoder of .dat files.

Best
Anthony

Code:
import java.io.File;
import java.io.FileFilter;
import java.io.FileNotFoundException;
import java.io.FileOutputStream;
import java.io.FileWriter;
import java.io.IOException;
import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.nio.file.Files;
import java.util.ArrayList;
import java.util.HashMap;

import org.apache.commons.io.FileUtils;


public class MicMacFileUtil {
   /**
    * MicMacFileUtil is the central database + file util
    *
    * FIELDS:
    *       - BDD_Images: MicMacImage per file name in Homol folder
    *       - BDD_PointsHomologues: List of linked PointHomologue per PointHomologue
    *       - pointsRemoved: points removed by the FILTER
    *       - pointsHomogenized: points removed by the SPARSER
    *       - pointsKept: points kept after both FILTER & SPARSER
    *
    * METHODS:
    *       - Loaders: loadImages, loadPointHomo etc. -> create & reset the databases
    *       - backupHomol: backs up the detected Homol folder.
    *                   Does not overwrite previous backups, so the original Homol is preserved.
    *       - writeHomol: writes the new Homol folder after treatment.
    *       - file utils: getExtension, delete etc.
    */
   
   public static String pathProject = "./";
   public static String pathBackupHomol;
   public static boolean isReaderBinary = true;
   
   public static HashMap<String, MicMacImage> BDD_Images = new HashMap<String, MicMacImage>();
   public static HashMap<PointHomologue, ArrayList<PointHomologue>> BDD_PointsHomologues = new HashMap<>();
   
   public static HashMap<MicMacImage, ArrayList<PointHomologue>> pointsKept = new HashMap<>();
   public static HashMap<MicMacImage, ArrayList<PointHomologue>> pointsRemoved = new HashMap<>();
   public static HashMap<MicMacImage, ArrayList<PointHomologue>> pointsHomogenized = new HashMap<>();
   
   public static String getPathHomol() {
      return pathProject + "Homol/";
   }
   
   public static boolean loadImagesAndRefs() {
      return loadImages() && loadReferences();
   }
   
   public static boolean loadImages() {
      BDD_Images.clear();
      pointsRemoved.clear();
      pointsKept.clear();
      File[] dossiers = getFiles(getPathHomol());
      for (File f : dossiers)
         try {
            System.out.println("Loaded: "+MicMacImage.fromFile(f));
         } catch (FileNotFoundException | ExceptionAlreadyExistsInBDD e) {
            e.printStackTrace();
            return false;
         }
      return true; // No problems encountered
   }
   
   public static boolean loadReferences() {
      for(MicMacImage img : BDD_Images.values())
         img.loadReferences();
      return true; // No problems encountered
   }
   
   public static boolean loadPointHomo() throws IOException {
      BDD_PointsHomologues.clear();
      
      for(MicMacImage source : BDD_Images.values()) {
         for(MicMacImage comparaison : source.references) {
            for(Couple c : source.getCouplesWith(comparaison)) {
               PointHomologue ptSource = new PointHomologue(source, c.x1, c.y1);
               PointHomologue ptCible = new PointHomologue(comparaison, c.x2, c.y2);
               
               ptSource.addLien(ptCible);
               // no need to call ptCible.addLien(ptSource), it will be done later anyway
               
            }
         }
      }
      
      return true; // No problems encountered
   }
   
   
   public static void backupHomol() throws IOException {
      File folder_backup = new File(pathBackupHomol);
      if(folder_backup.listFiles() != null && folder_backup.listFiles().length > 0) { // don't overwrite an existing backup
         System.out.println("Backup folder not empty, backup of homol cancelled.");
      }
      else {
         File folder_homol = new File(getPathHomol());
         Files.createDirectories(folder_backup.toPath());
         FileUtils.copyDirectory(folder_homol, folder_backup, true);   
         System.out.println("Backup of homol successful to path "+folder_backup.getAbsolutePath()+"!");
      }
   }
   
   public static void writeHomol() throws IOException {
      File dossier_homol = new File(getPathHomol());
      delete(dossier_homol);
      Files.createDirectories(dossier_homol.toPath());
      
      for(MicMacImage img : pointsKept.keySet()) {
         HashMap<MicMacImage, ArrayList<Double[]>> map = new HashMap<>();
         HashMap<MicMacImage, ArrayList<String>> map2 = new HashMap<>();
         for(MicMacImage im : img.references) {
            map.put(im, new ArrayList<>());
            map2.put(im, new ArrayList<>());
         }
         for(PointHomologue pt : pointsKept.get(img)) {
            for(PointHomologue pt2 : BDD_PointsHomologues.get(pt)) {
               Double[] ar = {pt.x, pt.y, pt2.x, pt2.y};
               map.get(pt2.imageRef).add(ar);
               map2.get(pt2.imageRef).add(pt.x +" "+ pt.y+ " "+ pt2.x + " "+ pt2.y+"\r\n");
            }
         }
         File dossier_img = new File(dossier_homol.getAbsolutePath() + "/Pastis"+img.getName());
         Files.createDirectories(dossier_img.toPath());
         for(MicMacImage im : img.references) {
            File ecriture_dat = new File(dossier_img.getAbsolutePath() + "/"+im.getName()+".dat");
            File ecriture_txt = new File(dossier_img.getAbsolutePath() + "/"+im.getName()+".txt");
            
            // FileOutputStream creates the file if it does not exist yet
            FileOutputStream fop = new FileOutputStream(ecriture_dat);

            /*
             * HERE STARTS THE CODER OF .DAT
             */
            int predictionSize = 4 + 4 + (4 + 8 + 4*8)*map.get(im).size();
            ByteBuffer buffer = ByteBuffer.allocate(predictionSize).order(ByteOrder.nativeOrder());
            buffer.putInt(2); // The dimension, 4 bytes
            buffer.putInt(map.get(im).size()); // The number of couples, 4 bytes
            
            for(Double[] vals : map.get(im)) {
               buffer.putInt(2); // The dimension, 4 bytes
               buffer.putDouble(1); // The scale, 8 bytes
               for(int i =0; i<4; i++) // Total: 8*4 bytes
                  buffer.putDouble(vals[i]);
            }
            /*
             * HERE STOPS THE CODER OF .DAT
             */
            
            fop.write(buffer.array());
            fop.flush();
            fop.close();
            
            FileWriter fw = new FileWriter(ecriture_txt);
            for(String line : map2.get(im))
               fw.write(line);
            fw.close();
            
         }
      }
   }


   public static File[] getFiles(File path) {
      return getFiles(path, null);
   }
   
   public static File[] getFiles(String path) {
      return getFiles(path, null);
   }
   public static File[] getFiles(String path, String... formats) {
      return getFiles(new File(path), formats); // forward the formats instead of dropping them
   }

   public static File[] getFiles(File dir, String... formats) {
      if(formats != null && formats.length > 0) {
         FileFilter filtre = new FileFilter() {
            @Override
            public boolean accept(File pathname) {
               for(String s : formats)
                  if(pathname.getName().toLowerCase().endsWith(s))
                     return true;
               return false;
            }
         };
         
         return dir.listFiles(filtre);
      }
      return dir.listFiles();
   }

   public static String getExtension(File f) {
      String[] split = f.getName().split("\\.");
      return split[split.length - 1].toUpperCase();
   }

   public static void delete(File f) throws IOException {
      if (f.isDirectory()) {
         for (File c : f.listFiles())
            delete(c);
      }
      f.delete();
   }
   
   
}
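
For completeness, here is a minimal standalone sketch of the matching decoder, assuming the same native byte order and field layout as the coder above. It reads one .dat file and dumps the coordinates to a txt file; the class name DatToTxt and the command-line arguments are just for illustration.

Code:
import java.io.FileWriter;
import java.io.IOException;
import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.nio.file.Files;
import java.nio.file.Paths;

public class DatToTxt {
   public static void main(String[] args) throws IOException {
      // Read the whole .dat file and wrap it with the same native byte
      // order used by the coder above
      byte[] raw = Files.readAllBytes(Paths.get(args[0]));
      ByteBuffer buffer = ByteBuffer.wrap(raw).order(ByteOrder.nativeOrder());

      int dimension = buffer.getInt(); // point dimension (2 for 2D points)
      int nbCouples = buffer.getInt(); // number of point couples in the file
      System.out.println("Dimension: " + dimension + ", couples: " + nbCouples);

      try (FileWriter fw = new FileWriter(args[1])) {
         for (int n = 0; n < nbCouples; n++) {
            buffer.getInt();    // per-couple dimension, skipped
            buffer.getDouble(); // scale, skipped
            double x1 = buffer.getDouble();
            double y1 = buffer.getDouble();
            double x2 = buffer.getDouble();
            double y2 = buffer.getDouble();
            fw.write(x1 + " " + y1 + " " + x2 + " " + y2 + "\r\n");
         }
      }
   }
}

Run it as "java DatToTxt <some .dat file> out.txt".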



Joined: Apr 2017
Posts: 64
Posted: 15 Mar 2018, 13:12 

Actually, during the Tapioca command, if you use the ExpTxt=1 argument, you will get a txt file with all the tie points extracted.
This txt file will be in the Pastis directory.
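
For example, a typical call might look like this (the image pattern and the two pyramid resolutions here are just placeholders to adapt to your dataset):

Code:
mm3d Tapioca MulScale ".*.JPG" 500 1700 ExpTxt=1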



Joined: Mar 2013
Posts: 238
Location: UMR MAP (3495CNRS/MCC)
Posted: 19 Mar 2018, 10:05 

Thank you Matteo, but the request is about the SIFT features and not the matched key-points :)

